South Korea's fusion device KSTAR runs for 20 seconds at 100M degrees Celsius (arirang.com)
494 points by airstrike 52 days ago | 310 comments

Sorry, but this, by itself, is a totally meaningless article. There might be an actual scientific advance here, but if so, I have absolutely no idea what it is. "100 million degrees" may sound impressive, but it's completely trivial to create a plasma at that temperature; plasma temperatures are typically quoted in electron-volts (eV) or kilo-electron-volts (keV), and a temperature like this corresponds to roughly 10 keV. You can make a plasma like this in your garage and run it as long as you like (there are guides on YouTube). But it won't produce net energy, i.e., you won't get more energy out than you put in. To know how impressive this is, we need to know:

- Was net energy produced during this time?

- How much plasma was there, i.e., at how large a scale was the experiment run?

- How dense was the plasma?

- How repeatable is the experiment? Could it be run again sustainably, or did it fry all the equipment with fast neutron radiation?

The article doesn't seem to mention any of these, and without them, "100 million degrees" is meaningless.
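For anyone who wants to sanity-check the unit conversion mentioned above, a back-of-envelope in Python (the 1e8 K input is just the headline number; the constant is the standard Boltzmann value):

```python
# Sanity check on the headline number: convert a plasma temperature in kelvin
# to the electron-volt units physicists actually quote.

BOLTZMANN_EV_PER_K = 8.617333262e-5  # Boltzmann constant k_B in eV/K

def kelvin_to_kev(t_kelvin: float) -> float:
    """Thermal energy k_B * T, expressed in kilo-electron-volts."""
    return t_kelvin * BOLTZMANN_EV_PER_K / 1000.0

print(kelvin_to_kev(1e8))  # ~8.6 keV, i.e. order 10 keV
```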

It's not meaningless if you know the context. Achieving 100M degrees is actually a significant milestone for this type of device. A reactor in the UK has this milestone planned for sometime in the next year. I don't know SPARC's schedule; maybe it's a bit more secretive, but when they achieve 100M it will be in the news as well.

And even if you don't know the context, the article says it's the first in the world to achieve 100M for 20 seconds, good luck doing that in your microwave.

And the specific temperature of 100M degrees might seem arbitrary because it's such a funny-sounding number, but it is in fact the temperature at which the plasma theoretically starts becoming net positive.

Whether it actually was net positive will obviously be the focus of the researchers for the next weeks, and even if they find the plasma generated energy consistently (which we'd be surprised about if it didn't), it still doesn't mean the reactor was net positive as a whole, because of inefficiencies.

They will not have fried their expensive reactor intentionally, so if they did, that would be sad news. The scale of the experiment is clearly visible from the video: it's a room-sized, spherical-tokamak-style device. I'm pretty sure the density of the plasma inside those is a given, but I'm not 100% on the physics there.

If I recall correctly, plasma density is a function of temperature and the strength of the containment field. So you’d need to know something about their (presumably YBCO) magnets to work that out.

I just Google-translated one of the Korean news articles; it may help to understand their intention. I gather that maintaining plasma for 300 seconds is one of their milestones toward commercial development.


KSTAR has set the world record for ultra-high-temperature plasma operation every year since first raising the plasma temperature to 100 million degrees in 2018 and maintaining it for 1.5 seconds. [...] The project aims to reach the stage of maintaining 100-million-degree plasma at KSTAR for more than 300 seconds by 2025. "Most of the physical variables that act as obstacles to commercialization appear within 300 seconds after the plasma temperature rises to 100 million degrees," meaning that commercial development is possible only after this stage is reached. Director Yoon added, "We plan to maintain it for 30 seconds next year, and then achieve 50 seconds in 2023, 100 seconds in 2024, and 300 seconds in 2025 through facility upgrades."


Wait what? You can create something that runs at 100 million degrees in your garage?

Can you explain a bit more for the unlearned?

The misleading part about almost any mention of "millions of degrees" is that people implicitly expect it to happen under earthly conditions, when in fact these values are typically just a reference to the speed of atoms in a rarefied environment. Like the sun's corona: it may be very hot, but it is not very dense. In other words, there is not a lot of energy per volume, just a lot of energy per particle.

For example, a common way to reach "millions of degrees" is to pump the air out of a container to an extreme degree, then introduce trace amounts of a gas, and then accelerate or otherwise heat those few particles.

If this happened under normal pressures, there would be no container material that could withstand these temperatures. But in a near-vacuum, the collisions between the accelerated particles and the walls can be kept low. In an experimental fusion setting like this, they also employ a magnetic field to keep the plasma away from the container.
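The "energy per particle vs. energy per volume" point can be made concrete with a rough ideal-gas estimate. The densities below are assumed ballpark figures (a typical tokamak plasma and room air), not numbers from the article:

```python
# Rough ideal-gas comparison of energy *per volume*: a hot, rarefied fusion
# plasma vs. ordinary room air.

K_B = 1.380649e-23  # Boltzmann constant in J/K

def thermal_energy_density(n_per_m3: float, t_kelvin: float) -> float:
    """Ideal-gas thermal energy density (3/2) * n * k_B * T, in J/m^3."""
    return 1.5 * n_per_m3 * K_B * t_kelvin

plasma = thermal_energy_density(1e20, 1e8)    # ~1e20 particles/m^3 at 100M K
air    = thermal_energy_density(2.5e25, 300)  # room air at ~300 K

# Despite a ~300,000x temperature difference, the energy per volume is similar.
print(plasma, air)  # both on the order of 1e5 J/m^3
```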

I'm not sure about the garage, but you can sure do it in your microwave - NileRed did a video about it: https://www.youtube.com/watch?v=l0u8Vtf2GoQ

Aaand he had to break all of his beakers afterwards: https://www.youtube.com/watch?v=tGqVMbAQhBs

The plasma you can generate in your microwave is nowhere near 100M degrees. The low end of the plasma temperature range is around ~6,000 °C.

That much plasma at 100M K would destroy the whole building. The temperature of a nuclear explosion is less than 1M K.

Sure, here's a guide from Make Magazine:


Build a fusor running at roughly 10 kV (microwave-oven transformers can do that if you modify them a bit; otherwise, starter transformers from light fixtures or your car may be able to handle it).

That'll get you into the temperature range. You can also up the voltage to get more exciting effects; fusion is fairly easy to achieve here, just not net-positive fusion.

At the high end, what you even mean by "temperature" gets a bit weird; everything is mostly a very rarefied gas.
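A back-of-envelope for why a ~10 kV fusor reaches this "temperature" range: an ion accelerated through 10 kV carries 10 keV, and dividing by the Boltzmann constant gives an equivalent thermal temperature (with the caveat that a directed beam is not a true thermal distribution):

```python
# A singly charged ion falling through 10 kV gains 10 keV of kinetic energy.
# Dividing by k_B gives the temperature at which thermal particles would
# carry comparable energy.

KELVIN_PER_VOLT = 11604.5  # e / k_B, kelvin of equivalent temperature per volt

def volts_to_kelvin(volts: float) -> float:
    """Equivalent temperature of a singly charged ion accelerated through `volts`."""
    return volts * KELVIN_PER_VOLT

print(volts_to_kelvin(10e3))  # ~1.16e8 K, i.e. past the 100M-degree mark
```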

That's a really silly statement.

I can't create 100 tonnes of nuclear waste in my garage either. However, it's not cause for celebration when a giant institution does it.

As OP pointed out, with fusion, net positive power output is what's important, not the temperature achieved.

From TFA:

"No other fusion power plant in the world has managed to run for more than ten seconds"

WEST[1] (formerly Tore Supra) holds the record for continuous operation of a fusion reactor, at 6 minutes 30 seconds.

It's always frustrating when teams build fantastic projects but then make false claims that get propagated across the Internet. Don't get me wrong: 20 seconds for a new reactor is good, and they'll do longer. After all, the French did in 2003.

[1] https://www.iter.org/proj/itermilestones#18

Perhaps the modes of operation were not comparable? Unless you really understand what's going on, I would not be calling them false claims.

However, I do agree it is unclear.

I believe the difference here is the temperature. If you look at KSTAR's operating tests you can see that they have run for 72 seconds in the past, so they must be implying 20 seconds at > 100M, or their statement about the 20s runtime would be invalidated by their own operating history.

What's more, the scientific article about the Tore Supra 2003 test[2] is paywalled, but based on the abstract it looks like it was a test of "simultaneously heat removal capability and particle exhaust in steady-state fully non-inductive current drive discharges," not a test of maximum sustained temperatures.

[1] https://en.wikipedia.org/wiki/KSTAR [2] https://www.sciencedirect.com/science/article/abs/pii/S09203...

Maybe it's the specific word, "power plant," that qualifies the claim.

Here is a Wikipedia entry for this tokamak since the article isn’t very descriptive: https://en.m.wikipedia.org/wiki/KSTAR

Here to plug the excellent Fusion Energy Base by Sam Wurzel (previously founder/CEO of Octopart, an early YC founder).

This is a great article on measuring fusion progress:


I feel like we need a SpaceX for Fusion, someone willing to fund and build a whole bunch of SPARC reactor prototypes of various sizes quickly, with the understanding they may or may not work.

The Nature papers on ARC show it's almost sure to work. So it's rather disappointing that it'll take 5 years to build SPARC, and then another 10 for ARC. It seems to me the primary problems are manufacturing and scaling components like REBCO. Someone like Musk would look at this and go, "OK, first thing, let's build our own REBCO factory, and even mine the materials ourselves if necessary to get cost efficiency." They're still approaching this as a "big science" project, where they don't want to fail, and they're going to subsist on grants and as many off-the-shelf providers as they can.

Given the climate crisis, the government should be doing the same as it did with CRS for space: award DOE contracts of $X billion for a working reactor, doled out to people pursuing SPARC or thorium molten salt (LFTR) designs.

The innovation rate in fusion is way too slow; we need to build and fail a lot more, rather than sink $22 billion into multi-decade projects like ITER.

You're spot on, this is exactly the problem.

To use a physics analogy, gargantuan megaprojects like ITER develop their own self-gravitation that sucks in money and innovation, crushing engineers pitilessly under the weight of the bureaucracy.

To further abuse analogies, megaprojects suffer from something akin to the tyranny of the rocket equation: Because they're huge and expensive, they have to be broken up and doled out to disparate teams (countries, even!), subcontractors, etc. Because there are many organisations involved, the friction of the interactions between them forces managers to plan far ahead. Because planning ahead is required, only existing, established technologies can be used.

QED: It is not possible to do "innovation" with megaprojects, their mega size inherently prevents any possibility of true innovation occurring! The bigger they are, the less innovative they are.

I cannot emphasise this enough: Elements of the ITER project have been planned 30 years ahead! That's insanity. That's using 1990s technology in the 2020s! There is no possible path through ITER and then DEMO to achieve commercial power generation before 2050. None. There is not even a hope of such a thing.

Most critically, ITER used a legacy superconducting wire in their designs, which has a lower maximum magnetic field strength than more modern types. Because fusion power scales highly nonlinearly with field strength (roughly as the fourth power), this is the fundamental limit to achieving break-even fusion, but they were forced to start planning without better wire.

They should have done what Tesla did: Focus on the batteries as the primary thing, everything else is secondary. ITER should have focused on the superconducting wires above all else for at least the first two decades of the project, before even thinking of actually opening a CAD program to design a Tokamak!

I think it's misleading to say ITER's technology is 30 years behind. A lot of the tech that has gone into the ITER design didn't exist 30 years ago! They didn't anticipate developments in superconductors, but this is by no means a '90s machine built 30 years too late.

Part of the role of ITER was to fund the research to develop the reactor's components while giving everyone a concrete goal to work towards. The challenge with ITER is much more than construction.

Are you speaking from personal experience with ITER? You've said some damning things about projects in general but there's no backup for:

"crushing engineers pitilessly under the weight of the bureaucracy"

"Elements of the ITER project have been planned 30 years ahead"

"There is no possible path through ITER and then DEMO to achieve commercial power generation before 2050. None"

"ITER used a legacy superconducting wire in their designs"


> Because planning ahead is required, only existing, established technologies can be used.

You can't plan to use nonexistent technologies when that tech (superconductivity) was so very little understood.

Your criticisms seem unwarranted and unverifiable.

> You can't plan to use nonexistent technologies when that tech (superconductivity) was so very little understood.

That's exactly the problem. No one is allowed to fail so nothing is discovered.

And exactly how is superconductivity so very little understood? It's used in massive projects the world over, from superconducting cables to maglev trains to numerous other things. And none of that was figured out by ITER. ITER is the worst thing to happen to fusion research in decades.

This is a ridiculous response.

> No one is allowed to fail so nothing is discovered.

The 2nd paragraph of the wiki article: "It [ITER] is an experimental tokamak nuclear fusion reactor" (https://en.wikipedia.org/wiki/ITER). It is an experiment; it was never expected to have certain outcomes. You don't seem to understand that.

> And exactly how is superconductivity so very little understood?

That's not what I said. Here "when that tech (superconductivity) was so very little understood"

Note: 'was'. Not 'is'. (see edit below)

As for the progression in their understanding, I'll leave that for a physicist to discuss. You aren't one.

Edit: see https://en.wikipedia.org/wiki/Superconductivity#High-tempera... for a list of recent stuff (eg. last 15 years). And stop shouting about how stupid everyone else is.

And ITER isn't using those newer superconductors. They're using outdated tech before the machine is even finished being built.

> It is not possible to do "innovation" with megaprojects

I generally agree, but I think there are examples of successful megaprojects that did innovate. You could probably consider Gemini + Apollo a single megaproject, as they were designed to run consecutively, building towards the Moon.

I think the key factor is not the size but the type of management: Apollo, SpaceX (and the Manhattan Project, for that matter) all had technical leadership, and were headed by people who could basically have pulled the whole project off with a sufficient number of clones of themselves.

ITER is under MBA/Political management. Even if the management team had all the time in the world, they could not complete the project on their own.

Especially in ground-breaking projects, this makes a huge difference. With many options open, deciding on how to proceed requires cross-subsystem trade-offs and weighing many different risks and challenges against each other: things that will grind any non-technical leadership to a halt as they request piles upon piles of reports from their underlings.

I don't think it is fair to say that Apollo could have been pulled off by a few people of the right kind. It was a massive undertaking spanning almost all fields of science and engineering. Just read a book like _Stages to Saturn_ to see what an undertaking it was to just conceive, design, implement and _manage_ the Saturn rockets.

But I’m more curious about your second statement. Could you back that up with some evidence? How is ITER’s management and leadership “MBA/Political”? Is it really different from the other projects’? Such a statement needs to be backed up with some evidence.

> But I’m more curious about your second statement. Could you back that up with some evidence? How is ITER’s management and leadership “MBA/Political”? Is it really different from the other projects’? Such a statement needs to be backed up with some evidence.

Maybe a bit conspiracy-minded, but maybe the real purpose is to give nuclear physicists something to do, so that the brains needed to make nukes are still around in case we ever need to start doing that again?

This is absolutely an element at play at NIF. In fact not just the physics brains, but the experiments themselves, play a role in getting around test bans. To the point where nuke stewardship (ensuring aging warheads still will work) interferes with fusion physics.

If that's the case, it's odd, because a tokamak fusion reactor is totally unrelated to how a nuclear bomb works.

Now imagine Apollo had been planned as a gargantuan Warp 6-capable starship with a 500-man crew, instead of the several smaller rockets progressively building towards an achievable goal.

It would probably be more correct to say there is no incentive.

Gemini and Apollo had a nice one: the space race, or more specifically, the fear of being bombed into oblivion from orbital stations without being able to do anything about it.

There's just not much incentive today; it makes business sense to stretch that funding out as long as possible.

Did anyone say CERN?

The corollary to your message is "don't build mega-projects," which doesn't make sense to me.

And no, Tesla didn't start by selling batteries: they started by selling expensive cars which didn't need batteries at the scale of a mass-produced car, which rather undermines your whole initial message.

Also, Tesla didn't start by using any fancy battery chemistries, they started by packaging cheap "shitty" commercial cells alongside enough monitoring and management hardware to keep them from catching fire or damaging themselves.

> The corollary to your message is "don't build mega-projects" which doesn't make sense to me.

I think the two posters meant that the number of people working on a single thing is negatively correlated with innovation. The analogy was that Tesla is less groundbreaking now compared to when it was experimental.

> The corollary to your message is "don't build mega-projects"

Megaprojects are optimized for delivery and for leaving less to luck, which is beneficial in most cases. But the posters are saying that, given the vast scope of things that could be tried in the fusion space, megaprojects likely won't touch most of them.

I don't think that's necessarily the case.

Instead, you approach the megaproject as a series of prototypes, spending large sums of money in the hope of a quick advancement-and-iteration cycle.

That already happened in the ITER development process, with early design versions sitting on people's desks. Sometimes you get an awesome breakthrough that makes everything work; other times, not so much.

ITER is designed to be a test bed, not the final design. Plenty of advancements have taken place since the design was essentially finalized, but none of them actually help test things like heavy neutron bombardment of a lithium blanket.

That really isn't the major problem. The major problem is that a large institutional effort has gone down a blind alley. It's not that it's making inefficient local choices; the global choice to pursue fusion at all is to blame. The badness of the top level goal will seep down to the rest of the program.

Could you elaborate on why you consider fusion as a dead end?

For fundamental reasons (see footnote), the power density of a fusion reactor will be bad compared to a fission reactor. And because the fusion reactor will also be much more technologically intricate, it will necessarily be much more expensive. Since the balance of plant will be similar (or favorable to fission, too; no cryo plant or tritium recovery needed, for example), fission will be cheaper than fusion. And fission is already losing economically.

(*) The fundamental reason is that the power from the fusion reactor must radiate through the wall of the reactor, while the power from fission is transferred into coolant flowing through the reactor. As a result, the volumetric power density will be lower by roughly a factor of (diameter of fusion reactor) / (spacing of fuel elements in fission reactor), for equal power/area through the walls of the fusion reactor and fuel rods. ITER, for example, has a power density 400 times worse than a commercial PWR primary reactor vessel.
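To put rough numbers on the power-density argument, a sketch using public round figures: ITER's design target of ~500 MW of fusion power with ~840 m^3 of plasma, and on the order of 100 MW/m^3 for a PWR core. These are assumed ballpark values; the factor-of-400 above compares against the whole primary vessel, not just the core:

```python
# Volumetric power density: ITER's plasma vs. a fission reactor core.
# All three figures are public round numbers, used only for an estimate.

ITER_POWER_MW = 500.0          # design fusion power
ITER_PLASMA_VOLUME_M3 = 840.0  # plasma volume
PWR_CORE_MW_PER_M3 = 100.0     # order of magnitude for a PWR core

iter_density = ITER_POWER_MW / ITER_PLASMA_VOLUME_M3  # ~0.6 MW/m^3
print(PWR_CORE_MW_PER_M3 / iter_density)  # fission core ~170x denser, before counting the rest of the machine
```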

You make good points, but I think you should revisit your assumptions.

Fission is so expensive mainly because of safety and regulatory concerns: radioactivity scares people, and so safety requirements are cranked up to 11. Everything is overdesigned just so the current regulator is confident enough (the next one might one-up it anyway), lots and lots of concrete and steel are used for containment, etc. Additionally, the whole construction process is hopelessly bureaucratic and takes forever, requiring inspections of inspections, and redesigns for things that in, say, coal plants would just be redone differently by the contractor on site. We most definitely could make fission significantly cheaper, and in fact it was in the past.

Now, fusion does not have similar safety requirements. If the plant blows up for some reason, there's no significant radionuclide contamination. Therefore, there's little reason to require safety standards significantly higher than those for coal plants (the boilers of which can also catastrophically blow up). This makes it significantly cheaper, due to reduced material use and construction time/labor for containment, and reduced regulatory delays.

Your point about power density is also good, but I don't think it matters very much in practice. Sure, a fusion reactor might have significantly lower power density than a fission reactor, but so what? Renewable plants are even worse when it comes to power density, yet that doesn't stop them from being rolled out. We don't lack space for fusion plants, even if they are much larger than equivalent fission ones.

Fission is expensive because all the components have to be highly reliable, for safety reasons.

But in a fusion reactor, all the components also have to be highly reliable, because they will be so radioactive that repair will be difficult or impossible. The stuff that's hot in a fusion reactor will be far more complex than the stuff in a fission reactor, which is technically very simple.

The reliability is costly, regardless of the reason for it.

Of course power density matters. The cost of manufactured items is related to their size, and fusion reactor cores will be far larger than fission reactor cores (as well as being far more intricate). Are you saying this 10-100x larger size will NOT be reflected in the cost?

Renewables have lower power density, but they also allow omission of entire parts of a thermal power plant: no steam turbines, no heat exchangers, no cooling systems. They also tolerate far less reliable components, because failures affect just a small part of the overall output of a field of turbines or PV modules.

Yep. We should be focusing on the monkey, not the pedestal. But it's hard to convince an unenlightened bureaucracy.

Thanks for the rabbit hole. It made me discover an interview [0] with Google's captain of moonshots. In it he also discusses other interesting moonshot management strategies besides the monkey-pedestal one.

[0] https://www.wsj.com/video/astro-teller-on-moonshots-for-alph...

I worked as a contractor at X for two years, and one thing that really contributes to their success is the fat dump trucks of money they have to spend on things. Astro is a super nice guy, but I worry that X gives themselves lots of credit for being master innovators when in reality their secret may be that they have a firehose of hundred-dollar bills they can point at things to see what sticks.

Tbh I may be way off base but I was really surprised at how much money they spent on things.

I also worked at an institute which badgered us to go after 'low-hanging fruit'; we built a whole bunch of pedestals and ended up with a TV screen and a CGI monkey reciting Shakespeare on a shoestring budget.

I would have worked on training the monkey to press buttons, the same way they domesticated the fox in Siberia, but it would have taken too long and they didn't want to deal with the hard regulatory stuff.

Strange; we used to be able to pull off megaprojects like the Manhattan Project or the Apollo Program.

The Manhattan Project actually had a relatively small number of physicists driving the program. Additionally, they decided to allow failures: there were dozens of sub-projects that all ended up as failures that no one has really heard about.

That’s a really interesting point. Do you have info on those failed projects? Would love to read more about that.

The problem is that fusion is not rocket science. It's way harder.

By the time SpaceX was founded, we'd been shooting commercial rockets into space for half a century, and had flown humans to the Moon multiple times. Applying modern technologies allowed the next logical step, a cheaply reusable first stage (expensively reusable space systems had existed for decades, like the Space Shuttle).

Fusion is nothing like that.

We can reliably detonate fusion-based nuclear bombs. That's mostly it. We don't have a reliable reactor that runs anywhere close to breaking even (a fusor does not count). We have no experience burning plasma for more than a few minutes before losing it to internal instabilities. Certainly we have nothing resembling commercial energy generation fueled by fusion. A plethora of approaches exist because no common design is known to reliably work.

This is not a moment for a SpaceX. This is not a moment for an Apollo effort. I'm not even sure it's a moment for a Manhattan project type of effort. We still seem to know too little to spearhead an engineering effort. I'd be glad to receive a qualified refutation of my points.

Watch Dennis Whyte's presentation: https://www.youtube.com/watch?v=KkpqA8yG9T4 The basic problems have all been solved; the science has been solved too. Tokamaks have already made net fusion power. The only difference between JET, which already succeeded, and ARC is using magnets with 2.6x the field strength. The B^4 power scaling gives you about 50x the power, so ARC will generate 500 MW of power, whereas JET (remember, already-proven engineering) generated 10 MW.
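The B^4 claim can be checked with one line of arithmetic; these are just the commenter's own numbers plugged in, not figures from the article:

```python
# Fusion power in a tokamak of fixed size scales roughly as the fourth power
# of the magnetic field, so 2.6x the field multiplies power by 2.6^4.

field_ratio = 2.6
power_gain = field_ratio ** 4   # ~45.7x
arc_power_mw = 10 * power_gain  # scaling up JET's ~10 MW

print(power_gain, arc_power_mw)  # ~45.7, ~457: consistent with "about 50x" and ~500 MW
```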

So without a doubt, this is just an engineering problem at this point. A prototype device that didn't have to be maintained could be built exactly like any other tokamak, just with REBCO magnets, and you'd have the required fusion power.

To get the required electric power, and to allow for maintenance, you need to add the FLiBe liquid blanket, replaceable toroid sections, and a heat exchanger. But again, these are known engineering problems with known solutions, akin to knowing that landing a rocket is possible but still having to figure out the grid fins, heat shield, landing legs, and guidance software.

Fusion devices on a small scale have been proving the plasma physics since the 90s. We don't have fusion yet simply because of people being conservative and sticking with traditional "known to work" magnets, which is why ITER is a huge, hulking monstrosity.

As Dr. Whyte says, cost scales with R, and R can be reduced by a factor of 10 from ITER simply by using different magnets, so ARC will produce the same power as ITER while being significantly smaller and cheaper.

The basic papers have been peer-reviewed in Nature, and the science and engineering were found solid, and thus the economic arguments as well. Ergo, this is exactly like the SpaceX scenario vs. ULA and NASA (e.g. SLS): small allows fast iteration and fast innovation. ITER is going to spend $20 billion and 20+ years of development just to turn around and realize that the follow-on to ITER will need to use HTS magnets and a smaller design.

> Tokamaks have already made net fusion power.

Nope. The record for Q is 0.67, set by JET in 1997. The whole justification for ITER is that it would be able to do Q > 1.

Why do you think that net power has been achieved?


> The basic problems have all been solved

The basic problems of power density and complexity have not been solved. Putative REBCO magnets do not solve these problems. Yes, these are engineering problems, but engineering problems are perfectly capable of being fatal. After all, fission has faced only engineering problems since 1942, and yet still has failed.

You could make the argument that fission has succeeded on a technical front (large amounts of electricity are being generated in multiple countries at this precise moment) and that most of its problems are social.

No, the primary problem of fission is economic, not social. That's why it's been a global failure. It simply has not been able to compete with the alternatives.

And this is why fusion is such a boondoggle. It's similar to fission, only even more complex and expensive. It's like the opposite of Keep It Simple, Stupid.

No. Pressurized Water Reactors (PWRs) are large, expensive, dangerous, etc. Molten Salt Reactors, especially Thorium Molten Salt Reactors (e.g. LFTR), are much, much smaller and passively safe.

What you're missing is that cost scales with volume. If you use a technology that lets you shrink your reactor by a linear factor of 2, you reduce its volume, and hence its cost, by a factor of 8.
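The cube law as a one-liner; the scale factors here are just illustrations:

```python
# Shrinking every linear dimension by a factor s shrinks the volume
# (and, on this argument, the materials cost) by s^3.

def volume_factor(linear_scale: float) -> float:
    """Volume ratio implied by a linear scale factor."""
    return linear_scale ** 3

print(volume_factor(2))   # 8: half the linear size means 1/8 the volume
print(volume_factor(10))  # 1000
```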

Likewise, while ITER is uber-expensive like large PWRs, ARC is 10 times smaller and a factor of 30 or more cheaper. It's also not as complex as some fission reactors, the most complex elements being the HTS magnets and the blanket.

This is the problem with ITER and PWRs, they have convinced the public that nuclear energy is inherently dangerous and expensive and impossible to make substantially safer, cleaner, and cheaper, but that is false.

You should look up the history of PWRs vs. MSRs from the guy who invented both. There's a documentary on the Oak Ridge Molten Salt Reactor Experiment. He himself thought PWRs were too dangerous, requiring enormous size and shielding, but the US government and its cronies preferred PWRs, and so MSRs were abandoned despite successful trials.

The Molten Salt Reactor designs I have seen have lower volumetric power density than PWRs. The MSRE reactor had a volumetric power density of 2 MW/m^3, a factor of 10 lower than a commercial PWR (counting the volume of the entire primary reactor vessel; the power density in the core of a PWR is a factor of 5 higher than that.) Of course that was an experiment, but look at something like Moltex's design and the power density is still lower than a PWR, last I looked.

> Likewise, while ITER is uber expensive like large PWRs, ARC is 10 times smaller, and a factor of 30 times or more cheaper.

The ARC design is smaller than ITER, but it's still very much lower (a factor of 40) in power density than a fission reactor.

Power density isn't the only variable that matters. The difference doesn't matter if the cost to build a Thorium Molten Salt Reactor is much lower, especially in terms of safety and size. MSRs don't need enormous containment vessels, and they also reprocess fuel.

Here's some links showing power densities inline with PWRs, or greater:


"Extrapolating from other liquid-to-liquid heat exchangers, roughly 150 MW/m3 is a practical upper limit to the power density of a primary-to-intermediate salt heat exchanger. For comparison, pressurized-water reactor core power density is ~110 MW/m3." https://info.ornl.gov/sites/publications/files/Pub29596.pdf

The fact is, there is no reasonable expectation of ever getting viable commercial power from a tokamak-style fusion reactor. The density of energy production, in watts per cubic meter, is so low that an impractically large volume of confined plasma would be needed, requiring a reactor many times larger than a fission plant for the same power output. But fission-driven power is itself too expensive to produce to justify deploying on its own merits.

Furthermore, since the power must be extracted by intercepting high-energy neutrons after they have blasted through the plasma focusing apparatus, such a plant would destroy its most expensive parts in a short time, requiring frequent rebuilding -- by robots, because of induced radioactivity at the work site.

There has never been any expectation of practical commercial power resulting from plasma-confinement thermal-neutron fusion projects. Their only plausible budgetary justification is as a jobs program for high-neutron-flux physicists. They are called for by military policy as a population to draw personnel from for weapons work.

Coal has always been cheaper than fission. Solar and wind power costs are much less, and continue falling; and numerous viable energy storage methods, all demanding no fundamental research -- gravity (solid and aqueous), underground and underwater compressed air, LH2, ammonia, methane from captured CO2, even powdered iron -- vie only for which will end up cheapest to deploy and operate.

If a commercial reactor were ever built and made to work, it would take many decades just to break even on the money already spent to date, before beginning to pay down its own construction -- even presuming it could find a market for its output. In practice, without continued public subsidy as long as it operated, its debt would continue to grow indefinitely.

The only responsible course for the future is to cancel all publicly funded thermal-neutron fusion work, immediately.

I think there is a slim chance that nuclear could remain relevant. The key would be greatly reducing the cost of the non-nuclear side of the power plant. Perhaps the best approach would be replacing large steam turbines with far more compact supercritical CO2 turbines. This would require temperatures somewhat higher than from PWRs. And this (not thorium, not waste destruction, nor really even safety) is perhaps the best argument for molten salt reactors.

Nukes remain potentially relevant for space systems.

For example, in the cloud tops of Venus, an air turbine may be constructed from a very large, balloon-supported polypropylene fabric tube with an air turbine on top as its only moving part, and a no-moving-parts naked atomic pile suspended near the bottom, heating air directly -- by neutrons colliding with atmospheric CO2 -- that rises to drive the turbine. Some of the air would become radioactive, but so what?

In free-fall, a reactor at the end of a long-enough tether would present no risk of environmental contamination or user safety, and so could dispense with both the containment vessel and shielding. Even there, though, pB seems a more attractive goal than thermal neutrons, because destroying your plasma focus apparatus every few months would get tedious.

Nukes are absolutely needed for any deep space missions beyond Mars. Unless you're going with laser propulsion or solar sails.

Moreover, if you can fit a compact reactor (fission or fusion) on a space ship, then you can run nuclear thermal (Isp > 800), or nuclear ion engines with extreme efficiency (Isp > 15,000!). Considering NASA's Kilopower reactor weighs 2 tons, assume a dry mass of 10 t and a payload of 2%, and you end up with a 500 t ship (wow, do we even have 498 metric tons of xenon in the world?), but with a delta-v of 500,000 m/s. That gets you to a Pluto flyby in 99 days. (I didn't count the Kilopower reactor fuel weight; I'm not sure how much uranium is needed for a 15,000 Isp burn, I was just interested in how much delta-v 15,000 Isp buys you.)
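The delta-v figure above can be sanity-checked with the Tsiolkovsky rocket equation. This sketch just plugs in the numbers the comment assumes (Isp, wet and dry mass are the commenter's assumptions, not measured values):

```python
import math

# Sanity check of the delta-v claim via the Tsiolkovsky rocket equation.
# All inputs are the figures assumed in the comment above.
g0 = 9.81        # standard gravity, m/s^2
isp = 15_000     # specific impulse of the hypothetical ion engine, seconds
m_wet = 500.0    # fully fueled mass, tonnes
m_dry = 10.0     # dry mass (reactor, structure, payload), tonnes

# dv = Isp * g0 * ln(m_wet / m_dry)
dv = isp * g0 * math.log(m_wet / m_dry)
print(f"delta-v = {dv / 1000:.0f} km/s")  # ~576 km/s, in the ballpark of the 500 km/s claimed
```

So the quoted 500,000 m/s is actually slightly conservative for that mass ratio.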

SpaceX starship, if it works, could launch this in about 4 refuelings. (>100-125t to LEO)

Space applications cannot justify any significant use of nuclear power on Earth. To the extent space uses are valuable, the relevant stakeholders can pay for it.

Nuclear reactors would be wonderful for use on Titan, btw. Simple open Brayton cycle systems would be highly effective at quite reasonable core temperature.

It was mainly an engineering issue: materials, manufacturing, and assembly. It is too big to fail right now, and most issues have been solved. It never was as simple as webinars from scientists would make it seem. The funniest joke among nuclear engineering insiders is that the marvels of paper reactors almost always make real reactors seem like glorified boilers.

ITER's too big to fail. But again, the parameters were known. The magnets give you the field strength, the field strength tells you the radius, the radius^3 gives you the cost, and also some measure of the construction time.
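The back-of-envelope scaling chain described above can be sketched numerically. The exponents below are the standard textbook scalings (fusion power density ~ beta^2 * B^4, machine cost ~ R^3 from the mass of magnets and structure), not ITER's actual cost model:

```python
# Rough tokamak scaling sketch (textbook exponents, arbitrary units):
#   fusion power density ~ beta^2 * B^4
#   machine cost         ~ R^3  (mass/volume of magnets and structure)

def relative_cost(R):
    """Cost relative to a machine of major radius R = 1."""
    return R ** 3

def relative_power_density(B, beta=1.0):
    """Fusion power density relative to B = 1, beta = 1."""
    return (beta ** 2) * (B ** 4)

# Halving the radius cuts cost ~8x; the lost volume can be bought back with a
# modest field increase, since power density goes as the fourth power of B.
print(relative_cost(0.5))            # 0.125 -> ~8x cheaper
print(relative_power_density(2.0))   # 16x the power density
```

This is why, in the argument above, improving B is so much more valuable than growing R.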

This is one of the great things about Elon Musk -- he looks at things from first principles. When it looked like he needed 1000 Gigafactories to reach his goal, he went back to the drawing board on decades of battery manufacturing, to come up with the Terafactory.

When carbon composite rockets looked to be too hard to manufacture, and have too many other detriments, despite the clear wins in strength-to-weight, he dropped it and went with stainless steel. I'm pretty sure if he were staring at the basic tokamak equations, he would long ago have seen the economic and time dependence on R, and the massive benefits of improving B, and that would have driven him to look for alternatives.

Too many researchers engage in sunk cost / dollar auctions, doubling down on obvious wrong approaches, because they've already spent so much.

This is also why DeepMind won CASP again, because rather than try to iteratively improve their 2018 result, they dropped it as a dead end and restarted with a completely new approach, even though the 2018 approach was also a revolutionary breakthrough.

What I'm getting at is, being able to let go of your previous projects if they are stalling, with huge cost over runs, and a mediocre future return, is a value we should seek in fusion research, and elsewhere.

Ok but ITER was much more than engineering: soft diplomacy, supranational bureaucracy, manufacturing modernisation. Musk comes from a different era and a different environment but, again, launching rockets successfully, with humans in them safely going and coming back, was achieved well before him. Stable nuclear fusion is still unclaimed as a milestone, even if sustainable nuclear fusion might well come from different, smaller approaches the likes we are seeing from United Kingdom and South Korea.

Going back to first principles is how you conclude DT fusion is a dead end. And this has been known for decades.


(That prescient article has stood the test of time, except that advanced fuels were mostly ruled out also.)

Except the article is wrong, because it makes fundamental assumptions about the vacuum wall and lithium blanket, and even the superconductors. The ARC reactor, for example, doesn't use a solid sandwich style blanket, it uses a volumetric blanket of FLiBe, a liquid that acts as the moderator that absorbs neutrons, the breeder, the magnetic shield, and the heat exchanger -- and has proven to work in molten salt reactor prototypes. And unlike the designs he claims that are "essentially unchanged", this significantly reduces the cost and mass of the reactor, reduces the so-called "danger" of lithium, and leverages the fact that high energy fusion neutrons travel further, as it increases the efficacy of the liquid blanket.

The claim that neutrons will damage the metal structure is also handled by many aspects of the design: easy reparability thanks to modular, segmented magnets, being completely submerged in the liquid blanket, low activation steel for structure, and the pulse design which runs the reactor in a way that reduces overall temperature of the walls.

Plus, he brings up the ole aneutronic fusion dream, except that creating a confined Boron-11 plasma is much more difficult, outside the realm of tokamaks, and typically proposes using laser inertial confinement fusion at the petawatt scale.

More than 20 papers were published on ARC, heavily vetted in journals like Nature, and pretty much all of the peer review concluded the design will work, and if it does work, then it will be much cheaper than ITER, simpler than ITER, and thus refute your paper.

We'll know in about 4 years.

No. The article makes no assumptions about superconductors. The argument makes no assumption even about plasma confinement. It's all about the limits on areal power density at the reactor walls.

The conclusion he reaches, that fusion reactors will have a volumetric power density at least an order of magnitude lower than a fission reactor, is not contradicted by ARC. The volumetric power density of ARC (from the ARC paper I linked to elsewhere in these comments) is 0.5 MW(th)/m^3 (counting the volume of the entire reactor, not just the plasma) vs. 20 MW(th)/m^3 for the primary reactor vessel of a PWR -- a factor of 40 worse.
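The factor of 40 is just the ratio of the two volumetric power densities quoted above (both figures are from the comment, not independent measurements):

```python
# Volumetric power density comparison, figures as quoted above.
arc_density = 0.5   # MW(th)/m^3, whole ARC reactor volume
pwr_density = 20.0  # MW(th)/m^3, PWR primary reactor vessel

factor = pwr_density / arc_density
print(f"PWR is {factor:.0f}x denser than ARC")  # factor of 40
```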

His suggestion to look at aneutronic plasmas didn't pan out, as you note, but that doesn't save DT fusion. I'll note that it's his PhD student Todd Rider who shot down the non-Maxwellian schemes that aneutronic fuels would have needed. After that, Lidsky switched to working on fission for the rest of his career and life.

The latest ARC claims from Whyte's last presentation was 3MW/m^3 density, and that's with currently achievable magnets, not those people believe are ultimately achievable. Counting just the PWR reactor vessel, and not the entire containment facility, seems a huge cheat.

We're talking economics of energy. PWR fission reactors have enormous footprints, averaging 1 square mile. An ARC reactor wouldn't need anywhere near that size. The very nature of using pressurized water means your containment facility has to be a factor of 1000 larger than the reactor vessel to handle a flash boil-over.

Current costs to build a 1GW fission plant are $6-9 billion. Predicted costs to build a 200MW ARC reactor are $4-5 billion, so per megawatt the ARC reactor costs roughly 2x more at the low end and 4x more at the high end (e.g. build 5 ARC reactors to equal one fission plant). The economics are still viable, and your costs for storage, disposal, and de-commissioning are far lower, as is your land footprint. Note that these ARC costs are for the first reactor, include the sunk development costs, and don't include the economies of scale that would come from dropping HTS costs.
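A quick $/MW sanity check of the figures quoted above; the exact multiple depends on which ends of the two cost ranges you compare:

```python
# Back-of-envelope $/MW comparison using the figures quoted in this comment.
fission_cost = (6e9, 9e9)   # $ for a 1 GW fission plant
fission_mw = 1000
arc_cost = (4e9, 5e9)       # $ for a 200 MW ARC reactor (first unit, incl. development)
arc_mw = 200

fission_per_mw = [c / fission_mw for c in fission_cost]   # $6M-$9M per MW
arc_per_mw = [c / arc_mw for c in arc_cost]               # $20M-$25M per MW

# The multiple spans ~2.2x (ARC low vs fission high) to ~4.2x (ARC high vs fission low).
print(min(arc_per_mw) / max(fission_per_mw), max(arc_per_mw) / min(fission_per_mw))
```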

> It's all about the limits on areal power density at the reactor walls.

No, he makes lots of assumptions about how the magnets are shielded, how breeding works, and how heat transfer works. All of these are addressed by FLiBe, there's no "structure" to the heat blanket to damage, it acts simultaneously as radioactive shielding for the magnets, heat transfer, and a breeder. The "blanket" in ARC isn't a surface sandwiched with a super structure, it's a volume that's continuously pumped and recirculated.

He also makes the implicit assumption that reactor wall damage is a huge problem. Instead of seeing it as a problem, you should see it as a solution, like an ablative heat shield on a rocket or capsule. As long as the cost of maintenance or replacement is low, and reactor downtime is limited, it is not a problem. Current estimates for ARC, which is designed to be modular so it can be easily maintained, are that the reactor walls will last about 1 year.

He doesn't even mention the biggest unsolved engineering problem, removing the "ash" (Helium) that accumulates in the plasma.

His article has all the trappings of reusable rocket naysayers in the 80s and 90s, who claimed reusables could never work, because carrying wings, landing legs, heat shielding, and structural reinforcement to make rapid reusability would cut down on your usable payload, and thus doom reusable rockets to always be substantially inferior. And this is because of an obsession with a single variable, like power density, or payload fraction, and ignoring the end to end economics of the entire system, including safety margins, maintenance, footprint, etc.

And that's why I believe ARC will prove Lidsky wrong.

> The latest ARC claims from Whyte's last presentation was 3MW/m^3 density,

That's per volume of PLASMA, not volume of REACTOR. Since we're talking about the size of the things we have to pay for (the reactor) the latter is appropriate.

> Counting just the PWR reactor vessel, and not the entire containment facility, seems a huge cheat.

Fusion reactors will also require containment vessels, to prevent escape of tritium in accidents (unless you're ok with an accident rendering ground water in a large area around the reactor undrinkable for decades). The containment vessel will have to be large enough to contain all the cryogens heated to accident temperature. Unlike in a PWR like the AP1000, these volatiles cannot be condensed by water cooling. In ITER, for example, complete volatilization of the cryogens would require several times the volume of the AP1000 containment building.

The ARC design adds to accident fun by having a large volume of titanium hydride shielding near the molten salt. TiH2 decomposes to Ti and hydrogen gas at 450 C, well below the temperature of the molten salt blanket. Fully decomposed, this TiH2 would release 15 tonnes of hydrogen, occupying 180,000 cubic meters at STP.
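The 180,000 m^3 figure checks out under the ideal gas law, assuming the 25 C / 1 atm convention for STP (the 15-tonne mass is the figure from the comment above):

```python
# Volume of 15 tonnes of H2 gas at 25 C and 1 atm (ideal gas).
mass_h2 = 15_000_000        # grams: 15 tonnes of hydrogen
molar_mass_h2 = 2.016       # g/mol
molar_volume = 24.45        # L/mol for an ideal gas at 25 C, 1 atm

moles = mass_h2 / molar_mass_h2
volume_m3 = moles * molar_volume / 1000  # liters -> cubic meters
print(f"{volume_m3:,.0f} m^3")           # ~182,000 m^3, matching the ~180,000 figure
```

(Using the older 0 C convention, 22.4 L/mol, gives ~167,000 m^3; either way the order of magnitude stands.)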

Beyond that, the containment building around a fusion reactor will have to mitigate tritium leakage by many orders of magnitude. This will be expensive (in particular, polymer seals cannot be used on penetrations, as they are permeable to tritium).

ARC will also require a large volume for disassembly of the reactor for vacuum vessel replacement, and for remote control demolition of activated vacuum vessels for disposal. All that volume will become contaminated and off limits to human access.

> PWR Fission reactors have enormous footprints, averaging 1 square mile. An ARC reactor wouldn't need anywhere need that size.

Really good farmland in the US is less than $10K/acre, or about $6M/square mile. Cheap range land is an order of magnitude less; really useless land another order of magnitude cheaper. The cost of land there is a very minor part (0.1%) of the cost of the power plant.


> No, he makes lots of assumptions about how the magnets are shielded, how breeding works, and how heat transfer works.

Again, you have not understood his argument. His argument would apply even if you had 100T magnets made of unobtainium and a plasma configuration with beta = 1. All that he requires is an upper limit on how much energy per area can be transmitted through the surface of the reactor. This is a function of the materials of the first wall, and has absolutely nothing to do with magnets or breeding.

> Predicted costs to build a 200MW ARC reactor are $4-5 billion

If you are referring to the cost estimate from the arxiv paper, that is just the estimated cost of the reactor itself, not the power plant containing the reactor. I will admit I found the estimation methodology in that paper very cursory, so it's possible they were overestimating the cost (80% of the cost was the stainless steel magnet support structure.)

A fission reactor core is much cheaper than that estimate, per MW.

> He doesn't even mention the biggest unsolved engineering problem, removing the "ash" (Helium) that accumulates in the plasma.

And why should he? I mean, why does he have to kill the corpse a third time (after the power density and reliability arguments)?

> And that's why I believe ARC will prove Lidsky wrong.

As I said earlier, the power density figures from ARC show he was right.

> By the time SpaceX was founded, we've been shooting commercial rockets into space for half a century, and had flown humans to the moon multiple times.

I think that's the point. Ground-breaking innovations like fusion require on-ramp costs, with questionable returns, that no private company is willing to invest in.

Putting a man in space was proven decades ago; it seemed natural to commercialize it once there was a business model, even if only by collecting government subsidies. Not to diminish the outstanding achievements of SpaceX.

I disagree in a small way.

Tesla / SpaceX / Elon focused on ground-up physics principles. They asked things like: how can we make a better battery pack, and will the math then work for a good EV? They showed to their satisfaction that the answer was yes, and built from there. Their battery / powertrain work has been consistently good since before any of the huge capital efforts, and the path forward was pretty well defined.

Same with their rocket engines. They iterate and qualify pretty quickly.

ITER and these folks are spending billions on what is actually ancient tech with really no realistic (I don't think) path to sustainable commercial power even theoretically in any reasonable time horizon. This is the Space Launch System approach, gear up and go on old tech.

We do need someone in a lab (Tesla / SpaceX style) iterating, and then proving out key physics / components. The problem with ITER is that you CANNOT stop funding it; it just keeps gobbling up all the money because of the number of jobs now tied to it. It's basically a jobs machine.

At SpaceX it doesn't feel like there are a lot of sacred cows. These mega projects feel like pure sacred cow sometimes. It's private / personal capital driven.

The ground-up physics principles for fusion give you ITER at best. Every time someone has a brand new concept, they get some promising results and then reach the same problem: we need to build a vacuum chamber 10 meters in diameter to see if it'll produce power. And then you're back to ITER, which is exactly what it's trying to do (the internal diameter of ITER is about that scale).

We know rockets work inside an envelope we can see; it's a matter of how to do it cheaply. What we know, today, is that fusion doesn't work in envelopes much smaller than ITER.

I thought the arguments for SPARC/ARC were exactly this though, that the approach for ITER is the one we "know" works - but in the time it's taken we now have the ability to make significantly stronger magnets and so should be able to shrink down the overall size dramatically. And size = cost = organisational size = time.

It's a design we've modelled and predicted should work. Since we've never actually built anything close to it, we have no way of knowing whether those models were even close. If ITER shows results that are even close to breakeven you can expect a rush of newer designs using technological advances to shrink scale and improve performance.

Besides, ITER is also meant to help develop experience in handling large plasmas for extended periods of time, that knowledge will transfer to other designs.

ARC actually is a pulsed design; they design around handling plasmas for long periods of time, and using shorter durations is actually a net win for efficiency and reliability.

Also, ITER won't demonstrate anything until 2035. By then, several smaller scale designs would have already demonstrated break even.

It seems like there are a dozen or more fusion startups that are pursuing concepts that are much smaller than ITER. I'm no nuclear physicist, but it seems like there might be room for innovation still.

The claim isn't that there is only one way to build fusion reactors, the claim is simply that there is only one well-developed design.

Small prototypes don't mean immediate progress in fusion, the original tokamak was hailed as great progress in 1958 and it was less than a metre across.

Until we have perfect understanding of Magnetohydrodynamics and materials science one simply cannot reliably predict the behaviour of completely new devices.

These prototypes aren't scaled down versions of ITER, the field strength of ARC is of the same magnitude of ITER, it's capable of the same pressures and confinement, only in a much smaller volume.

It's like saying a 5nm CPU design is somehow the same as a 32nm CPU because they both have the same die size, even though the 32nm version has far fewer transistors.

I never said they were versions of ITER; my point was that the new prototypes themselves need to redo most of the upscaling work that's gone into tokamaks. As for ARC, that's simply a tokamak that can't be built yet because the tech isn't ready; not sure what that is supposed to contribute. Did you read the rest of this thread?

To summarize:

. <-- the point

                             --> your comment

Which tech isn't ready? The plasma behavior for that volume was already shown by JET at 1.5 bar. MIT's Alcator C-Mod achieved a record 2-bar plasma pressure that won't be beaten by ITER for 15 years. ARC also predicts a 2-bar plasma pressure. The difference is, they did it with non-superconducting (copper) magnets, and paid a huge price in resistive losses. ARC's HTS magnets won't suffer the same problem, will have triple the field strength, and will be structurally stronger.

ARC is doing what JET and Alcator C-Mod did (with a radius very similar to JET's), just with a much more power-efficient confinement system, so the input power is lower and the magnetic field is a factor of 2.6 stronger. The basic physics show that stronger fields actually make the plasma better behaved, not worse, so the major variable changing vs. JET (a proven existing tokamak with Q=0.67) is not likely to lead to surprising results.
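The argument for stronger fields can be made quantitative: at fixed beta, the achievable plasma pressure tracks the magnetic pressure B^2/(2*mu0), and fusion power density scales roughly as pressure squared, i.e. B^4. A sketch (the 9.2 T value is the on-axis field reported in the ARC paper; treat it as an assumption here):

```python
# Field-strength scaling at fixed beta:
#   plasma pressure      ~ B^2 / (2*mu0)   (magnetic pressure)
#   fusion power density ~ pressure^2 ~ B^4
mu0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A

def magnetic_pressure(B):
    """Magnetic pressure in pascals for field strength B in tesla."""
    return B ** 2 / (2 * mu0)

field_ratio = 2.6  # ARC field relative to JET, per the comment above
print(f"pressure margin:      {field_ratio ** 2:.2f}x")   # ~6.8x
print(f"power density margin: {field_ratio ** 4:.1f}x")   # ~45.7x
print(f"at 9.2 T: {magnetic_pressure(9.2) / 1e6:.1f} MPa magnetic pressure")
```

A 2.6x field increase thus buys a large confinement margin without growing the machine, which is the whole point of the HTS-magnet approach.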

Tesla/SpaceX are not doing anything close to the research needed for plasma discovery; they apply industrial production technologies that already existed. Some of these technologies, like space launch, existed for 50 years before SpaceX started. If you're looking for a model for plasma exploration, it should be closer to the Manhattan Project / Man on the Moon than the yesterday's-tech that SpaceX deals with.

SpaceX is very nearly an exercise in how messed up the government procurement and funding process is in the US. A lot of SpaceX's innovations are things like "maybe you friction stir weld aluminum panels together rather than machining a solid block into a tube". This is innovation that could've been done anytime in the past few decades by the big players, but they never had an incentive.

Even the Raptor engine, though it was never widely used, was initially built by the Russians decades ago. The concept is known to work.

SpaceX's innovation was looking at the market and deciding that Iridium was right but about two decades too early. A lot of people in the industry (and a number of governments as well) lost big when Iridium went bankrupt, and quite understandably the large players were content to stay in their lane. After all, ULA came about basically because the DoD needed two technologically diverse launch platforms, but wasn't willing to commit to the launch cadence needed to keep both commercially viable under separate parent companies. If there hadn't been a future with customers ready to fly, and launching enough to survive a couple launch failures, SpaceX just would have been the upstart with high insurance costs.

Hardware-wise, SpaceX did a great job gathering the best of what had already been done and proven. Friction stir welding? Used to build Delta II, Delta IV, Atlas V, Space Shuttle tank and more. Pintle injectors? Lunar Excursion Module. Semi-balloon tanks? Proven on Atlas I and II, and Centaur. Then, once they had something that flew, and a contract from NASA to keep it flying, they iterated on it to make it reusable. Much like the story of Boeing with the KC-135 and 707, a key early contract was enough to support the program and subsidize further development.

> Same with their rocket engines. They iterate and qualify pretty quickly.

They iterated starting from a LOX/RP-1 rocket engine cycle that has been proven to work and in widespread use for half a century. Fusion is totally different.

Look up SPARC and CFS. It sounds like they're doing exactly what you're describing.

I don't think SpaceX came into play when it was "may or may not work"; the tech was mostly developed by NASA and others over decades.

SpaceX's value add is eventually providing economies of scale.

> SpaceX's value add is eventually providing economies of scale

This has become a popular line of contrarian bunk as of late.

I am an aerospace engineer. I am also a Tesla valuation sceptic. The work SpaceX has done is revolutionary.

In materials, numerical methods, logistics and technology, they have pioneered. Common knowledge in 2001 was reusable launch could not compete with expendable vehicles. SpaceX overturned that paradigm with a fraction of the competition’s cash. Ignoring that is to reduce the problem to a simplistically technical level that detaches entirely from reality.

> Common knowledge in 2001 was reusable launch could not compete with expendable vehicles.

It was well argued by the alt.space community, the likes of XCOR, Beal Aerospace, Armadillo Aerospace, Scaled Composites, etc., that this is not the case. SpaceX is the most successful of this line of groups, and better equipped than many, though not all.

> The work SpaceX has done is revolutionary.

Yes, but mostly not in technology; rather in execution and optimization in an unusual market area. Success over the space dinosaurs says more about the sad state of the competition, though Elon is correct in identifying the deficiency and improving on it.

> It was well argued by alt.space community, the kinds of XCOR, Beal Aerospace, Armadillo Aerospace, Scaled Composites etc. that it's not the case.

They had a hypothesis. It was untested and, by any reasonable definition of common expert knowledge, a long-shot bet.

Saying this list’s existence disproves SpaceX’s technological work is like arguing that home brewers are evidence Apple did nothing groundbreaking. Or that Alberto Santos-Dumont is evidence the Wright Brothers were just incremental inventors.

Yes, they weren’t the only ones on the bleeding edge. But they were right there on it, and doing the lion’s share of the pushing of it.

> better equipped than many, though not all

SpaceX is miles ahead of all of them. Literally, hundreds of thousands of miles of flight time ahead. Ahead in vehicle manufacturing. Number of engineers working on number of problem domains. Hell, out of your list, only Scaled Composites still builds anything.

If we say SpaceX does no science, we must argue that the Japanese and Indian space programs do no science. Which nobody claims, because it’s baseless. Designing new motors and engines and control systems and crew capsules and space suits and hypersonic reentry systems requires new science, technology and production expertise.

> They had a hypothesis. It was untested

We can say Newton had a hypothesis regarding an artificial Earth satellite, which took centuries to observe experimentally. But regarding powered descent: the Delta Clipper flew years before SpaceX's Grasshopper tests, and Northrop Grumman's Lunar Lander Challenge showed it again. The untested part was actually landing the stage of an orbital rocket, a difference in usage, not in fundamental technology. I wouldn't say it was a long shot.

> But they were right there on it, and doing the lion’s share of the pushing of it.

They were practically alone in the commercial space, but technologically their achievements were, I think, more incremental than the Wright Brothers'.

I'm not saying SpaceX did nothing worth mentioning. It's a case, IMO, of standing on the shoulders of giants when the other commercial giants became too lazy to look forward; indeed, the status quo was rather beneficial to Boeing-Lockheed, while Musk was interested in longer-range projects. But Musk wasn't operating in a vacuum by a long shot. Andrew Beal could probably have reached similar results in a slightly different scenario.

> SpaceX is miles ahead of all of them.

"Better equipped" in this context mostly means "had almost a hundred million dollars to spare on first project development". That kind of money wasn't available to XCOR, Armadillo Aerospace, or Masten Space, but it probably was available to Beal Aerospace and certainly to Blue Origin (which is somewhat to the side of this list), maybe to Kistler... It's not about ideas and achievements, it's about starting conditions.

> Hell, out of your list, only Scaled Composites still builds anything.

It's almost two decades since the founding of SpaceX, and this is a capital-heavy area, so naturally only the commercially successful survive. Blue Origin is practically the only exception. That doesn't say anything about technological advantage.

> If we say SpaceX does no science, we must argue that the Japanese and Indian space programs do no science.

SpaceX does applied science, targeted towards rather immediate applications in commercial systems, be that Raptor or Starlink. Here at least we can agree that a degree of scientific work is required for projects of that kind.

SpaceX has done critical work on GPU accelerated computational fluid dynamics for modeling rocket engine combustion: https://www.youtube.com/watch?v=txk-VO1hzBY

And saying this is about money and resources, and not focus and management, ignores the fact that SpaceX spent less than ULA and Blue Origin to develop 2 rocket engines, 3 rockets, three different recovery systems, and 2 different capsules. Blue Origin has been spending $1 billion a year of Jeff Bezos's money, and despite starting before SpaceX, still hasn't reached orbit or beaten the Falcon 1.

Rocket Lab's Electron made it to orbit with $210m of total funding, and they built their own engine with an electric pump. Far smaller than many other companies.

The reason SpaceX is so successful is that Elon Musk isn't in it as a vanity project, or to make money taking a slice of the LEO sat launch market. He has a vision, call it lunacy, of getting huge numbers of people to Mars. Working backwards from that goal informs everything they do, which is why they won't settle for an incremental improvement on what ULA or Ariane is doing.

It's like saying that if you gave equal money in 2001-2006 to someone to build better mobile phones, they would have produced an iPhone, because most of the components were there. I worked in the mobile market at that point, and I can tell you most people were chasing BlackBerry, or imagining slightly better versions of, say, a Nokia Communicator 9000, iPAQ Phone, or OQO. The iPhone took a vision, and someone willing to tell the naysayers and corner-cutters he didn't care about their objections: this is what we're doing.

For TEN YEARS on USENET, I read sci.space as people like Henry Spencer and other luminaries decried reusability. They rightly said that any wings, landing legs, or extra strengthening and margin you add for repeated flights comes out of your payload. The major consensus of the time would have been to tell Musk his ideas won't work and that the physics says disposable rockets are the most efficient.

But marginal cost matters more than Isp or payload fraction, and if you have a system where the majority of your costs are amortized over many launches, lower payload is less relevant; you just do more launches.

Ultimately, SpaceX will prove the naysayers wrong, as surely as it proved the establishment and ULA's allies wrong.

> SpaceX overturned that paradigm with a fraction of the competition’s cash.

How can we say anything about SpaceX's efficiency with cash if there are no public statements available and even investors have trouble getting any financial data at all?

We know how much they've raised, how much they charge for launches, and how many launches they've done.

And we know the VAST amounts of cash devoured by SLS and the traditional folks who critiqued SpaceX approach.

> And we know the VAST amounts of cash devoured by SLS and the traditional folks who critiqued SpaceX approach.

I was going to say to look at exactly its endless money-pit counterpart: SLS from ULA. Nothing will ever compare to that monstrosity that will never work: $7.2 billion and growing [0].

Also, I'm not an engineer, but I went to interview at both Tesla and SpaceX, and SpaceX's internal facilities are jaw-dropping. People love to go on about Tesla's Fremont factory, which is awesome, but I've been to both Fremont and Boca Chica, and Boca is by far more impressive. I lived and worked in Emilia-Romagna in Italy, and also lived in the town nearest Stuttgart in Germany, so I've seen my share of amazing engineering and logistical/distribution feats.

I saw them starting to stack what was to become SN3 (RIP) in real time in the hangar, as I saw SN2 vertical on the launchpad, which still remains one of the coolest things I've ever seen in my life. I was basically leaning up against one of the Raptor engine prototypes they have pretty much just lying around.

I've also caught a few glimpses of the Hawthorne factory (from the outside) when I was charging my Model 3. Lots of cool stuff comes out of there (namely the Falcon 9 and the Merlin and Raptor engines), and they have so much cool stuff just lying at the side of the building. I took pictures of it, but was asked by security to delete them, and I complied.

0: https://oig.nasa.gov/docs/IG-20-012.pdf

> SLS from ULA

Just one note on that: Boeing is the company building the SLS core stage, not ULA. While ULA is partially owned by Boeing (joint venture between Boeing and Lockheed Martin), they operate independently.

ULA can't quite match SpaceX for style or cost these days, but they are consistently much better at what they do than Boeing is. Their launch success record is extraordinary, and their rockets provide certain capabilities that Falcon 9 and Heavy can't match without adding a kick stage to the payload.

Yeah, having taken a tour of their Hawthorne factory, it is awesome. I didn't really grok how large the Falcon 9 was until I stood next to it. It gives you a visceral appreciation of the feat that is landing these pencils on their erasers, when the pencil is as tall as some smaller highrise buildings.

> Yeah, having taken a tour of their Hawthorne factory, it is awesome.

How long ago? Did you get to hang out and eat at the mezzanine?

As for how daunting it is: maybe it's just because I've been into motorsports for most of my life, and aerospace is the industry people most often transition into from automotive, but I knew (as much as anyone not hands-on with the project could, I suppose) what an immense feat that was. Also, I was at the Mars Desert Research Station and got to ask as many questions as I wanted about it, and I'm a member of The Mars Society.

BUT I still get giddy and get goosebumps going back and watching highlights of the first successful landing, as I watched it on the streamcast with a friend who is now doing his Masters in Iceland and wants to be in the first wave of Martian colonizers because of that event.

I also saw the first re-landing mission at Vandenberg in 2018, the one that lit up the sky and made people think it was 'aliens' all over SoCal. We saw it from the Eagle's Nest, where a bunch of NASA and SpaceX guys watch from, and met one of the non-crew Dragon capsule pilots. You could still get an idea of the scale of it from up there, as I've seen the F9 too.

But SN2 at Boca was breathtakingly massive when it stood in its full glory on the launchpad, and to think that this too will be reusable, and will take humans to Mars, makes me feel the world still makes some sense.

The tour was a few years ago now, and sadly, didn’t get to eat there.

I’d love to see one of the SNs at some point, they look like they dwarf the F9.

So how do we know it's not an Uber style, use huge amounts of funds to lower prices, undercut the market, and become an unaccountable monopoly?

Their system is inherently more economical, because they're not throwing away the rocket, engines, or even the fairings. They're not undercutting in an unsustainable way by using outside funding.

We know that Falcon 1 took $90 million for the whole program. We know that the Air Force's evaluation model predicted about four times the expense that Elon showed, with receipts, to NASA. The model was wrong, but it took SpaceX to demonstrate that.

Similarly we have good estimates for how much it took to develop Falcon-9, up until first launches. I think we have rough data to make some comparisons.

> even investors have trouble getting any financial data at all

SpaceX provides audited financials to its (preferred stock) investors. They also highlight them every funding round.

It reminds me of the leap made by Oculus. Easy to overlook.

>In...numerical methods they have? this is news to me.

The relevant paper is Lars Blackmore's Lossless Convexification ( http://www.larsblackmore.com/losslessconvexification.htm ), which (to my understanding) he basically went to SpaceX to put into practice/scale, and which drives the rocket's realtime control adjustment for landing.
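For the curious, the core trick in that paper (as I understand it, with illustrative numbers of my own): the thrust constraint 0 < ρ1 ≤ ||T|| ≤ ρ2 is nonconvex, because the engine can't throttle down to zero, so it is relaxed with a slack variable Γ into a convex set. A toy NumPy sketch of why the original set is nonconvex and the lifted one isn't:

```python
import numpy as np

# Illustrative thrust bounds (my numbers, not from the paper):
rho1, rho2 = 1.0, 3.0

def in_annulus(T):
    # Original, nonconvex constraint: rho1 <= |T| <= rho2
    # (the engine cannot throttle down to zero thrust)
    return rho1 <= np.linalg.norm(T) <= rho2

def in_relaxed(T, gamma):
    # Lossless-convexification relaxation with slack gamma:
    # |T| <= gamma and rho1 <= gamma <= rho2 -- a convex set
    return np.linalg.norm(T) <= gamma and rho1 <= gamma <= rho2

# Two feasible thrust vectors whose midpoint is the zero vector:
Ta, Tb = np.array([2.0, 0.0]), np.array([-2.0, 0.0])
mid = 0.5 * (Ta + Tb)

print(in_annulus(Ta), in_annulus(Tb))  # True True: both endpoints feasible
print(in_annulus(mid))                 # False: midpoint infeasible -> nonconvex
print(in_relaxed(mid, 2.0))            # True: the lifted set contains the midpoint
```

The "lossless" part of the result is that, under mild conditions, the optimum of the relaxed convex problem happens to satisfy the original annulus constraint anyway, so nothing is lost by convexifying; that's what makes the problem solvable onboard in real time.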

It should be said that the methods were developed before he joined SpaceX and were used at NASA/JPL for at least one Mars mission.

GPU acceleration of dynamically scaling cell-based fluid dynamics simulation. Video lecture / demo from 2015; 45 minutes.

I don't know if it was state-of-the-art but it looked impressive to me as a layperson.


They developed a new rocket engine and the ability to land upright, both significant engineering challenges, which, it seems, is the current problem for fusion (i.e., we need to build and iterate on things that mostly seem like they should work)

I don't agree the problems are comparable at all. The original Merlin engines were modern iterations of proven technology and vertical landing rockets were already proven technology dating back many decades to the DC-X [1] and Apollo lunar lander. What SpaceX did was cut all the bullshit out of the space launch industry that had grown incredibly lazy due to lack of competitors and revolving doors with the government, and then modernize everything with iterative development. In comparison, commercially viable fusion has never been shown to work in the real world at all. The Raptor engine, however, really is a massive achievement and is somewhat comparable to developing fusion power. But if SpaceX had started with it, they probably would have failed due to how much time, money, and expertise it took to get working.

[1] https://en.wikipedia.org/wiki/McDonnell_Douglas_DC-X

> The Raptor engine, however, really is a massive achievement and is somewhat comparable to developing fusion power.

Disagree: the theory of full-flow staged combustion, and test bench results, were known for many years. With fusion, at the same time, we only have experience with the uncontrolled kind (bombs), or the theory of how stars operate (and we can't reproduce that mechanism, not enough gravity). There is a lot more science to discover in fusion than in Raptors.

Ok, so then someone should start a company that iterates on fission (proven tech like Merlin) and gets the cost of that way down and safety way up, then uses that momentum to work on fusion, which (like the Raptor engine) is a whole 'nother ball game, but you can still use some of the expertise from fission for fusion.

The analogy holds!

I think fission to fusion is more like steam engine to jet engine. Fission can happen by _accident_ if you have the right materials at energy levels of just dropping something on something else [0].

Starting fusion isn't easy, but as I understand it, that's not even the hard part. Containment is the big deal, because of the energy needed.

[0] https://en.wikipedia.org/wiki/Demon_core

There's a saying that fission is like playing a mini-golf course where you have a flat surface peppered with holes, and every time you hit a hole you get ~2 new balls. Fusion is like a mini-golf course which is a volcano mountain with a tiny hole on the top.

Isn't the Sun fusion happening by accident?

Yeah, but the Sun only works because of its unfathomably massive size, compared to anything we can do on Earth.

If you take 800 cubic meters of plasma from the core of the Sun, about the same volume as ITER, it only produces 216 kW - comparable to the engine in your car.

And in that volume, you will have an average rate of 1500 hydrogen fusion reactions per second. Compared to ITER, this is something like twelve orders of magnitude slower. That is why ITER needs to go to much higher temperatures than the sun.
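That figure is easy to sanity-check: the Sun's core power density is commonly quoted at roughly 275 W/m³, less than a compost heap per unit volume. A quick check (my numbers, back-of-envelope):

```python
# Sanity check of the quoted figure, using the commonly cited value of
# roughly 275 W/m^3 for the power density of the Sun's core.
core_power_density = 276.5  # W/m^3 (approximate)
volume = 800.0              # m^3, roughly ITER's plasma volume as quoted above
power_kw = core_power_density * volume / 1e3
print(f"{power_kw:.0f} kW")  # -> 221 kW, consistent with the ~216 kW quoted
```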

You have made an error somewhere. 1500 fusion reactions per second is maybe 10^10 eV per second, which would be around a nanowatt.

Yeah, you are right there is a mistake. I made a typo in Avogadro's number. Sorry about that!

The power density is correct, but the number of reactions is too low by a factor of 10^10...
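For what it's worth, a quick consistency check, assuming the standard textbook value of ~26.7 MeV released per completed proton-proton chain and taking the 216 kW figure upthread at face value:

```python
# Implied reaction rate, assuming ~26.7 MeV released per completed
# proton-proton chain (the standard textbook value).
eV = 1.602e-19               # joules per electron-volt
energy_per_chain = 26.7e6 * eV
power = 216e3                # W, the figure quoted upthread
rate = power / energy_per_chain
print(f"~{rate:.0e} chains per second")  # -> on the order of 5e16
```

By this estimate the quoted 1500/s would be low by a factor closer to 3×10^13 than 10^10; the exact factor depends on whether one counts completed chains or individual reaction steps.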

Sure, and solar power is cheap fusion energy.

Getting fusion energy without gravitational confinement is the tricky bit.

The gap between fission and fusion is a quantum leap. It's like going from the Babbage Difference Engine to a Ryzen 7 1700X.

You can't iterate your way from fission to fusion, but you can iterate your way from the Merlin to the Raptor.

The Raptor which, by the way, is not even that far ahead of what came before it.

Don't actually think you can iterate from Merlin to Raptor.

> Don't actually think you can iterate from Merlin to Raptor.

In some sense, rocketry on Earth practically had to do that: from Goddard's pressure-fed and piston-pumped engines of 1926, via von Braun's open-cycle engines of 1944 and the Isayev-Melnikov staged combustion of 1949-1959, to Glushko's full-flow combustion of 1967.

Slow, and engines were designed from scratch. But still some iterations and evolution.

I don't think that landing on the moon with lunar gravity and no atmosphere is at all comparable to surviving hypersonic reentry speeds and landing on such a tiny target. And reading the Wikipedia article, it doesn't sound like the DC-X ever demonstrated a landing from orbit, which is most of the challenge.

SpaceX have never landed any rockets from orbit. Only their 1st stage has landed, which mainly goes up; it's nowhere near orbital velocity.

That's fair. Perhaps I should have said "from a trajectory that could plausibly lead it to orbit."

Go back five years, to December 2015, and you’d find most people on Hacker News deriding SpaceX’s landing attempts as impossible at worst or economically infeasible at best, three weeks before they first succeeded.

Hopefully you (like me) were one of the ones cheering them on, but achievements that seem obvious in retrospect usually come with notoriously shaky statements about how obvious they were.

Obviously the proponents are vindicated today, but it becomes a dodgy game of definitions when talking about how everyone who really had any insight knew that it was inevitable.

Here's an interesting news story about Airbus Defense and Space (formerly Astrium)'s reaction to SpaceX, in January 2014:


What's fascinating about this is how NO ONE was talking about them getting the first stage to be reusable. Musk's plan looks like it exploited this blind spot. Reusing the first stage was so obviously impossible that even with the evidence of nine first stage engines staring them in the face, they thought this was just to minimize development cost. It was only when the legs went on that the penny dropped.

And now, no one else will be able to repeat the free R&D flights that SpaceX got by trying to recover first stages off expendable launches sold at expendable launcher prices.

Thanks for digging up the concrete data backing up my claim. I was certain that I correctly remembered every observer that counted being blindsided by SpaceX’s success, in spite of their announcing their progress and stating their intentions time and time again.

One really has to be meticulous to stay honest when looking at things in retrospect. And, by extension, when looking at lofty-sounding predictions about the future.

Well, the opinions on Hacker News are only a little better than those on the Daily Mail when it comes to subjects outside the IT industry.

They weren't the first to build a new engine or an engine of that type or land upright. They just combined previous achievements in a better way. Their technology is incredible but it is all incremental improvements combined with a sustainable business model.

Their major accomplishments were economic, not technological. Their crown jewel, Raptor, applies full-flow staged combustion principles that were known and demonstrated on Soviet test stands since the 60s.

Fusion power still needs to overcome fundamental, hard physics problems.

My understanding is that the physics are actually pretty well understood but it's just a matter of executing on them. Fusion is possible but we just need someone to actually... do it. Which AFAIK is the point of ITER.

ITER will NOT in any reasonable time frame actually do it, and will not be a commercial success.

Talk to someone in physics closer to this. The time scales ITER is operating on are so long that the tech is ancient by the time they are actually operating it, and the path to commercial power is basically not there, I think.

I don't think ITER was ever intended to be a commercial success and it doesn't have to be. It's literally named the International Thermonuclear Experimental Reactor.

It is a research reactor. It goes first to prove it is at least possible to build a power producing reactor. That's all it has to do. Just exist.

How can the tech be "ancient" if they are the first ones building it, even if slowly. There may be many new designs by the time they deliver on an older one but that hardly makes the tech "ancient".

Obsolete on arrival is what's implied, I think. If deuterium–tritium fusion experiments start sometime after 2035, then it's potentially "ancient tech" given accelerating tech timelines (the singularity), which are exponential in character.

You can't wave your hands, chant "singularity" or "exponential growth" and actually have made an argument. Someone has to do the research - the singularity is not coming in 2050.

Progress in computer science, and the absurd utility of data handling improvements to other fields, has created an extremely unrealistic expectation in the minds of computer tech adjacent professionals about how quickly technology can actually develop, because the core technology of computing (more transistors) had a long period of exponential improvement due to the nature of how it's produced (lithography on 2D surfaces). Feature size shrinks give you exponential improvements.

But that period is ending for computing - there's no more room left in the silicon chip regime to keep doing it much past 6nm. And progress in other fields enjoys no such advantages: cellular biology, for example, can't exponentially shrink the time it takes to grow a culture. They also can't miniaturize easily - scaling biological research generally involves just scaling in size - facilities, space, grad students, everything. It is much more linear - there are very few easy wins beyond what improved informatics handling gave us (and where we did get big wins, it was the same mechanism - the availability of cheap silicon led to a rush to look for ways to capitalize on cheap silicon).

There's a reason computing is powering better surveillance, but brain-computer interface technology is moving exactly as fast as there are people working in the field, doing experiments, developing technology and doing slow, uncool research like patch-clamping single cells or whatever. Because the fundamentals are diverse, and there is no one single technical improvement which gives you a big scaling improvement (and everyone is hoping they'll find one).

The design of ITER was finished in 1998. Technology has changed a lot since then, although no one has built anything near its size, because the expense is pretty much guaranteed never to pay back.

That's fine but to act like new small scale experiments and designs obsolete technology that has never before been developed at scale is silly and frankly runs contrary to how technological and scientific progress are made. I'll never understand the biases of theorists. They act as if all theory isn't deeply indebted to practical learnings. You don't commit to a project like ITER because you believe it will turn out a cutting edge solution to a problem. You do so because you believe that whatever happens, in the end you will harvest many learnings that are widely useful.

First to land upright from orbital flight.

On top of the great points made by sibling comments the Apollo Lunar Modules landed propulsively from orbit with an additional party trick of having humans on board.

Also Blue Origin actually did propulsive sub-orbital landing from space to Earth before SpaceX, although admittedly from a lower altitude and velocity. Both came after the 1990s era Delta Clipper. Before that there were probably a lot of Apollo era test vehicles.

Each one got a bit higher and faster. This is the value SpaceX offers. They're doing the hard work of making things we know are possible also commercially viable.

Not orbital. Sub-orbital. Only their 1st stage rocket has landed, which mainly goes up; it's nowhere near orbital velocity.

They plan to land the 2nd stage of starship, which will have reached orbital speeds. But that is a huge challenge that they have not attempted yet.

When did SpaceX ever do a powered landing from orbital flight?

The first ones that did, by the way, were the Soviets in the early 70s.

Do you mean on Earth? If you are talking about Soyuz, that was actually 1967!

If you mean space probes it’s actually more impressive, the Soviets landed a probe on the moon in 1966: https://en.m.wikipedia.org/wiki/Luna_9

Ah sorry, I meant in atmosphere! But you are of course right!

The retro-rockets on the 1976 Viking lander on Mars don’t count? Of course that had a parachute to assist, but I think it was still going really fast.

Sounds like the Russians actually did it in 1971?

Also since we are already speaking extra terrestrially Apollo 11 landed propulsively in 1969, with humans on board and it got back to orbit!



I think reusable rockets were definitely in "may or may not work" territory.

Reusable rockets were at "They most likely would work, but it's pointless since governments are willing to spend billions on rockets; in fact, lowering price to space would be very bad for our business".

Compared to space tech, fusion science is at the stage of the 1929 sci-fi drama "Woman in the Moon".

For context: 1926 was the first liquid fueled rocket. In 1929 it was not yet clear that this was the technology that would take us into space. That sounds about right for fusion. It is probable that we have built a primitive version of what will eventually sustain a net-positive fusion reaction, but it may only be obviously a milestone in hindsight.

spacex was laughed out of the room by industry and press when they announced trying to land the first stage. then they did it.

reusable rockets were uniformly assumed to be a fool's errand.

This is a complete fantasy, people were working on reusable vehicles for decades.

The Space Shuttle was already partially reusable and operational, just like Falcon 9 is; it just wasn't economical.

SpaceX doesn't have a fully reusable vehicle yet, and there was a program to develop a fully reusable replacement for Space Shuttle


If Elon was laughed out of somewhere, that's to do with Elon's credibility at the time.

The Space Shuttle was designed to be reusable. SpaceX had 20-30 years of data to build off.

> the space shuttle was designed to be reusable.

No, actually the PR was that it was reusable, but the design was not. Each shuttle had to be remanufactured after use, taking up to a year and costing $1 billion.

But it gets better. They had "lost tiles" problems on the very first flight, and never bothered to do anything about it.

What a piece of junk.

Not sure why we're getting downvoted. Space shuttle was absolutely a terrible design from a "true" reusability perspective (the most important part of reusability being cost reduction).

Which they shuttered after one disintegrated on re-entry, losing the entire crew.

That was the second total loss of crew in its ~20 year history.

Many comments have claimed this, but I disagree. Plasma confinement as a science has largely been solved now. Like the Rocket Equation, Computational Fluid Dynamics, and other aspects of Aerospace, we know the parameters of how to do it, and it is simply a matter of searching the space, narrowing the parameters, via engineering iteration.

We also knew that landing rockets was possible, and closed-cycle combustion engines were possible, the physics said so, and even had prototypes that could do it, but with SpaceX, it is not the science that's the issue, or even one-off engineering feats, but practical, economical, scalable production.

With Fusion, we've already achieved confinement times and burns that show we can make it work. But making it work in a prototype reactor is a far cry from making it work in a commercial reactor which needs to be run everyday and maintained. ITER's been going for 13 years now, and I don't think they're solving any of the problems that SPARC will solve.

Yes, tokamak prototypes have provided the insight to help nail down the plasma physics, but small-scale fusion prototypes have helped do that with much faster turnaround time than $20+ billion mega-projects.

What SPARC is saying is that small is beautiful. That because of scaling of the B-field to the fourth power, if you focus on stronger magnets, you can dramatically reduce both the size and the cost to build your fusion reactors.

That is, if you double the strength of your B-field, you shrink the size of the machine needed by a factor of 4. So clearly, instead of focusing on ITER, you should focus on the economics and scaling of REBCO manufacturing, and the increases in magnetic field you can eke out, because they pay huge dividends.
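A quick arithmetic sketch of that scaling, under the simplifying assumption that fusion power is proportional to B^4 times plasma volume at fixed conditions (the exact "size" factor depends on whether you count volume or linear dimension):

```python
# Toy scaling check, assuming fusion power proportional to B^4 * volume
# at fixed plasma temperature and density (a common simplification).
b_ratio = 2.0                             # double the field strength
volume_shrink = b_ratio ** 4              # same power from 1/16 the plasma volume
linear_shrink = volume_shrink ** (1 / 3)  # each linear dimension ~2.5x smaller
print(volume_shrink, round(linear_shrink, 2))  # -> 16.0 2.52
```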

In other words, we are no longer at the Von Braun stage of rockets, or even the Apollo stage; we're at the SpaceX stage of dramatically reducing manufacturing costs and increasing use.

Likewise, I think fusion is poised to move past the Von Braun stage of "is this possible?" to "can we build a real reactor?", as long as people can let go of huge super-projects and we can get funding for a bunch of innovative startups in the space to try various techniques and designs, building small, cheaply, and rapidly.

Likewise, for Thorium reactors, we already know Molten Salt Reactors work, we already know the Thorium cycle works, and we even know a commercial plant works, as Germany ran one at one point. There are oodles of startups now vying to build small scale thorium reactors. The role the government could play here is, like with NASA, the DOE, could provide capital to reduce risk, and get some of the more promising ventures off the ground.

Most importantly, they need to be allowed to fail.

If only the "force" in Space Force referred to propulsion power rather than military domination. :(

"At what cost" is what I'm going to ask. Startups will do it - okay - at what cost? What funding do they need that ITER is somehow consuming their budget?

The US military expenditure for a single fiscal year is $934 billion. That is not even the actual spend. ITER, over its entire 20-30 year history, is projected to cost about $23 billion USD. So, $1 billion or less per year.

By comparison, the US military budget between FY 2019 and FY 2020 increased, in proposed spending (i.e. the actual spend was more), by $63 billion USD.

So the US, somehow, between 2019 and 2020 found the money for another 3 entire ITER projects. Just kick another 3 20-30 year projects out, fully funded.

Uber has raised $24.5 billion USD to date. A company whose principal asset is an algorithm that dispatches taxis raised enough money to literally fund its own ITER project.
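For what it's worth, the arithmetic in the comparison above checks out (a quick sketch, using only the numbers already quoted):

```python
# All figures as quoted in the comment above.
iter_total = 23e9      # USD, ITER projected lifetime cost
years = 25             # midpoint of the 20-30 year range
budget_jump = 63e9     # USD, FY2019 -> FY2020 proposed increase
per_year = iter_total / years
iters_per_jump = budget_jump / iter_total
print(f"~${per_year/1e9:.1f}B/year; the one-year jump covers ~{iters_per_jump:.1f} ITERs")
```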

So "at what cost": what do these startups need to be budgeted, if they are going to be so much cheaper, that they are unable to get any of this money? Set aside the actual defense spend of the US - but I mean, how can you? The US runs nuclear-powered carriers and would have infinite use for a compact fusion power plant. How is it that ITER is always the problem here, and not that this notion of "we need startups" might actually be running into funding issues because, on closer examination, they're not actually as cheap, the data is not as convincing, or they're about to ask for just as much money?

So again, how much money do these alleged startups need? Why is it apparently so hard for them to get it, when the total valuation of ITER is being thrown around by venture capitalists pretty regularly and the prize is dominance of the next several centuries of power generating technology?

SPARC already raised $500 million.

No one claimed ITER is consuming their budget, if ITER is consuming anything, it's people, actual scientists in academia working on it, instead of working on smaller scale projects. It's an opportunity cost.

Look, even if ITER is a success, it still won't even produce electricity; that would require DEMO, a follow-on project. We're talking post-2035 just for the first deuterium-tritium operation. Now, do you think that by 2035+, which is 15 years from now, commercial fusion reactors are going to run off niobium-tin cooled with liquid helium?

Or do you think commercially available REBCO, which needs only liquid nitrogen and whose superconductivity doesn't fall off as field strength increases, will be used? By 2035, whatever DEMO inherits from the ITER design will be 30 years old.

Meanwhile, hundreds of smaller scale fusion devices could have been built, with equivalent field strengths, but at far far less cost.

Would you rather run one massive international experiment with dozens of bureaucracies, or run hundreds of smaller experiments and collect a lot of data from many more designs in the meantime?

We don't need to kill ITER to fund more startup fusion research in this country. ARPA-E funded Solyndra and Tesla. Solyndra failed; Tesla succeeded, paid the money back, and created a new industry-leading car company with a market capitalization larger than all other US car companies combined. I'd call ARPA-E and the DOE's investment portfolio HUGELY successful. If they were a Silicon Valley VC, all the other VCs would be envious.

There is no current scaled up artifact of a REBCO magnet. It's highly expected that one can be created, but they're not rolling off a production line right now.

There definitely wasn't one 10 years ago, nor were superconducting tapes expected to become viable.

So what exactly do you think anyone would have been building if we were investing in fusion 10 years ago? What would "hundreds" of small-scale devices be if we tried to build them today? With a technology that is being kept proprietary and thus unavailable for use?

Again: we have a whole bunch of small-scale fusion devices. This is what the article here is talking about! And until very recently, all of them suggested the same thing: you need an ITER-sized reactor to get suitable mean free paths, and Q is going to be all about the dynamic manipulation of plasma at scale (which is what ITER is set up to study).

Well, since YBCO has been known since the late 80s, it might have been more useful to plunge all that effort into manufacturing and scaling YBCO, rather than racing ahead with a 30-year project to build a giant reactor with inefficient confinement.

Imagine if the Manhattan Project had decided to build ICBMs and hydrogen bombs before they had perfected enrichment. Or if the government had decided to fund the full-scale model of the National Aero-Space Plane (NASP, remember that?) before they had figured out how to make a scramjet, store slush hydrogen properly, or deal with hypersonic heating.

And in the mid-2000s, YBCO tapes were stuck in development hell, unable to live up to their theoretical promises; one of the projects I knew about at my local university got a whole bunch of promising results and then was just dropped because none of it could be replicated. Patented work, too.

Not to mention, this was purely for power transmission - no one had any idea if the tapes could hold up to the mechanical forces of high-power magnets. Hell, they had a serious problem: high-temperature superconductors would very suddenly stop superconducting if the impinging magnetic field exceeded a critical threshold.

All this was true right up to 2010 - and is still somewhat true today, because again, there is no scaled-up artifact proving it's actually possible to use.

Everyone who wanted to work on high-temperature superconductivity was doing so - it wasn't as if this was a field that was struggling. And the story at the time - and in fact still, today - was that it wasn't reliable enough, and that for ITER it would definitely be the wrong technology choice for its goals, which were to build a facility handling large volumes of plasma in a plausible fusion-generator environment.

The 5-year timelines being thrown around like gospel here sound utterly absurd in the face of that. A damn CRUD web app takes a year to build, but an experimental fusion plant using essentially nothing but bespoke parts will be done in 5? Cheaply? No.

If Global Warming were in fact a nuclear-tipped communist regime directly threatening American Way of Life, instead of a creeping international issue jeopardizing food and fresh water, creating distributed violence threatening American Way of Life, we would have had Fusion licked in the 70's or 80's.

> Given the climate crisis, the government should be doing the same as they did with the CRS for space, award DOE contracts of $X billion for a working reactor, dole it out to people doing SPARC or Thorium Molten Salt (LIFTR) designs.

> The innovation rate in Fusion is way too slow, we need to build and fail a lot more rather than sink $22 billion into multi-decade projects like ITER.

We don't need fusion for climate change; fission will do, and we have working reactor designs already. I don't mean that we shouldn't spend some money on fusion research, just that it isn't reasonable to spend 100% GDP on it (or some significant fraction thereof).

> We don't need fusion for climate change; fission will do

I wish it were the case, but I think we have seen that it is politically impossible.

Fusion is a noble goal, and its success is compulsory if humans want to stay on their current track for more than another few hundred years. However, even with a Manhattan Project for fusion, reactors won’t come soon enough to “save” the human habitat. We need to implement solutions right now, today. It’s already too late. Waiting another 50 years to do anything guarantees humanity's technological empire will not make it to 2200.

Do you think the politics of fusion would be any better? The politics opposing fission aren't grounded in science; do you think the politics around fusion would be?

Yeah, the politics would definitely be better, because fusion runs on deuterium/tritium as opposed to uranium, which has a lot of cultural and historical baggage and is on the news in a bad way all the time. So yes, it would be better.

Now would it be better enough to get out of the same political/cultural hole that fission is in?

I don't know, but it at least has a better chance.

Also, not having to source and mine uranium is actually a huge advantage. We have more lithium than ever, and deuterium can be extracted from literally any large body of water.

How so?

Bezos has already effectively ceded “space guy” to Musk, might as well put in the cash to be “fusion guy”. Good way to make sure you’re in the history books for all time

> Bezos has already effectively ceded “space guy” to Musk, might as well put in the cash to be “fusion guy”. Good way to make sure you’re in the history books for all time

I think the advantage of that is, if he did, and fusion energy ended up powering a large portion of the Supercharger Network, it would amuse him enough to consider putting his considerable wealth to that end.

He really is like a real-life Lex Luthor, and making his rivals dependent on him, at a rate of royalties +15% (because he'd have dominant market share, and because he quite honestly 'just could'), would be very befitting after having been out-done in the space industry. And the reality is that since SolarCity is probably the only real Musk corp he can actually displace, that's where he'd probably get the most bang for his buck, kick-starting the 'battle of futuristic futures' between the two richest people on Earth that we were supposed to have already.

Bezos has also made lots of platitudes about how he wants to help make the entire Earth a nature preserve: so in case you're reading this (or the Saudis, who have root access to his hacked phone...), why the hell not, Jeff?

Don’t know about Lex Luthor but certainly jacked Mini-Me

His wealth has also increased tremendously since he tried to be the "space guy" so it makes sense for him to aim even higher now. Now we just need someone to reverse-psychology this into him

Bezos can't cede something he never had... Bezos has had a hobby project no bigger than what SpaceShipOne did in 2004 and has poured billions into it for practically no results. Elon was launching rockets into space before Bezos was doing anything at all.

It’s funny you mention SPARC because MIT is treating it as a business opportunity. Their REBCO coil winding work is both private and proprietary. This is very much not how the fusion community (and physics at large) do things. The openness of fusion research was a component in ending the Cold War. Researchers building machines right now want HTS coils. MIT is making sure they’re the ones to supply them. Who makes money in a gold rush? The pickaxe company.

Well yes, but... if they crack the nut of how to make an economical fusion reactor work, I'd be happy to pay them to be the monopoly supplier of REBCO tape until the patent runs out.

And to be honest, if they demonstrate the reactor to work, I'm sure others will figure out ways to produce REBCO without violating the patent, or would outright give the patent the finger.

I hate to bring up a spoiler effect because it has been a toxic sore point between competing MCF designs in the past, but MIT is actively pulling funds away from attractive coil winding techniques that are public and subject to peer review. I’m rather close to this issue because I’m aware of grants being given to MIT to make prototypes. MIT is in the US and has a big reputation. The current grants are modest, but they are opening a path for MIT to be paid to develop a technology they will sell. This is all fine. Where it stops being fine is when the developed coil winding technology is proprietary. I’m not exactly sure how this doesn’t violate DoE grant contract rules.

Here is the competition, btw. Paz-Soldan’s technique seems sound and should be funded for prototype coils. I’m not saying MIT shouldn’t be given grants to do this work. I just wish that they wouldn’t use public funds to develop proprietary technology.


>it's almost sure to work

Famous last words...

>we need to build and fail a lot more

You know, that actually describes the early years of fusion research. People thought it would be easier than it has turned out to be. There are lots of gremlins...

I will also note that SPARC is a tokamak, the same as ITER. It's decades of steady tokamak research that has gotten us to this point.

> The Nature Papers on ARC show it's almost sure to work. So it's rather disappointing it'll take 5 years to build SPARC, and then another 10 for ARC.

It's almost sure to work assuming the magnet works. Given that they are still developing the magnet and haven't finalized the design yet, it seems crazy ambitious to me to get to SPARC in 5 years.

And ARC "work" in the sense of a reactor having the weight of a couple of WW2 destroyers.

Well, ARC is supposed to produce an order of magnitude more power than a WW2 destroyer power plant produced, so I'm not sure why that comparison is relevant. The first application of fusion reactors will presumably not be naval ships.

Anyway, assuming your 4000 ton figure is correct, that works out to 125 kW/ton. For comparison, a quick googling suggests a wind power plant is around 3 kW/ton, excluding the rather massive foundation of the tower.

That weight figure (7190 tonnes, not 4000) for ARC is just for the reactor.

https://arxiv.org/pdf/1409.3540.pdf (page 30)

The net output of ARC in that design is 190 MW(e), which comes to 26 kW(e)/tonne.

It doesn't include anything outside the reactor, which would likely increase the weight by at least an order of magnitude. It doesn't include (for example) the cryo plant, tritium processing, heat exchangers, turbines, cooling system, the BUILDING, and facilities for installation and disposal of vacuum vessels.
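A quick sanity check of the per-tonne figures (the 190 MW(e) and 7190 t numbers are from the linked arXiv paper; the 10x balance-of-plant multiplier is just the order-of-magnitude guess above, not a published number):

```python
# Sanity check of the kW-per-tonne figures quoted above.
# Inputs from the linked paper: 190 MW(e) net output, 7190 t reactor mass.
# The 10x balance-of-plant multiplier is an assumption, not a design value.

net_output_kw = 190_000    # 190 MW(e)
reactor_mass_t = 7190      # tonnes, reactor only

kw_per_tonne = net_output_kw / reactor_mass_t
print(f"Reactor alone: {kw_per_tonne:.1f} kW(e)/t")              # ~26.4

# If everything outside the reactor adds ~10x the mass, the figure drops:
print(f"With ~10x plant mass: {kw_per_tonne / 10:.1f} kW(e)/t")  # ~2.6
```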

> That weight figure (7190 tonnes, not 4000) for ARC is just for the reactor.

https://arxiv.org/pdf/1409.3540.pdf (page 30)

Thanks. As I hadn't seen that paper, I used the Fletcher class destroyer, which was the most numerous class the US built, and it was slightly on the larger side at about 2000 tons; common destroyers were around 1500 tons IIRC. I also assumed, wrongly it seems, that 500 MW was the electrical power.

Anyways, if we're really interested in material requirements for various power plant types, one can check e.g. the DOE quadrennial energy review 2015, Table 10.4 on page 390. One can see that fission does pretty well, about an order of magnitude better than wind (and solar being somewhat worse than wind). I would guess fusion wouldn't do as well as fission, but it shouldn't be a huge difference either considering that except the reactor itself it should be pretty similar.

Why are we examining weight, when the cheapest source of electricity is hydroelectric, and they are the heaviest structures ever constructed?

The Hoover Dam weighs like 120 battleships.

It's just the wrong metric to measure.

That's an improvement on ITER, which weighs as much as a WW1 battleship.

ITER is so horrendously far from competitive (in power density, in cost) that a fusion reactor can be much better than ITER and still be completely impractical.

The question is if we would have any innovation in fusion without ITER. It is a research installation meant to prove fusion is viable for generating energy. It's basically designed to be surpassed by later designs.

But it's also like an Apollo program for fusion installations. It doesn't have to be economical, it just has to work.

Tokamak Energy has made 3 fusion devices in 11 years. It's not super fast, but certainly faster than the megaprojects. They are planning on their current design hitting 100 million degrees.

Indeed, Tokamak Energy's whole strategy hinges on fast iteration [1]:

"The basic approach being taken by Tokamak Energy is simple—to combine the two aforementioned technologies to unlock the potential of fusion power in more compact devices, and to do whatever development, testing and iteration are necessary on smaller machines in order to reduce construction timescales and to progress faster."

As of January 2020, Tokamak Energy had raised a total of £117 million (~$156 million) [2]. I read that SpaceX has raised $5.4 billion [3]. That's the limiting factor.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6365856/

[2] https://www.tokamakenergy.co.uk/tokamak-energy-raises-67m-fr...

[3] https://www.crunchbase.com/organization/space-exploration-te...

Why do you think a 5 year timeline is too long for the challenge? AFAIK most people think that is way too ambitious.

SpaceX took 6 years to get into orbit (and were not the first by decades), and you think they could show energy-gain fusion in 5?

Lockheed Martin's approach does what you're talking about. Small, fast to iterate design.

https://www.youtube.com/watch?v=UlYClniDFkM https://www.lockheedmartin.com/en-us/products/compact-fusion...

In most countries there is no policing of corruption and in some countries it is essentially legal and described as lobbying. I think the resistance to these new technologies comes from the situation where decision makers are paid adequately to never put such ideas forward. Fusion could turn many countries economies upside down so they have an incentive to steer away anyone at any cost.

There are "SpaceX For Fusion" projects around. Several have been mentioned already in replies, but one I don't see mentioned is https://generalfusion.com/ (not claiming that this is a valid approach).

This does make sense. If you look at some post world war 2 projects like Dounreay managed by United Kingdom Atomic Energy Authority, things can happen. But with how advanced things are now, can a single government entity manage projects which will require skills from many other countries?

Would you mind adding links to the Nature papers you have mentioned?

surely a fusion reactor failing will have no consequences.

>> The innovation rate in Fusion is way too slow...

I'm starting to think real progress on fusion is suppressed because there might be a risk lurking there.

If you want tinfoil theories, my theory is that the massive lobbying power of the fossil fuel interests delay R&D and implementation of alternative energy. And as some of the alternative energies are getting momentum, like wind&solar, they are also lobbying against potential competitors.

As far as I understand there is a nuclear waste problem, since neutrons are generated which activate the hull significantly. Neutrons, being uncharged, cannot be deflected by the magnetic field.

But I don't believe this is the showstopper that makes progress slow. To me it feels more like people know there is an insurmountable technical problem ahead (like how to transform the energy into electricity efficiently enough; I'm actually really wondering how this could scale) and the people involved do not want to reach that point.

Isn't the energy converted to electricity by heating water and generating steam then doing any number of well understood things like spinning turbines?

Getting the heat from neutrons randomly flying out of the reactor to water is a hard problem. The current thought is to cool the reactor with liquid lithium, which itself will be undergoing fission and releasing helium during the process. The liquid lithium and helium would then be pumped through water cooled heat exchanger and some sort of filtering process to remove the helium. Also, all of the plumbing and materials have to withstand radiation 100 times that found in fission reactors.

I’m not saying it is impossible, but they won’t be working on it until ITER's successor sometime in the 2050s.

Is there a reason graphite won't work? It's conductive, stable and extracts thermal energy from neutrons fairly reliably in fusion reactors.

I hesitate to speculate, but one possibility is that for large reactors you have so much heat that you need heat exchange by liquids rather than solids or gases or non-atomic particles. My understanding is that solids are bad because electrons and phonons cannot carry as much energy as atoms. Gases are bad because they don't have enough atoms. This leaves liquids as best, which is why we so often see liquid cooling in animals and machinery.

I have no clue, but would love to know.

Water is fine as a cooling medium, the reason for the lithium blanket stuff is to produce tritium for fuel.

So you let the lithium flow near the plasma?

That is one proposed path; another is direct conversion of high-energy particles to electricity. The former seems much more likely to be feasible, but the latter is potentially more efficient.

What conceivable risk could there be that supports this hypothesis?

Surely with a little imagination you can conceive of some risks that might be associated with having a small sun in your backyard.

For one, fusion reactions generate neutrons that activate all the structural material of the reactor, creating radioactive waste. The waste is shorter lived than fission waste but is still a risk.

A more obvious risk is the pure destructive power of that much energy. High pressure, 100 million degree plasma is full of risks. It's nice that fusion reactors can't "run away" like fission does, but you still have to deal with material 6x hotter than the sun escaping confinement.

When operational, a fusion reactor has less than a couple grams of fuel in the vacuum vessel.

Okay, let's do some back of the napkin calculation.

The specific heat of Hydrogen gas is roughly 20 J/gC at high temperatures[1]. It would be much higher for a plasma but I couldn't easily find a number.

At 100M degrees C, a single gram of hydrogen would have over 2 Gigajoules of thermal energy. 2 grams of hydrogen would have the same energy as a literal ton of TNT.

It's not untenable, but it's not really something that should be brushed off as "it's only a couple grams of fuel" or "it can't have a runaway reaction".

[1] https://www.engineeringtoolbox.com/hydrogen-d_976.html
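For what it's worth, here's the same napkin math spelled out (the 20 J/(g·K) specific heat is the rough stand-in value from [1]; a real plasma's heat capacity would differ):

```python
# Back-of-the-napkin version of the calculation above.
# Assumptions: specific heat of H2 gas ~20 J/(g*K) (stand-in value; a
# plasma differs), 100M degrees C, "a couple grams" (2 g) of fuel.

specific_heat = 20.0    # J/(g*K), H2 gas at high temperature
temperature = 1e8       # degrees C (delta from room temp is ~the same)
mass_grams = 2.0        # "a couple grams" of fuel

energy_j = specific_heat * temperature * mass_grams
print(f"Thermal energy: {energy_j:.2e} J")                     # 4.00e+09 J

TNT_J_PER_TON = 4.184e9
print(f"TNT equivalent: {energy_j / TNT_J_PER_TON:.2f} tons")  # 0.96 tons
```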

Having energy does not demonstrate danger to life and limb. 2 tons of fat have more energy than 2 tons of TNT. Obviously it does not release energy in the same way.

That plasma is at less than atmospheric pressure, and is contained in a massive steel vessel. Even if it expands and melts through the vessel, the atmospheric pressure will collapse it. I do not see a plausible scenario where anyone is injured.

Funnily enough the much bigger danger is magnet quenches. Thousands of amps boiling off a couple dozen tons of liquid whatever is going to be a hell of a danger, but it's still "violent industrial accident" and not "fireworks factory" - and definitely nothing which people think of when you say "nuclear".

My comment wasn't very focused, so I'll clarify: GP seemed to imply that there is some hidden risk about fusion power that nobody is aware of or talking about, as if such a risk is the greatest factor currently bogging down fusion research. I wanted them to expand on that thought, as it's definitely not a perspective I'm familiar with.

"Please consider this a commitment that I will fund fusion energy for any house in Flint that has water contamination above FDA levels. No kidding."

Every time I read news about fusion and all the optimism surrounding it, I always remember this 1983 paper that categorically proves how fusion will never happen on a commercial scale: http://orcutt.net/weblog/wp-content/uploads/2015/08/The-Trou...

Relevant discussion on HN: https://news.ycombinator.com/item?id=20782346

So, forgive me for not feeling any excitement anymore in my life about fusion. For me, fusion is just as meaningless as a flying skyscraper. I've yet to see anyone present a good rebuttal of that 1983 paper.

That's a good discussion of the technical objections to fusion.

There is also this paper[1] from 2018 which shows that fusion will have higher capital cost than anything else.

(The abstract reads like it was produced by fusion enthusiasts. They were using historical figures for PV and for on-shore and off-shore wind, and using the LCOE method which IIRC uses "overnight construction" assumptions.)

There is also the point that a higher investment hurdle rate is appropriate for fusion as it's an untried risky technology.

Another paper points out that fusion economics are particularly sensitive to unplanned outages. (More modular and dispersed generation methods are of course less sensitive than gas or coal to outages, e.g. plane crash.)

So, fusion had its chance. Now we have cheaper alternatives, which are getting cheaper still.

1. https://www.sciencedirect.com/science/article/pii/S036054421... Approximation of the economy of fusion energy

The "Omega Tau" podcast has an episode about superconductivity where they interview a superconductor engineer from ITER [1]. A section of the podcast is about "Quench Protection", which covers what happens when the superconductor loses its superconductivity and the energy needs to be removed from the system as fast as possible before it melts.

From the abstract of "Capacitor bank for Fast Discharge Unit of iter facility" [2]:

> In case of failure, in particular quench in a superconducting coil, rather fast and safe energy discharge from the magnetic system is provided by FDU, which breaks the coil supply loop and provides energy dissipation in high energy resistors. The ITER FDU should interrupt currents up to 70 kA with a voltage up to 10 kV.

There are also some slides on the web from CERN about quench protection at the LHC [3] with photos and diagrams in it.

The podcast also has several episodes on fusion "157 – Fusion at ITER" [4] and "312 – The Wendelstein 7-X Fusion Experiment" [5] (and more linked in the descriptions of those episodes).

On a very separate topic, it also has a good one about weather forecasting "326 – Weather Forecasting at the ECMWF" [6]

[1] https://omegataupodcast.net/285-superconductivity/

[2] https://ieeexplore.ieee.org/document/6191678

[3] https://indico.cern.ch/event/194284/contribution/34/attachme...

[4] https://omegataupodcast.net/157-fusion-at-iter/

[5] https://omegataupodcast.net/312-the-wendelstein-7-x-fusion-e...

[6] https://omegataupodcast.net/326-weather-forecasting-at-the-e...
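The FDU figures in the quoted abstract give a sense of scale. A quick, hedged calculation (the 70 kA and 10 kV numbers come from the abstract above; treating them as simultaneous peak values is my assumption):

```python
# Scale of the interruption the ITER Fast Discharge Unit must handle,
# using the 70 kA / 10 kV figures from the quoted abstract. Treating
# them as simultaneous peak values is an assumption.

current_a = 70_000    # up to 70 kA
voltage_v = 10_000    # up to 10 kV

peak_power_w = current_a * voltage_v
print(f"Peak power during discharge: {peak_power_w / 1e6:.0f} MW")  # 700 MW
```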

> No other fusion power plant in the world has managed to run for more than ten seconds.

This sentence is incorrect; I guess they meant at 100M degrees, because some older tokamaks did run for longer than 10 seconds at lower temperatures.

Like the Chinese one that ran for 100 seconds at 50M degrees, but 100M degrees for a short time. https://en.m.wikipedia.org/wiki/Experimental_Advanced_Superc...

South Korea will make the power source and electronics and Japan will make the body of the gundam. https://japantoday.com/category/entertainment/japan’s-new-li...

Good breakthrough but video on page was troubling to me. The plastic surgery made me feel uncomfortable. Seeing that face way too often.

Fusion has been 10 years away... for how many decades now? I was a youngling when I first heard the promise. It is still decades away.

2025 is a surprise target as it is only 5 years away.

Could this be weaponized in the way nuclear energy can be weaponized?

Yes, we call that a thermonuclear bomb. The only problem is that in order to detonate a thermonuclear bomb you need a fission bomb to ignite the fusion reaction first. Civilian power-generation fusion can never be weaponized the way nuclear fission has been. Fusion only occurs under the most extreme temperature and magnetic conditions; if they vary even slightly, the fusion process stops instantly and you get no energy. We can barely get an energy-positive fusion process working in a reactor with all the best tech in the world. Barely. It cannot and will not ever be weaponized the way you think.

Any fusion reactor in this century will likely be putting out a lot of fast neutrons. These fast neutrons can turn otherwise benign heavy elements into weapons-grade fissile material. There are detection techniques for this kind of breeding and I’d imagine any country capable of building a fusion reactor would also be okay with audits. If you just want weapons-grade fissile material, there are far more economical avenues than a fusion reactor.

Also, if fusion reactors become more viable than fission reactors, then we could put a global ban on all fissile materials. They would serve no purpose except for weapons.

It already is. Hydrogen bombs.


The largest non-thermonuclear bomb was only 500 kilotons yield.

The largest thermonuclear bomb was 50 megatons. 100x.

Any ordinary piece of lumber could be used as a weapon or to build a house.

Unlimited energy makes rail guns really nice.

sshhhh... don't leak our plans for the fusion powered orbital rail gun platform - its top secret!


At how many seconds will we have a fusion reactor?

Ideally indefinitely many, but a few million may be a nice start.

Alright, I'll be that guy. What materials do we have that can contain something that hot?

The main reason the plasma needs to be kept away from the walls is that they will conduct the heat away from the plasma and the hot gas will cease being a plasma.

It has analogies with static electricity. Static electricity might have a voltage of tens of thousands of volts, but it does little damage when you touch someone as the number of electrons (current) is so small that not much energy is involved.

Similarly with a plasma. The temperature (energy per particle) is very high but the number of particles is small, so the total energy is low. Touching something will cause the energy to leak away (cf. static electricity discharge) converting the hot plasma back to cool gas.
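To put rough numbers on that, here's an ideal-gas estimate with hypothetical parameters (assumptions of mine for a medium-size tokamak, not KSTAR's published values): density ~1e20 particles/m³, temperature ~10 keV, volume ~20 m³.

```python
# "High temperature per particle, low total energy" in numbers.
# Hypothetical medium-size tokamak parameters (assumptions, not
# KSTAR's published values): n ~ 1e20 m^-3, T ~ 10 keV, V ~ 20 m^3.

EV_TO_J = 1.602e-19   # joules per electron-volt

density = 1e20        # particles per m^3 (the plasma is a near-vacuum)
temp_ev = 10_000      # ~10 keV, roughly "100 million degrees"
volume = 20.0         # m^3

# Ideal-gas thermal energy: (3/2) * n * k_B * T * V
energy_j = 1.5 * density * temp_ev * EV_TO_J * volume
print(f"Total thermal energy: {energy_j / 1e6:.1f} MJ")       # ~4.8 MJ

# For comparison, a single ton of TNT is 4184 MJ.
print(f"TNT equivalent: {energy_j / 4.184e9 * 1000:.1f} kg")  # ~1.1 kg
```

So despite the spectacular temperature, the whole plasma holds on the order of a firecracker warehouse less than a stick-of-dynamite-scale amount of energy under these assumed numbers.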

None. dboreham has it right -- all the confinement is done magnetically or by photon pressure. Stars do it through gravitation.

Details for some of the power supplies used for the superconducting magnets in the first plasma experiment: https://ieeexplore.ieee.org/document/5226386

And details about the magnets: https://www.osti.gov/etdeweb/servlets/purl/20261539

> The vertical load on coil system is estimated to be 330 tons for the coil weight and 320 tons for vertical disruption load.

With some incredible forces that can be generated by the plasma.

> The lateral loads in tokamak could be generated by plasma disruptions and localized halo current flows through the plasma facing components (PFC) or vacuum vessel. The estimated peak lateral load in KSTAR is about 1.3 MN

So what happens if it touches something? Meaning, somehow, the magnetic field fails due to equipment. Would it be like highly corrosive acid going through material or a much more violent reaction?

Not sure about KSTAR specifically, but a lot of these devices use the field both for containment of the plasma and for starting/maintaining the fusion reaction in the first place. If the field fails, the reaction stops.

If it didn't (hypothetically), I imagine it would do some pretty severe damage to the reactor parts specifically. At some point, the vacuum would be broken, air would rush in, and the reaction would _definitely_ stop.

Most of the larger fusors (as far as I'm aware) seem to be refinements of the basic Farnsworth-Hirsch fusor design (https://en.wikipedia.org/wiki/Fusor), which is a relatively simple device. If you have the motivation, you could build one at home with common-ish tools.

It'll probably be minor to medium damage to the containment vessel, but the amount of gas actually at that temperature is very small, and will cool down very quickly once it expands and dissipates.

The heat capacity of the plasma is actually fairly low. So while it might ablate a bit of wall material, it would cool very quickly, electrons would get captured, and you are left with a bit of gas at rather low pressure.

Non-rhetorical: So what's the point of this experiment? Seems like harnessing its energy will not be very effective based on how quickly it dissipates

Think of it just like fire. Fire will also give you almost no energy if you physically scoop it up. Instead you leave the fire where it is, continually generating heat, and you capture what pours off. Or you pump in fuel, ignite it, and exhaust it over and over at a high speed, like in an engine.

There is little energy stored in the gas, but it produces lots of energy. I.e., every second the gas might produce a megawatt-second of energy, but only have the heat capacity to store 1 kilowatt-second. Numbers may not be to scale but you get the idea. I'm also not sure how this stays stable. I guess that if the gas gets hotter the process breaks down? Maybe the volume expands within the confinement and the reaction rate goes down as a result?

What is actually the plan to harvest the energy (how to get the heat to a steam engine or whatever and generate electricity?)

Deuterium/Tritium fusion produces neutrons that will escape the magnetic fields and hit the surrounding machine producing heat but also damage. This shield around the machine absorbing the neutrons will have a circulating fluid that will remove the heat to run a steam or other type of generator.

We don't really know how to build this part of the reactor right now, but some metals have the ability to self heal and people think this shield should be possible to build that has a decent lifespan. This is a part of fusion reactors that could be researched right now in parallel with other problems, but there has not yet been funding to build a high level neutron source facility to do the research. This is one fact of many that shows, I think, that national governments are not really interested in fusion energy succeeding.

Are the SNS or research reactors not able to simulate the necessary neutron flux?

My father was a plasma physicist who researched controlled fusion, and he told me that current neutron sources were not useful for this. I think it was a combination of the energy spectrum of the neutrons, the flux power, and the physical size/shape of the target area.

But surely the whole point of this is to eventually create a lot of steam, right? So hopefully it will have enough energy to do that.

Huge electromagnets. Perfectly tuned. With the power budget of a small town.

KSTAR uses superconducting confinement coils. It takes virtually no power for the confining field.

I work on a research machine that uses copper coils. It uses 16x 1000 HP train traction motors with 2000 lb flywheels for energy storage to power the confinement field for one second. :)

16x 1000HP is 12MW, so yes a small town of ~10,000 people. For a few seconds at least :-)
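Checking that conversion (using 745.7 W per mechanical horsepower):

```python
# Quick check of the conversion above: 16 motors x 1000 HP each.
HP_TO_W = 745.7   # watts per mechanical horsepower

total_mw = 16 * 1000 * HP_TO_W / 1e6
print(f"Total: {total_mw:.1f} MW")   # ~11.9 MW
```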

My point is that superconducting confinement coils don't take any power to maintain their field. You just spin them up and leave them there. The power needed to keep the cryogenics running is a hundred-fold less than the electrical power that resistive confinement coils need.

JET and DIII-D are the largest magnetic confinement fusion research devices made so far and they both use copper coils. They have confining fields of 3.45 T and 2.2 T, respectively. My machine operates at a modest 1 T (for now) and is much smaller. I'm sure the electricity bill that the confining coils that those large machines racked up is substantial.

Forgive my ignorance, but aren't cryogenics and plasma temps at complete odds to function in near proximity?

You're not supposed to let them touch. You won't go to space^W^W^W have fusion today if the magnetically confined plasma somehow escapes its confinement and touches the walls of the plasma chamber, let alone the cryogenics of the magnets.

Besides creating the pressure, the purpose of the magnets is to keep the _few grams_ of plasma away from the walls of the otherwise vacuum chamber.

Look up magnetic levitation, where you keep an object up using just a magnet. Now imagine huge magnets that push against an object from all sides, creating a ton of pressure. (Don't forget to include a whole bunch of fancy math to prevent the object from slipping away like a wet bar of soap you squeeze with your hands.) At no point should the object touch any of the magnets.

Also keep in mind that when you (don't) want to heat something else up, it's not the temperature that matters but the total energy of the object. And the total energy is basically mass × specific heat × temperature difference. So 1 gram at a million degrees is equivalent to 1 kg at a thousand degrees (for the same material), which is a lot less scary.

Thanks for the correction! I did wonder whether it is wise to comment when I don't know anything about that model.

Some designs for fusion walls consider molten sodium walls. Liquid walls have the advantage that they can’t crack, and you can use the absorbed energy to heat water and run an engine.

Crazy strong EM fields to make sure that the plasma doesn't get anywhere near anything!

A magnetic field?

Just buy a bunch of totinos pizza rolls.

In the original spider-man 2, this is what Doctor Octopus is trying to achieve with his metal arms before he goes insane (because... the arms were evil or something?). Perhaps Deep Mind could try outfitting self aware human exo-skeletons to control the magnetic containment of fusion reactions now that they're done with protein folding.
