- Was net energy produced during this time?
- How much plasma was there, i.e., at what scale was the experiment run?
- How dense was the plasma?
- How repeatable is the experiment? Could it be run again sustainably, or did it fry all the equipment with fast neutron radiation?
The article doesn't seem to mention any of these, and without them, "100 million degrees" is meaningless.
And even if you don't know the context, the article says it's the first in the world to achieve 100M for 20 seconds, good luck doing that in your microwave.
And the specific temperature of 100M degrees might seem arbitrary because it's such a funny-sounding amount, but it is actually roughly the temperature at which the plasma theoretically starts becoming net positive.
Whether it actually was net positive will obviously be the focus of the researchers for the next weeks, and even if they find the plasma generated energy consistently (which we'd be surprised about if it didn't), it still doesn't mean the reactor was net positive as a whole, because of inefficiencies.
They will not have fried their expensive reactor intentionally, so if they did, that will be sad news. The scale of the experiment is clearly visible from the video: it's a room-sized spherical-tokamak-style device. I'm pretty sure the density of the plasma inside those is a given, but I'm not 100% on the physics there.
KSTAR has set a new world record for ultra-high-temperature plasma operation every year since it first raised the plasma temperature to 100 million degrees in 2018 and maintained it for 1.5 seconds.
The project aims to maintain 100-million-degree plasma at KSTAR for more than 300 seconds by 2025. According to Director Yoon, "Most of the physical variables that act as obstacles to commercialization appear within 300 seconds after the plasma temperature rises to 100 million degrees," meaning that commercial development becomes possible only after reaching this stage. Director Yoon added, "We plan to maintain it for 30 seconds next year, and then achieve 50 seconds in 2023, 100 seconds in 2024, and 300 seconds in 2025 through facility upgrades."
Can you explain a bit more for the unlearned?
For example, a common way to reach "millions of degrees" is to pump the air out of a container to an extreme degree, then introduce trace amounts of a gas, and then accelerate or otherwise heat those few particles.
If this happened under normal pressures, there would be no container material that could withstand these temperatures. But in a near-vacuum, the collisions between the accelerated particles and the walls can be kept low. In an experimental fusion setting like this, they also employ a magnetic field to keep the plasma away from the container.
Aaand he had to break all of his beakers afterwards: https://www.youtube.com/watch?v=tGqVMbAQhBs
That'll get you into the temperature range. You can also up the voltage to get more exciting effects; fusion is fairly easy to achieve here. Just not net-positive fusion.
I can't create 100 tonnes of nuclear waste in my garage either. However, it's not cause for celebration when a giant institution does it.
As OP pointed out, with fusion -- net positive power output is what matters, not the temperature achieved.
"No other fusion power plant in the world has managed to run for more than ten seconds"
WEST (formerly Tore Supra) holds the record for continuous operation of a fusion reactor, at 6 minutes 30 seconds.
It's always frustrating when teams build fantastic projects but then make false claims that get propagated across the Internet. Don't get me wrong, 20 seconds for a new reactor is good, and they'll do longer. After all, the French did it in 2003.
However, I do agree it is unclear.
What's more - the scientific article about the Tore Supra 2003 test is paywalled, but based on the abstract it looks like it was a test of: "simultaneously heat removal capability and particle exhaust in steady-state fully non-inductive current drive discharges" and not a test of maximum sustained temperatures.
This is a great article on measuring fusion progress:
The Nature Papers on ARC show it's almost sure to work. So it's rather disappointing it'll take 5 years to build SPARC, and then another 10 for ARC. Seems to me the primary problems are manufacturing and scaling components like REBCO. Someone like Musk would look at this and go "ok, first thing, let's build our own REBCO factory, and even mine the materials if necessary ourselves to get cost efficiency". They're still approaching this as "big science" project, where they don't want to fail, and they're gonna subsist on grants and as much off the shelf providers as they can.
Given the climate crisis, the government should be doing the same as they did with CRS for space: award DOE contracts of $X billion for a working reactor, doled out to people doing SPARC or Thorium Molten Salt (LFTR) designs.
The innovation rate in Fusion is way too slow, we need to build and fail a lot more rather than sink $22 billion into multi-decade projects like ITER.
To use a physics analogy, gargantuan megaprojects like ITER develop their own self-gravitation that sucks in money and innovation, crushing engineers pitilessly under the weight of the bureaucracy.
To further abuse analogies, megaprojects suffer from something akin to the tyranny of the rocket equation: Because they're huge and expensive, they have to be broken up and doled out to disparate teams (countries even!), subcontractors, etc... Because there are many organisations involved, the friction of the interactions between them forces managers to plan ahead. Because planning ahead is required, only existing, established technologies can be used.
QED: It is not possible to do "innovation" with megaprojects, their mega size inherently prevents any possibility of true innovation occurring! The bigger they are, the less innovative they are.
I cannot emphasise this enough: Elements of the ITER project have been planned 30 years ahead! That's insanity. That's using 1990s technology in the 2020s! There is no possible path through ITER and then DEMO to achieve commercial power generation before 2050. None. There is not even a hope of such a thing.
Most critically, ITER used a legacy superconducting wire in their designs, which has a lower maximum magnetic field strength than more modern types. Because fusion power density scales highly nonlinearly (roughly as the fourth power) with field strength, this is the fundamental limit on achieving break-even fusion, but they were forced to start planning without the newer conductors.
They should have done what Tesla did: Focus on the batteries as the primary thing, everything else is secondary. ITER should have focused on the superconducting wires above all else for at least the first two decades of the project, before even thinking of actually opening a CAD program to design a Tokamak!
Part of the role of ITER was to fund the research to develop the reactor's components while giving everyone a concrete goal to work towards. The challenge with ITER is much more than construction.
"crushing engineers pitilessly under the weight of the bureaucracy"
"Elements of the ITER project have been planned 30 years ahead"
"There is no possible path through ITER and then DEMO to achieve commercial power generation before 2050. None"
"ITER used a legacy superconducting wire in their designs"
> Because planning ahead is required, only existing, established technologies can be used.
You can't plan to use nonexistent technologies when that tech (superconductivity) was so very little understood.
Your criticisms seem unwarranted and unverifiable.
That's exactly the problem. No one is allowed to fail so nothing is discovered.
And exactly how is superconductivity so very little understood? It's used in massive projects the world over from superconducting cables to maglev trains to numerous other things. And none of that has been figured out by ITER. ITER is the worst thing for fusion research in decades.
> No one is allowed to fail so nothing is discovered.
2nd para in wiki article: "It [ITER] is an experimental tokamak nuclear fusion reactor" (https://en.wikipedia.org/wiki/ITER). It is an experiment; it was never expected to have certain outcomes. You don't seem to understand that.
> And exactly how is superconductivity so very little understood?
That's not what I said. Here "when that tech (superconductivity) was so very little understood"
Note: 'was'. Not 'is'. (see edit below)
As for the progression in their understanding I'll leave that for a physicist to discuss. You aren't.
Edit: see https://en.wikipedia.org/wiki/Superconductivity#High-tempera... for a list of recent stuff (eg. last 15 years). And stop shouting about how stupid everyone else is.
I generally agree, but I think there are examples of some successful megaprojects that did innovate. You could probably consider Gemini + Apollo a single megaproject, as they were designed to run consecutively, building towards the Moon.
ITER is under MBA/Political management. Even if the management team had all the time in the world, they could not complete the project on their own.
Especially in ground-breaking projects, this makes a huge difference. With many options open, deciding on how to proceed requires cross-subsystem trade-offs and weighing many different risks and challenges against each other: things that will grind any non-technical leadership to a halt as they request piles upon piles of reports from their underlings.
But I’m more curious about your second statement. Could you back that up with some evidence? How is ITER’s management and leadership “MBA/Political”? Is it really different from the other projects’? Such a statement needs to be backed up with some evidence.
Maybe a bit conspiracy minded, but maybe the real purpose is to give nuclear physicists something to do, so that the brains to make nukes are still around in case we ever need to start doing that again?
Gemini and Apollo had a nice one - the space race, more specifically, the fear of being bombed into oblivion from orbital stations without being able to do anything about it.
There's just not much incentive today, it makes business sense to use that funding as long as possible.
And no, Tesla didn't start by selling batteries: they started by selling expensive cars which didn't need batteries at the same scale as a mass-produced car, which kind of undercuts the whole of your initial message.
I think the two posters meant that the number of people working on a single thing is negatively correlated with innovation. The analogy was that Tesla is less ground-breaking now compared to when it was experimental.
> The corrolary to your message is “don’t build mega-projects”
Mega-projects are more optimized for delivery, leaving less to luck, and that is beneficial in most cases. But the posters are saying that, given the vast scope of things that could be tried in the fusion space, mega-projects likely won't touch most of them.
Instead, you approach the megaproject as a series of prototypes, spending large sums of money in the hope of quick advancement and iteration cycles.
ITER is designed to be a test bed not the final design. Plenty of advancements have taken place after the design was essentially finalized, but none of them actually help test things like heavy neutron bombardment of a lithium blanket.
(*) The fundamental reason is that the power from the fusion reactor must radiate through the wall of the reactor, while the power from fission is transferred into coolant flowing through the reactor. As a result, the volumetric power density will be lower by roughly a factor of (diameter of fusion reactor) / (spacing of fuel elements in fission reactor), for equal power/area through the walls of the fusion reactor and fuel rods. ITER, for example, has a power density 400 times worse than a commercial PWR primary reactor vessel.
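The scaling in (*) can be sketched numerically. The wall heat-flux limit and the dimensions below are illustrative assumptions, not figures from the thread, but they reproduce the ~400x ITER-vs-PWR gap mentioned above:

```python
# Back-of-envelope sketch of the surface-vs-volume argument.
# Assumed illustrative numbers: a wall heat-flux limit q, a fusion
# reactor diameter D, and a fission fuel-element spacing d.

q = 1.0e6      # W/m^2, assumed heat-flux limit through any wall surface
D = 10.0       # m, assumed fusion reactor diameter
d = 0.025      # m, assumed fission fuel-element spacing

# For a roughly spherical volume, surface/volume = 6/D, so the
# surface-limited volumetric power density scales like 6*q/D.
fusion_density = 6 * q / D    # W/m^3, heat escapes through the reactor wall
fission_density = 6 * q / d   # W/m^3, heat escapes into coolant between fuel rods

ratio = fission_density / fusion_density
print(f"fission/fusion power density ratio ~ {ratio:.0f}")  # ~ D/d = 400
```

The key point is that the ratio depends only on D/d; the assumed flux q cancels out, which is why the argument survives changes in wall materials.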
Fission is so expensive mainly because of safety and regulatory concerns: radioactivity scares people, and so safety requirements are cranked up to 11: everything is overdesigned just so that current regulator is confident enough (next one might one up anyway), lots and lots of concrete and steel is used for containment etc. Additionally, the whole construction process is hopelessly bureaucratic and takes forever, requiring inspections of inspections, redesigns for things that in, say, coal plants would just be redone differently by contractor on site and so on. We most definitely could make fission significantly cheaper, and in fact it has been so in the past.
Now, fusion does not have similar safety requirements. If the plant blows up for some reason, there's no significant radionuclide contamination. Therefore, there's little reason to require safety standards significantly higher than those for coal plants (the boilers of which can also catastrophically blow up). This makes it significantly cheaper, due to reduced material use and construction time/labor for containment, and reduced regulatory delays.
Your point about power density is also good, but I don't think it matters very much in practice. Sure, fusion reactor might have significantly lower power density than fission reactor, but so what? Renewable plants are even worse when it comes to power density, yet it doesn't stop them from being rolled out. We don't lack space for fusion plants, even if they are much larger than equivalent fission ones.
But in a fusion reactor, all the components also have to be highly reliable, because they will be so radioactive that repair will be difficult or impossible. The stuff that's hot in a fusion reactor will be far more complex than the stuff in a fission reactor, which is technically very simple.
The reliability is costly, regardless of the reason for it.
Of course power density matters. The cost of manufactured items is related to their size, and fusion reactor cores will be far larger than fission reactor cores (as well as being far more intricate). Are you saying this 10-100x larger size will NOT be reflected in the cost?
Renewables have lower power density, but also allow omission of entire parts of a thermal power plant. No steam turbines, no heat exchangers, no cooling systems. They also tolerate far less reliable components, because failure affect just a small part of the overall output of a field of turbines or PV modules.
Tbh I may be way off base but I was really surprised at how much money they spent on things.
I would have worked on training the monkey to press buttons the same way they domesticated the fox in Siberia, but it would take too long and they didn't want to deal with the hard regulatory stuff.
By the time SpaceX was founded, we'd been shooting commercial rockets into space for half a century, and had flown humans to the Moon multiple times. Applying modern technologies allowed the next logical step, a cheaply reusable first stage (expensive reusable space systems had existed for decades, like the Space Shuttle).
Fusion is nothing like that.
We can reliably detonate fusion-based nuclear bombs. That's mostly it. We don't have a reliable reactor that would run anywhere close to breaking even (a fusor does not count). We have no experience sustaining a plasma for more than a few hours before losing it to internal instabilities. We certainly have nothing similar to commercial energy generation fueled by fusion. A plethora of approaches exist because no common design is known to reliably work.
This is not a moment for a SpaceX. This is not a moment for an Apollo effort. I'm not even sure it's a moment for a Manhattan project type of effort. We still seem to know too little to spearhead an engineering effort. I'd be glad to receive a qualified refutation of my points.
So without a doubt, this is just an engineering problem at this point. A prototype device that didn't have to be maintained could be built exactly like any other tokamak, just with REBCO magnets, and you'd have the required fusion power.
To get the required electric power, and to add in maintainability, you need to add the FLiBe liquid blanket, replaceable toroid sections, and a heat exchanger; but again, these are known engineering problems with known solutions, akin to knowing that landing a rocket is possible but still needing to figure out the grid fins, heat shield, landing legs, and guidance software.
Fusion devices on a small scale have been proving the plasma physics since the 90s. We don't have fusion yet simply because of people being conservative and sticking with traditional "known to work" magnets, which is why ITER is a huge, hulking monstrosity.
As Dr Whyte says, cost scales with size, and the reactor volume can be reduced by roughly a factor of 10 from ITER simply by using different magnets, so ARC will produce the same power as ITER, but significantly smaller and cheaper.
The basic papers have been peer reviewed by Nature, and the science and engineering have been found solid, and thus, the economic arguments as well. Ergo, this is exactly like the SpaceX scenario vs ULA and NASA (e.g. SLS). Small allows fast iteration, fast innovation. ITER's going to spend $20 billion and 20+ years of development just to turn around a realize that the follow on to ITER will need to use HTS magnets and a smaller design.
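The R-vs-B trade-off behind that argument can be sketched with the usual rule of thumb that tokamak fusion power density scales roughly as B^4, so total power goes like B^4 * R^3 at fixed shape. The field and radius values below are approximate published design points, not exact figures:

```python
# Rule-of-thumb sketch: tokamak fusion power P ~ B^4 * R^3 at fixed
# shape and plasma beta. Field/radius values are approximate design points.

B_iter, R_iter = 5.3, 6.2   # T, m -- ITER toroidal field on axis, major radius (approx.)
B_arc = 9.2                 # T -- ARC with REBCO (HTS) magnets (approx.)

# Holding P ~ B^4 * R^3 fixed, R can shrink as (B_old / B_new)^(4/3):
R_arc = R_iter * (B_iter / B_arc) ** (4 / 3)
print(f"ARC-scale major radius ~ {R_arc:.1f} m")  # ~3.0 m; ARC's published R is ~3.3 m
```

So the linear shrink from the stronger magnets is roughly 2x, and the ~10x figure refers to volume (and hence, roughly, cost), since volume goes with the cube of the linear scale.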
Nope. The record for Q is 0.67, set by JET in 1997. The whole justification for ITER is that it would be able to do Q > 1.
Why do you think that net power has been achieved?
The basic problems of power density and complexity have not been solved. Putative REBCO magnets do not solve these problems. Yes, these are engineering problems, but engineering problems are perfectly capable of being fatal. After all, fission has faced only engineering problems since 1942, and yet still has failed.
And this is why fusion is such a boondoggle. It's similar to fission, only even more complex and expensive. It's like the opposite of Keep It Simple, Stupid.
What you're missing is that cost scales with volume. If you use a technology that allows you to reduce your reactor size by a linear factor of 2, you reduce its volume and cost by a factor of 8.
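The cube law is simple but worth making explicit; a minimal sketch:

```python
# The cube-law point, made explicit: cost tracks volume, and volume
# tracks the cube of the linear scale.
def volume_ratio(linear_reduction):
    # Shrinking every linear dimension by this factor shrinks volume
    # (and, roughly, material cost) by its cube.
    return linear_reduction ** 3

print(volume_ratio(2))   # 8: halve the size, 1/8 the volume
print(volume_ratio(10))  # 1000: a factor-of-10 volume claim implies ~2.15x linear shrink... or vice versa
```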
Likewise, while ITER is uber-expensive like large PWRs, ARC is 10 times smaller and a factor of 30 or more cheaper. It's also not as complex as some fission reactors, the most complex elements being the HTS magnets and the blanket.
This is the problem with ITER and PWRs, they have convinced the public that nuclear energy is inherently dangerous and expensive and impossible to make substantially safer, cleaner, and cheaper, but that is false.
You should look up the history of LPWRs vs MSRs from the guy who invented them. There's a documentary on the Oak Ridge Molten Salt Reactor Experiment. He himself thought LPWRs were too dangerous, requiring ginormous size and shielding, but the US government and its cronies preferred LPWRs, and so MSRs were abandoned despite successful trials.
> Likewise, while ITER is uber expensive like large PWRs, ARC is 10 times smaller, and a factor of 30 times or more cheaper.
The ARC design is smaller than ITER, but it's still very much lower (a factor of 40) in power density than a fission reactor.
Here's some links showing power densities inline with PWRs, or greater:
"Extrapolating from other liquid-to-liquid heat exchangers, roughly 150 MW/m3 is a practical upper limit to the power density of a primary-to-intermediate salt heat exchanger. For comparison, pressurized-water reactor core power density is ~110 MW/m3."
Furthermore, since the power must be extracted by intercepting high-energy neutrons after they have blasted through the plasma focusing apparatus, such a plant would destroy its most expensive parts in a short time, requiring frequent rebuilding -- by robots, because of induced radioactivity at the work site.
There has never been any expectation of practical commercial power resulting from plasma-confinement thermal-neutron fusion projects. Their only plausible budgetary justification is as a jobs program for high-neutron-flux physicists. They are called for by military policy as a population to draw personnel from for weapons work.
Coal has always been cheaper than fission. Solar and wind power costs are much less, and continue falling; and numerous viable energy storage methods, all demanding no fundamental research -- gravity (solid and aqueous), underground and underwater compressed air, LH2, ammonia, methane from captured CO2, even powdered iron -- vie only for which will end up cheapest to deploy and operate.
If a commercial reactor were ever built and made to work, it would take many decades just to break even on the money already spent to date, before beginning to pay down its own construction -- even presuming it could find a market for its output. In practice, without continued public subsidy as long as it operated, its debt would continue to grow indefinitely.
The only responsible course for the future is to cancel all publicly funded thermal-neutron fusion work, immediately.
For example, in the cloud tops of Venus, an air turbine may be constructed from a very large, balloon-supported polypropylene fabric tube with an air turbine on top as its only moving part, and a no-moving-parts naked atomic pile suspended near the bottom, heating air directly -- by neutrons colliding with atmospheric CO2 -- that rises to drive the turbine. Some of the air would become radioactive, but so what?
In free-fall, a reactor at the end of a long-enough tether would present no risk of environmental contamination or user safety, and so could dispense with both the containment vessel and shielding. Even there, though, pB seems a more attractive goal than thermal neutrons, because destroying your plasma focus apparatus every few months would get tedious.
Moreover, if you can fit a compact reactor (fission or fusion) on a spaceship, then you can run nuclear thermal (Isp > 800 s), or nuclear ion engines with extreme efficiency (Isp > 15,000 s!). Considering NASA's Kilopower reactor weighs 2 tons, assume a dry mass of 10t and a payload of 2%, and you end up with a 500t ship (wow, do we even have 498 metric tons of xenon in the world?), but with a delta-v of 500,000 m/s. That gets you to a Pluto flyby in 99 days. (I didn't count the reactor fuel weight; I'm not sure how much uranium is needed for an Isp-15,000 burn, I was just interested in how much delta-v 15,000 s of Isp buys you.)
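For what it's worth, that delta-v estimate checks out roughly under the Tsiolkovsky rocket equation. The masses below are the commenter's assumptions (10 t dry, 500 t fully fueled); the result lands a bit above the quoted 500 km/s:

```python
import math

# Tsiolkovsky check of the delta-v above. Masses are the commenter's
# assumptions; Isp = 15,000 s is the stated ion-engine figure.

g0 = 9.80665       # m/s^2, standard gravity
isp = 15_000       # s
m_wet = 500_000    # kg, ship fully fueled with xenon
m_dry = 10_000     # kg, everything that isn't propellant

delta_v = isp * g0 * math.log(m_wet / m_dry)
print(f"delta-v ~ {delta_v / 1000:.0f} km/s")  # ~575 km/s
```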
SpaceX starship, if it works, could launch this in about 4 refuelings. (>100-125t to LEO)
Nuclear reactors would be wonderful for use on Titan, btw. Simple open Brayton cycle systems would be highly effective at quite reasonable core temperature.
This is one of the great things about Elon Musk -- he looks at things from first principles. When it looked like he needed 1000 Gigafactories to reach his goal, he went back to the drawing board on decades of battery manufacturing, to come up with the Terafactory.
When carbon composite rockets looked to be too hard to manufacture, and had too many other drawbacks despite the clear wins in strength-to-weight, he dropped them and went with stainless steel. I'm pretty sure if he were staring at the basic tokamak equations, he would have long ago seen the economic and schedule dependence on R, and the massive benefits of improving B, and that would have driven him to look for alternatives.
Too many researchers engage in sunk cost / dollar auctions, doubling down on obvious wrong approaches, because they've already spent so much.
This is also why DeepMind won CASP again, because rather than try to iteratively improve their 2018 result, they dropped it as a dead end and restarted with a completely new approach, even though the 2018 approach was also a revolutionary breakthrough.
What I'm getting at is, being able to let go of your previous projects if they are stalling, with huge cost over runs, and a mediocre future return, is a value we should seek in fusion research, and elsewhere.
(That prescient article has stood the test of time, except that advanced fuels were mostly ruled out also.)
The claim that neutrons will damage the metal structure is also handled by many aspects of the design: easy reparability thanks to modular, segmented magnets, being completely submerged in the liquid blanket, low activation steel for structure, and the pulse design which runs the reactor in a way that reduces overall temperature of the walls.
Plus, he brings up the ole aneutronic fusion dream, except that creating a confined Boron-11 plasma is much more difficult, outside the realm of tokamaks, and typically proposes using laser inertial confinement fusion at the petawatt scale.
More than 20 papers were published on ARC, heavily vetted in journals like Nature, and pretty much all of the peer review concluded the design will work, and if it does work, then it will be much cheaper than ITER, simpler than ITER, and thus refute your paper.
We'll know in about 4 years.
The conclusion he reaches, that fusion reactors will have a volumetric power density at least an order of magnitude lower than a fission reactor, is not contradicted by ARC. The volumetric power density of ARC (from the ARC paper I linked to elsewhere in these comments) is 0.5 MW(th)/m^3 (counting the volume of the entire reactor, not just the plasma) vs. 20 MW(th)/m^3 for the primary reactor vessel of a PWR -- a factor of 40 worse.
His suggestion to look at aneutronic plasmas didn't pan out, as you note, but that doesn't save DT fusion. I'll note that it's his PhD student Todd Rider who shot down the non-Maxwellian schemes that aneutronic fuels would have needed. After that, Lidsky switched to working on fission for the rest of career and life.
We're talking economics of energy. PWR fission reactors have enormous footprints, averaging 1 square mile. An ARC reactor wouldn't need anywhere near that size. The very nature of using pressurized water means your containment facility has to be a factor of 1000 larger than the reactor vessel to handle a flash boil-over.
Current costs to build a 1GW fission plant are $6-9 billion. Predicted costs to build a 200MW ARC reactor are $4-5 billion, so matching one fission plant's output means building 5 ARC reactors for $20-25 billion, roughly 2x-4x the cost per watt. The economics are still viable, and your costs for storage, disposal, and decommissioning are far lower, as is your land footprint. Note that these ARC costs are for the first reactor, include the sunk development costs, and don't include the economies of scale that would come from dropping HTS costs.
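Putting those numbers on a per-watt basis (a quick sketch using only the figures quoted above, which are rough estimates to begin with):

```python
# Rough cost-per-watt comparison: $6-9B per 1 GW fission plant,
# $4-5B per 200 MW ARC (first-of-a-kind, development costs included).

fission_low, fission_high = 6e9 / 1e9, 9e9 / 1e9   # $/W
arc_low, arc_high = 4e9 / 200e6, 5e9 / 200e6       # $/W

print(f"fission: ${fission_low:.0f}-${fission_high:.0f}/W")          # $6-$9/W
print(f"ARC:     ${arc_low:.0f}-${arc_high:.0f}/W")                  # $20-$25/W
# Best-case vs worst-case overlap of the two ranges:
print(f"ARC premium: {arc_low / fission_high:.1f}x-{arc_high / fission_low:.1f}x")  # 2.2x-4.2x
```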
> It's all about the limits on areal power density at the reactor walls.
No, he makes lots of assumptions about how the magnets are shielded, how breeding works, and how heat transfer works. All of these are addressed by FLiBe, there's no "structure" to the heat blanket to damage, it acts simultaneously as radioactive shielding for the magnets, heat transfer, and a breeder. The "blanket" in ARC isn't a surface sandwiched with a super structure, it's a volume that's continuously pumped and recirculated.
He also makes the implicit assumption that reactor wall damage is a huge problem. Instead of seeing it as a problem, you should see it as a solution, like an ablative heat shield on a rocket or capsule. As long as the cost of maintenance or replacement is low, and reactor downtime is limited, it is not a problem. Current estimates for ARC, which is designed to be modular, so it can be easily maintained, is that the reactor walls will last about 1 year.
He doesn't even mention the biggest unsolved engineering problem, removing the "ash" (Helium) that accumulates in the plasma.
His article has all the trappings of reusable rocket naysayers in the 80s and 90s, who claimed reusables could never work, because carrying wings, landing legs, heat shielding, and structural reinforcement to make rapid reusability would cut down on your usable payload, and thus doom reusable rockets to always be substantially inferior. And this is because of an obsession with a single variable, like power density, or payload fraction, and ignoring the end to end economics of the entire system, including safety margins, maintenance, footprint, etc.
And that's why I believe ARC will prove Lidsky wrong.
That's per volume of PLASMA, not volume of REACTOR. Since we're talking about the size of the things we have to pay for (the reactor) the latter is appropriate.
> Counting just the PWR reactor vessel, and not the entire containment facility, seems a huge cheat.
Fusion reactors will also require containment vessels, to prevent escape of tritium in accidents (unless you're OK with an accident rendering ground water in a large area around the reactor undrinkable for decades). The containment vessel will have to be large enough to contain all the cryogens heated to accident temperature. Unlike in a PWR like the AP1000, these volatiles cannot be condensed by water cooling. In ITER, for example, complete volatilization of the cryogens would require several times the volume of the AP1000 containment building.
The ARC design adds to accident fun by having a large volume of titanium hydride shielding near the molten salt. TiH2 decomposes to Ti and hydrogen gas at 450 C, well below the temperature of the molten salt blanket. Fully decomposed, this TiH2 would release 15 tonnes of hydrogen, occupying 180,000 cubic meters at STP.
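A quick ideal-gas check of that hydrogen figure: 15 tonnes of H2 comes out to roughly 167,000 m^3 at 0 degC and 1 atm, consistent with the ~180,000 m^3 quoted if a slightly warmer reference temperature is assumed:

```python
# Ideal-gas estimate of the volume of 15 tonnes of H2 released by
# fully decomposed TiH2 (tonnage from the comment above; the reference
# temperature here is an assumption).

M_H2 = 2.016e-3    # kg/mol, molar mass of H2
R = 8.314          # J/(mol K), molar gas constant
T = 273.15         # K (0 degC); a warmer reference temperature gives more volume
P = 101_325        # Pa (1 atm)

n = 15_000 / M_H2          # mol of H2 in 15 tonnes
V = n * R * T / P          # m^3, ideal gas law V = nRT/P
print(f"~{V:,.0f} m^3")    # ~167,000 m^3 at 0 degC
```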
Beyond that, the containment building around a fusion reactor will have to mitigate tritium leakage by many orders of magnitude. This will be expensive (in particular, polymer seals cannot be used on penetrations as they permeate tritium.)
ARC will also require a large volume for disassembly of the reactor for vacuum vessel replacement, and for remote control demolition of activated vacuum vessels for disposal. All that volume will become contaminated and off limits to human access.
> PWR fission reactors have enormous footprints, averaging 1 square mile. An ARC reactor wouldn't need anywhere near that size.
Really good farmland in the US is less than $10K/acre, or about $6M/square mile. Cheap range land is an order of magnitude less; really useless land another order of magnitude cheaper. The cost of land there is a very minor part (0.1%) of the cost of the power plant.
> No, he makes lots of assumptions about how the magnets are shielded, how breeding works, and how heat transfer works.
Again, you have not understood his argument. His argument would apply even if you had 100T magnets made of unobtainium and a plasma configuration with beta = 1. All that he requires is an upper limit on how much energy per area can be transmitted through the surface of the reactor. This is a function of the materials of the first wall, and has absolutely nothing to do with magnets or breeding.
> Predicted costs to build a 200MW ARC reactor are $4-5 billion
If you are referring to the cost estimate from the arxiv paper, that is just the estimated cost of the reactor itself, not the power plant containing the reactor. I will admit I found the estimation methodology in that paper very cursory, so it's possible they were overestimating the cost (80% of the cost was the stainless steel magnet support structure.)
A fission reactor core is much cheaper than that estimate, per MW.
> He doesn't even mention the biggest unsolved engineering problem, removing the "ash" (Helium) that accumulates in the plasma.
And why should he? I mean, why does he have to kill the corpse a third time (after the power density and reliability arguments)?
> And that's why I believe ARC will prove Lidsky wrong.
As I said earlier, the power density figures from ARC show he was right.
I think that's the point. Ground-breaking innovations like fusion require on-ramp costs with questionable returns that no private company is willing to invest in.
Putting a man in space was proven decades ago; it seemed natural to commercialize it once there was a business model, even if that was only collecting government subsidies. Not to diminish the outstanding achievements of SpaceX.
Tesla / SpaceX / Elon focused on ground-up physics principles. They asked things like: how can we make a better battery pack, and will the math then work out for a good EV? They showed to their satisfaction that the answer was yes, and built up from there. Their battery / powertrain work was consistently good before any of the huge capital efforts, and the path forward was pretty well defined.
Same with their rocket engines. They iterate and qualify pretty quickly.
ITER and these folks are spending billions on what is actually ancient tech with really no realistic (I don't think) path to sustainable commercial power even theoretically in any reasonable time horizon. This is the Space Launch System approach, gear up and go on old tech.
We do need someone in a lab (Tesla / SpaceX style) iterating, and then proving out key physics / components. The problem with ITER is that you CANNOT stop funding it; it just keeps gobbling up all the money because of the number of jobs now tied to it. It's a jobs machine, basically.
At SpaceX it doesn't feel like there are a lot of sacred cows. These mega projects feel like pure sacred cow sometimes. It's private / personal capital driven.
We know rockets work inside an envelope we can see; it's a matter of how to do it cheaply. What we know, today, is that fusion doesn't work in envelopes much smaller than ITER.
Besides, ITER is also meant to help develop experience in handling large plasmas for extended periods of time, that knowledge will transfer to other designs.
Also, ITER won't demonstrate anything until 2035. By then, several smaller-scale designs will have already demonstrated break-even.
Small prototypes don't mean immediate progress in fusion, the original tokamak was hailed as great progress in 1958 and it was less than a metre across.
Until we have perfect understanding of Magnetohydrodynamics and materials science one simply cannot reliably predict the behaviour of completely new devices.
It's like saying a 5nm CPU design is somehow the same as a 32nm CPU just because they have the same die size, even though the 32nm version has far fewer transistors.
. <-- the point
--> your comment
Arc is doing what JET and Alcator C-Mod did (with a radius very similar to JET), just with a much more power efficient confinement system, so the input power is lower, and the magnetic field is a factor of 2.6 stronger. The basic physics show that stronger fields actually make the plasma more well behaved, not less, so the major variable vs JET (a proven existing tokamak with Q=0.67) that's changing is not likely to lead to surprising results.
Even the Raptor engine's combustion cycle, though it was never widely used, was first built by the Russians decades ago. The concept is known to work.
Hardware-wise, SpaceX did a great job gathering the best of what had already been done and proven. Friction stir welding? Used to build Delta II, Delta IV, Atlas V, Space Shuttle tank and more. Pintle injectors? Lunar Excursion Module. Semi-balloon tanks? Proven on Atlas I and II, and Centaur. Then, once they had something that flew, and a contract from NASA to keep it flying, they iterated on it to make it reusable. Much like the story of Boeing with the KC-135 and 707, a key early contract was enough to support the program and subsidize further development.
They iterated starting from a LOX/RP-1 rocket engine cycle that has been proven to work and in widespread use for half a century. Fusion is totally different.
SpaceX's value add is eventually providing economies of scale.
This has become a popular line of contrarian bunk as of late.
I am an aerospace engineer. I am also a Tesla valuation sceptic. The work SpaceX has done is revolutionary.
In materials, numerical methods, logistics and technology, they have pioneered. Common knowledge in 2001 was reusable launch could not compete with expendable vehicles. SpaceX overturned that paradigm with a fraction of the competition’s cash. Ignoring that is to reduce the problem to a simplistically technical level that detaches entirely from reality.
It was well argued by the alt.space community (the likes of XCOR, Beal Aerospace, Armadillo Aerospace, Scaled Composites, etc.) that this is not the case. SpaceX is the most successful of this line of groups, and better equipped than many, though not all.
> The work SpaceX has done is revolutionary.
Yes, but mostly not in technology; rather in execution and optimization in an unusual market area. Success over the space dinosaurs says more about the sad state of the competition, though Elon was correct in identifying the deficiency and improving on it.
They had a hypothesis. It was untested and, by any reasonable definition of common expert knowledge, a long-shot bet.
Saying this list's existence disproves SpaceX's technological work is like citing home brewers as evidence that Apple did nothing groundbreaking, or Alberto Santos-Dumont as evidence that the Wright Brothers were just incremental inventors.
Yes, they weren’t the only ones on the bleeding edge. But they were right there on it, and doing the lion’s share of the pushing of it.
> better equipped than many, though not all
SpaceX is miles ahead of all of them. Literally, hundreds of thousands of miles of flight time ahead. Ahead in vehicle manufacturing. Number of engineers working on number of problem domains. Hell, out of your list, only Scaled Composites still builds anything.
If we say SpaceX does no science, we must argue that the Japanese and Indian space programs do no science. Which nobody claims, because it's baseless. Designing new motors and engines and control systems and crew capsules and space suits and hypersonic reëntry systems requires new science, technology and production expertise.
We can say Newton had a hypothesis regarding an artificial Earth satellite, which took centuries to observe experimentally. But regarding powered descent, the Delta Clipper flew years before SpaceX's Grasshopper tests, and the Northrop Grumman Lunar Lander Challenge showed it again. The untested part was actually landing the stage of an orbital rocket: a difference in usage, not in fundamental technology. I wouldn't say it was a long-shot bet.
> But they were right there on it, and doing the lion’s share of the pushing of it.
They were practically alone in the commercial space, but technologically their achievements were, I think, more incremental than the Wright Brothers'.
I'm not saying SpaceX did nothing worth mentioning. It's a case, IMO, of standing on the shoulders of giants when the other commercial giants became too lazy to look forward. Indeed, the status quo was rather beneficial to Boeing-Lockheed, while Musk was interested in longer-range projects. But Musk wasn't operating in a vacuum by a long shot. Andrew Beal could probably have reached similar results in a slightly different scenario.
> SpaceX is miles ahead of all of them.
"Better equipped" in this context is mostly "had almost a hundred millions to spare on first project development". That kind of money was't available to XCOR, Armadillo Aerospace, Masten Space - but probably was available to Beal Aerospace and certainly to Blue Origin (which is somewhat to the side of this list), maybe to Kistler Rocket... It's not about ideas and achievements, it's about starting conditions.
> Hell, out of your list, only Scaled Composites still builds anything.
It's almost two decades since the founding of SpaceX, and this is a capital-heavy area, so of course only the commercially successful survive. Blue Origin is practically the only exception. That says nothing about technological advantage.
> If we say SpaceX does no science, we must argue that the Japanese and Indian space programs do no science.
SpaceX does applied science, targeted towards fairly immediate applications in commercial systems, be it Raptor or Starlink. Here at least we can agree that a degree of scientific work is required for projects of that kind.
And saying this is about money and resources, and not focus and management, ignores the fact that SpaceX spent less than ULA and Blue Origin to develop 2 rocket engines, 3 rockets, three different recovery systems, and 2 different capsules. Blue Origin has been spending $1 billion a year of Jeff Bezos's money, and despite starting before SpaceX, still hasn't reached orbit or beaten the Falcon 1.
Rocket Lab's Electron made it to orbit with $210m of total funding, and they built their own engine with an electric pump. Far smaller than many other companies.
The reason SpaceX is so successful is that for Elon Musk it isn't a vanity project, or a way to make money taking a slice of the LEO sat launch market. He has a vision, call it lunacy, of getting huge numbers of people to Mars. Working backwards from that goal informs everything they do, which is why they won't settle for an incremental improvement on what ULA or Ariane is doing.
It's like saying that if you had given equal money in 2001-2006 to someone to build better mobile phones, they would have produced an iPhone, because most of the components were there. I worked in the mobile market at that point, and I can tell you most people were chasing Blackberry, or imagining slightly better versions of, say, a Nokia Communicator 9000, iPAQ Phone, or OQO. The iPhone took a vision, and someone willing to tell the naysayers and corner cutters that he didn't care about their objections; this is what they were doing.
For TEN YEARS on USENET, I read sci.space as people like Henry Spencer and other luminaries decried reusability. They rightly said that any wings, landing legs, or extra strengthening and margin you add to enable repeated flights comes out of your payload. The consensus of the time would have been to tell Musk his ideas wouldn't work and that the physics say disposable rockets are the most efficient.
But marginal cost matters more than Isp or payload fraction, and if you have a system where the majority of your costs are amortized over many launches, lower payload is less relevant; you just do more launches.
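A toy amortization model makes the point. Every number below is hypothetical, chosen only to show why a reusable booster with a payload penalty can still win on $/kg:

```python
def cost_per_kg(build_cost, refurb_per_flight, flights, payload_kg):
    """Amortized launch cost per kg over the life of one vehicle."""
    total_cost = build_cost + refurb_per_flight * flights
    return total_cost / (flights * payload_kg)

# Expendable: a new vehicle every flight, full payload capacity.
expendable = cost_per_kg(50e6, 0, 1, 20_000)    # $2,500/kg
# Reusable: ~30% payload penalty, 10 flights, modest refurbishment cost.
reusable = cost_per_kg(50e6, 2e6, 10, 14_000)   # $500/kg
print(expendable, reusable)
```

With these (made-up) numbers, the reusable vehicle is 5x cheaper per kg despite lifting 30% less per flight: the build cost is spread over ten flights instead of one.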
Ultimately, SpaceX will prove the naysayers wrong, as surely as it proved the establishment and ULA's allies wrong.
How can we say anything about SpaceX's efficiency with cash if there are no public statements available and even investors have trouble getting any financial data at all?
I was going to say to look at its endless-money-pit counterpart: SLS from ULA. Nothing will ever compare to that monstrosity that will never work: $7.2 billion and growing.
Also, I'm not an engineer, but I went to interview at both Tesla and SpaceX, and SpaceX's internal facilities are jaw-dropping. People love to go on about Tesla's Fremont factory, which is awesome, but I've been to both Fremont and Boca Chica, and Boca is by far more impressive. I lived and worked in Emilia-Romagna in Italy, and also lived in the town nearest to Stuttgart in Germany, so I've seen my share of amazing engineering and logistical/distribution feats.
I saw them starting to stack what was to become SN3 (RIP) in real time in the hangar, while SN2 stood vertical on the launchpad; that still remains one of the coolest things I've ever seen in my life. I was basically leaning up against one of the Raptor engine prototypes they have pretty much just lying around.
I've also caught a few glimpses of the Hawthorne factory (from the outside) while charging my Model 3. Lots of cool stuff comes out of there (namely the Falcon 9 and the Merlin and Raptor engines), and they have so much cool stuff just sitting at the side of the building, which I took pictures of but was asked by security to delete, and I complied.
Just one note on that: Boeing is the company building the SLS core stage, not ULA. While ULA is partially owned by Boeing (joint venture between Boeing and Lockheed Martin), they operate independently.
ULA can't quite match SpaceX for style or cost these days, but they are consistently much better at what they do than Boeing is. Their launch success record is extraordinary, and their rockets provide certain capabilities that Falcon 9 and Heavy can't match without adding a kick stage to the payload.
How long ago? Did you get to hang out and eat at the mezzanine?
As for how daunting: maybe it's just because I've been into motorsports for most of my life, and aerospace is the industry people most often transition to from automotive, but I knew (as much as anyone not hands-on with the project could, I suppose) what an immense feat that was. Also, I was at the Mars Desert Research Station and got to ask as many questions as I wanted about it, and I'm a member of The Mars Society.
BUT I still get giddy and goosebumps going back and watching highlights of the first successful landing, which I saw on the streamcast with a friend who is now doing his Masters in Iceland and wants to be in the first wave of Martian colonizers because of that event.
I also saw the first landing mission at Vandenberg in 2018, the one that lit up the sky and made people all over SoCal think it was 'aliens'. We saw it from the Eagle's Nest, where a bunch of NASA and SpaceX guys watch from, and met one of the non-crew Dragon capsule pilots, and you could still get an idea of the scale of it from up there, as I've seen the F9 too.
But SN2 at Boca was breathtakingly massive standing in its full glory on the launchpad, and to think that this too will be reusable and will take humans to Mars makes me think the world still makes some sense.
I’d love to see one of the SNs at some point, they look like they dwarf the F9.
Similarly, we have good estimates of how much it took to develop the Falcon 9, up until its first launches. I think we have rough enough data to make some comparisons.
SpaceX provides audited financials to its (preferred stock) investors. They also highlight them every funding round.
I don't know if it was state-of-the-art but it looked impressive to me as a layperson.
Disagree: the theory of full-flow staged combustion, and test-bench results, were known for many years. With fusion, by contrast, we only have experience with the uncontrolled kind (bombs), or the theory of how stars operate (and we can't reproduce that mechanism: not enough gravity). There is a lot more science left to discover in fusion than in Raptors.
The analogy holds!
Starting fusion isn't easy, but as I understand it, that's not even the hard part. Containment is the big deal, because of the energy needed.
If you take 800 cubic meters of plasma from the core of the Sun, about the same volume as ITER, it only produces 216 kW - comparable to the engine in your car.
And in that volume, you will have an average rate of 1500 hydrogen fusion reactions per second. Compared to ITER, this is something like twelve orders of magnitude slower. That is why ITER needs to go to much higher temperatures than the sun.
The power density is correct, but the number of reactions is too low by a factor of 10^10...
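A back-of-envelope check, assuming the textbook ~276 W/m³ average power density of the solar core and ~26.7 MeV released per completed proton-proton chain, supports the correction above:

```python
# Back-of-envelope check on the Sun's core figures in this thread.
MEV_TO_J = 1.602e-13              # joules per MeV
power_density_w_m3 = 276          # solar core average (textbook value)
volume_m3 = 800                   # ITER-ish plasma volume
energy_per_chain_j = 26.7 * MEV_TO_J   # full p-p chain: 4 protons -> He-4

power_w = power_density_w_m3 * volume_m3        # ~220 kW, close to the 216 kW quoted
chains_per_s = power_w / energy_per_chain_j     # ~5e16 completed chains per second

print(f"{power_w/1e3:.0f} kW, {chains_per_s:.1e} p-p chains per second")
```

So the power figure holds up, while the original "1500 reactions per second" is indeed many orders of magnitude too low.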
Getting fusion energy without gravitational confinement is the tricky bit.
You can't iterate your way from fission to fusion, but you can iterate your way from the Merlin to the Raptor.
The Raptor which, by the way, is not even that far ahead of what came before it.
In some sense Earth's rocketry practically did do that: from Goddard's 1926 pressure-fed and piston-pump feeds, via von Braun's open cycle of 1944 and the Isayev-Melnikov staged combustion of 1949-1959, to Glushko's full-flow combustion of 1967.
Slow, and engines were designed from scratch. But still some iterations and evolution.
Hopefully you (like me) were one of the ones cheering them on, but achievements that seem obvious in retrospect usually come with notoriously shaky claims about how obvious they were.
Obviously the proponents are vindicated today, but it becomes a dodgy game of definitions when talking about how everyone who really had any insight knew that it was inevitable.
What's fascinating about this is how NO ONE was talking about them getting the first stage to be reusable. Musk's plan looks like it exploited this blind spot. Reusing the first stage was so obviously impossible that even with the evidence of nine first stage engines staring them in the face, they thought this was just to minimize development cost. It was only when the legs went on that the penny dropped.
And now, no one else will be able to repeat the free R&D flights that SpaceX got by trying to recover first stages off expendable launches sold at expendable launcher prices.
One really has to be meticulous to stay honest when looking at things in retrospect. And, by extension, when looking at lofty-sounding predictions about the future.
Fusion power still needs to overcome fundamental, hard physics problems.
Talk to someone in physics closer to this. The time scales ITER operates on are so long that the tech is ancient by the time they are actually running it, and the path to commercial power is basically not there, I think.
It is a research reactor. It goes first to prove it is at least possible to build a power producing reactor. That's all it has to do. Just exist.
Progress in computer science, and the absurd utility of data handling improvements to other fields, has created an extremely unrealistic expectation in the minds of computer tech adjacent professionals about how quickly technology can actually develop, because the core technology of computing (more transistors) had a long period of exponential improvement due to the nature of how it's produced (lithography on 2D surfaces). Feature size shrinks give you exponential improvements.
But that period is ending for computing: there's no more room left in the silicon chip regime to keep doing it much past 6nm. And progress in other fields enjoys no such advantage: cellular biology, for example, can't exponentially shrink the time it takes to grow a culture. Nor can it easily miniaturize; scaling biological research generally means scaling in size: facilities, space, grad students, everything. It is much more linear. There are very few easy wins beyond what improved informatics handling gave us (and where we did get big wins, it was by the same mechanism: the availability of cheap silicon handling led to a rush to find ways to capitalize on it).
There's a reason computing is powering better surveillance, but brain-computer interface technology is moving exactly as fast as there are people working in the field, doing experiments, developing technology and doing slow, uncool research like patch-clamping single cells or whatever. Because the fundamentals are diverse, and there is no one single technical improvement which gives you a big scaling improvement (and everyone is hoping they'll find one).
Also Blue Origin actually did propulsive sub-orbital landing from space to Earth before SpaceX, although admittedly from a lower altitude and velocity. Both came after the 1990s era Delta Clipper. Before that there were probably a lot of Apollo era test vehicles.
Each one got a bit higher and faster. This is the value SpaceX offers. They're doing the hard work of making things we know are possible also commercially viable.
They plan to land the 2nd stage of starship, which will have reached orbital speeds. But that is a huge challenge that they have not attempted yet.
The first ones that did, by the way, are the Soviets in the early 70s.
If you mean space probes it’s actually more impressive, the Soviets landed a probe on the moon in 1966: https://en.m.wikipedia.org/wiki/Luna_9
Also since we are already speaking extra terrestrially Apollo 11 landed propulsively in 1969, with humans on board and it got back to orbit!
Compared to space tech, fusion science is at the stage of 1929 sci-fi drama "Woman in the moon".
reusable rockets were uniformly assumed to be a fool's errand.
The Space Shuttle was already partially reusable and operational, just like the Falcon 9 is; it just wasn't economical.
SpaceX doesn't have a fully reusable vehicle yet, and there was a program to develop a fully reusable replacement for the Space Shuttle.
If Elon was laughed out of somewhere, that's to do with Elon's credibility at the time.
No, actually the PR was that it was reusable, but the design was not. Each shuttle had to be remanufactured after use, taking up to a year and costing $1 billion.
But it gets better. They had "lost tiles" problems on the very first flight, and never bothered to do anything about it.
What a piece of junk.
That was the second total loss of crew in its ~20 year history.
We also knew that landing rockets was possible, and that closed-cycle combustion engines were possible; the physics said so, and we even had prototypes that could do it. But with SpaceX, it is not the science that's the issue, or even one-off engineering feats, but practical, economical, scalable production.
With fusion, we've already achieved confinement times and burns that show we can make it work. But making it work in a prototype reactor is a far cry from making it work in a commercial reactor, which needs to run every day and be maintained. ITER's been going for 13 years now, and I don't think they're solving any of the problems that SPARC will solve.
Yes, tokamak prototypes have provided the insight to help nail down the plasma physics, but small-scale fusion prototypes have helped do that with much faster turnaround than $20+ billion mega projects.
What SPARC is saying is that small is beautiful: because fusion power scales with the fourth power of the B-field, if you focus on stronger magnets you can dramatically reduce both the size and the cost of your fusion reactors.
That is, if you double the strength of your B-field, you shrink the size of the machine needed by roughly a factor of 4. So clearly, instead of focusing on ITER, you should focus on the economics and scaling of REBCO manufacturing, and on whatever increases in magnetic field you can eke out, because they pay huge dividends.
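The scaling claim can be sketched under the common rule of thumb that fusion power goes roughly as P ∝ B⁴·V at fixed plasma beta and temperature. The exponent is the assumption here, and how big the "shrink factor" is depends on whether you count volume or linear size:

```python
# Sketch of the B^4 scaling argument. Assumption (not a derived result):
# fusion power P ~ B^4 * V at fixed plasma beta and temperature.

def relative_volume(b_ratio):
    """Plasma volume needed for the same fusion power if B is scaled by b_ratio."""
    return 1.0 / b_ratio ** 4

def relative_linear_size(b_ratio):
    """Corresponding linear dimension (cube root of the volume ratio)."""
    return relative_volume(b_ratio) ** (1.0 / 3.0)

# Doubling the field strength:
print(relative_volume(2.0))        # 0.0625 -> 1/16 of the volume
print(relative_linear_size(2.0))   # ~0.40  -> linear size down ~2.5x
```

Under this assumption, doubling B cuts the required plasma volume by 16x and the linear dimensions by roughly 2.5x, which is the same qualitative argument as above.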
In other words, we are no longer at the von Braun stage of rockets, or even the Apollo stage; we're at the SpaceX stage of dramatically reducing manufacturing costs and increasing use.
Likewise, I think fusion is poised to move past the von Braun stage of "is this possible?" to "can we build a real reactor?", as long as people can let go of huge super-projects and we can get funding for a bunch of innovative startups in the space to try various techniques and designs: engineering small, cheap, and fast.
Likewise, for thorium reactors: we already know molten salt reactors work, we already know the thorium cycle works, and we even know a commercial plant can run, as Germany operated one at one point. There are oodles of startups now vying to build small-scale thorium reactors. The role the government could play here, like with NASA, is that the DOE could provide capital to reduce risk and get some of the more promising ventures off the ground.
Most importantly, they need to be allowed to fail.
The US military expenditure for the fiscal year is $934 billion, and that is not even the actual spend. ITER, over its entire 20-30 year history, is projected to cost about $23 billion USD. So roughly $1 billion or less per year.
By comparison, the US military budget between FY 2019 and FY 2020 increased by $63 billion USD in the proposed budget alone (i.e. the actual spend was more).
So the US, somehow, between 2019 and 2020, found the money for another 3 entire ITER projects. It could just kick off another three 20-30 year projects, fully funded.
Uber has raised $24.5 billion USD to date. A company whose principal asset is an algorithm that dispatches taxis raised enough money to literally fund its own ITER project.
So "At what cost" - what are these startups needing to be budgeted for, that are going to be so much cheaper, that they are unable to get any of this money? Set aside the actual defense spend of the US - but I mean, how can you, the US runs nuclear powered carriers, and would have infinite utility for a compact fusion power plant - how is it that ITER is always the problem here, and not that this notion of "we need startups" might actually be running into funding issues because on closer examination they're not actually as cheap, the data is not as convincing, or they're actually about to ask for just as much money.
So again, how much money do these alleged startups need? Why is it apparently so hard for them to get it, when the total cost of ITER is thrown around by venture capitalists pretty regularly and the prize is dominance of the next several centuries of power generation technology?
No one claimed ITER is consuming their budget. If ITER is consuming anything, it's people: actual scientists in academia working on it instead of on smaller-scale projects. It's an opportunity cost.
Look, even if ITER is a success, it still won't produce electricity; that would require DEMO, a follow-on project. We're talking post-2035 just for the first deuterium-tritium operation. Now, do you think that by 2035+, which is 15 years from now, commercial fusion reactors are going to run off niobium-tin cooled with liquid helium?
Or do you think commercially available REBCO, which needs only liquid nitrogen and whose superconductivity doesn't fall off as field strength increases, will be used? By 2035, whatever DEMO inherits of the ITER design will be 30 years old.
Meanwhile, hundreds of smaller-scale fusion devices could have been built, with equivalent field strengths, at far, far less cost.
Would you rather run one massive international experiment with dozens of bureaucracies, or hundreds of smaller experiments, collecting a lot of data from many more designs in the meantime?
We don't need to kill ITER to fund more startup fusion research in this country. ARPA-E funded Solyndra and Tesla. Solyndra failed; Tesla succeeded, paid the money back, and created a new industry-leading car company with a market capitalization larger than all US car companies combined. I'd call ARPA-E and DOE's investment portfolio HUGELY successful. If they were a Silicon Valley VC, all the other VCs would be envious.
There definitely wasn't one 10 years ago, nor were superconducting tapes expected to become viable.
So what exactly do you think anyone would have been building if we had been investing in fusion 10 years ago? What would "hundreds" of small-scale devices be if we tried to build them today, with a technology that is being kept proprietary and thus unavailable for use?
Again: we have a whole bunch of small-scale fusion devices; this is what the article here is talking about! And until very recently all of them suggested the same thing: you need an ITER-sized reactor to get suitable mean free paths, and Q is going to be all about the dynamic manipulation of plasma at scale (which is what ITER is set up to study).
Imagine if the Manhattan Project had decided to build ICBMs and hydrogen bombs before it had perfected enrichment. Or if the government had decided to fund a full-scale model of the National Aero-Space Plane (NASP, remember that?) before figuring out how to make a scramjet, store slush hydrogen properly, or deal with hypersonic heating.
Not to mention, this was purely for power transmission: no one had any idea whether they could hold up to the mechanical forces of high-power magnets. Worse, they had a serious problem in that high-temperature superconductors would very suddenly stop superconducting if the impinging magnetic field exceeded a critical threshold.
All this was true right up to 2010, and is still somewhat true today, because again, there is no scaled-up artifact proving it's actually possible to use.
Everyone who wanted to work on high-temperature superconductivity was doing so; it wasn't as if the field was struggling. And the story at the time, and in fact still today, was that it wasn't reliable enough, and that for ITER it would definitely be the wrong technology choice for its goals, which were to build a facility handling large volumes of plasma in a plausible fusion generator environment.
The 5-year timelines being thrown around like gospel here sound utterly absurd in the face of that. A damn CRUD web app takes a year to build, but an experimental fusion plant made of essentially nothing but bespoke parts will be done in 5? Cheaply? No.
> The innovation rate in Fusion is way too slow, we need to build and fail a lot more rather than sink $22 billion into multi-decade projects like ITER.
We don't need fusion for climate change; fission will do, and we have working reactor designs already. I don't mean that we shouldn't spend some money on fusion research, just that it isn't reasonable to spend 100% of GDP on it (or any significant fraction thereof).
I wish it were the case, but I think we have seen that it is politically impossible.
Now would it be better enough to get out of the same political/cultural hole that fission is in?
I don't know, but it at least has a better chance.
I think the advantage of that is, if he did, and fusion energy powered a large portion of the Supercharger network, it would amuse him enough to consider putting his considerable wealth to that end.
He really is like a real-life Lex Luthor, and when the only way he can one-up his rivals is to make them dependent on him at royalties +15% (because he has dominant market share, and because he quite honestly 'just could'), it would be very fitting after having been outdone in the space industry. And since SolarCity is probably the only Musk corp he can actually displace, that's where he'd probably get the most bang for his buck and kick-start the 'battle of futuristic futures' between the two richest people on Earth that we were supposed to have already.
Bezos has also made lots of platitudes about wanting to help make the entire Earth a nature preserve: so in case you're reading this (or in case the Saudis, who have root access to his hacked phone, are), why the hell not, Jeff?
And to be honest, if they demonstrate the reactor to work, I'm sure others will figure out ways to produce REBCO without violating the patent, or would outright give the patent the finger.
Here is the competition, btw. Paz-Soldan’s technique seems sound and should be funded for prototype coils. I’m not saying MIT shouldn’t be given grants to do this work. I just wish that they wouldn’t use public funds to develop proprietary technology.
Famous last words...
>we need to build and fail a lot more
You know, that actually describes the early years of fusion research. People thought it would be easier than it has turned out to be. There are lots of gremlins...
I will also note that SPARC is a tokamak, the same as ITER. It's decades of steady tokamak research that has gotten us to this point.
It's almost sure to work assuming the magnet works. Given that they are still developing the magnet and haven't finalized the design yet, it seems crazy ambitious to me to get to SPARC in 5 years.
Anyway, assuming your 4000 ton figure is correct, that works out to 125 kW/ton. For comparison, a quick googling suggests a wind power plant is around 3 kW/ton, excluding the rather massive foundation of the tower.
https://arxiv.org/pdf/1409.3540.pdf (page 30)
The net output of ARC in that design is 190 MW(e), which comes to 26 kW(e)/tonne.
It doesn't include anything outside the reactor, which would likely increase the weight by at least an order of magnitude. It doesn't include (for example) the cryo plant, tritium processing, heat exchangers, turbines, cooling system, the BUILDING, and facilities for installation and disposal of vacuum vessels.
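The per-tonne arithmetic can be cross-checked using only the numbers already quoted in this thread (the 500 MW / 4000 t comparison upthread, and the ARC paper's 190 MW(e) net at 26 kW(e)/tonne):

```python
# Cross-check of the specific-power figures quoted in this thread.
destroyer_kw_per_ton = 500_000 / 4_000        # 125 kW/ton, as computed upthread
wind_kw_per_ton = 3                           # rough figure quoted upthread

arc_net_kw = 190_000
arc_kw_per_tonne = 26
arc_implied_mass_t = arc_net_kw / arc_kw_per_tonne   # reactor mass implied by the paper

print(destroyer_kw_per_ton)       # 125.0
print(round(arc_implied_mass_t))  # 7308 -- i.e. ~7,300 tonnes for the reactor alone
```

So the reactor alone, before adding the cryo plant, turbines, and building, already sits well below the 125 kW/ton figure assumed earlier.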
> https://arxiv.org/pdf/1409.3540.pdf (page 30)
Thanks. As I hadn't seen that paper, I used the Fletcher-class destroyer, the most numerous class the US built, which was slightly on the larger side at about 2000 tons; common destroyers were around 1500 tons IIRC. I also assumed, wrongly it seems, that the 500 MW was electrical power.
Anyway, if we're really interested in the material requirements of various power plant types, one can check e.g. the DOE Quadrennial Energy Review 2015, Table 10.4 on page 390. One can see that fission does pretty well, about an order of magnitude better than wind (with solar somewhat worse than wind). I would guess fusion wouldn't do as well as fission, but it shouldn't be a huge difference either, considering that apart from the reactor itself it should be pretty similar.
The Hoover Dam weighs like 120 battleships.
It's just the wrong metric to measure by.
But it's also like an Apollo program for fusion installations. It doesn't have to be economical, it just has to work.
"The basic approach being taken by Tokamak Energy is simple—to combine the two aforementioned technologies to unlock the potential of fusion power in more compact devices, and to do whatever development, testing and iteration are necessary on smaller machines in order to reduce construction timescales and to progress faster."
As of January 2020, Tokamak Energy had raised a total of £117 million (~$156 million).
I read that SpaceX has raised $5.4 billion. That's the limiting factor.
SpaceX took 6 years to get into orbit (and was not the first by decades), and you think they could show energy-gain fusion in 5?
I'm starting to think real progress on fusion is suppressed because there might be a risk lurking there.
But I don't believe this is the showstopper that makes progress slow. To me it feels more like people know there is an insurmountable technical problem ahead (like how to convert the energy to electricity efficiently enough; I genuinely wonder how that is supposed to scale) and the people involved do not want to reach that point.
I'm not saying it is impossible, but they won't be working on it until ITER's successor sometime in the 2050s.
For one, fusion reactions generate neutrons that activate all the structural material of the reactor, creating radioactive waste. The waste is shorter lived than fission waste but is still a risk.
A more obvious risk is the pure destructive power of that much energy. High pressure, 100 million degree plasma is full of risks. It's nice that fusion reactors can't "run away" like fission does, but you still have to deal with material 6x hotter than the sun escaping confinement.
The specific heat of Hydrogen gas is roughly 20 J/gC at high temperatures. It would be much higher for a plasma but I couldn't easily find a number.
At 100M degrees C, a single gram of hydrogen would have over 2 Gigajoules of thermal energy. 2 grams of hydrogen would have the same energy as a literal ton of TNT.
It's not untenable, but it's not really something that should be brushed off as "it's only a couple grams of fuel" or "it can't have a runaway reaction".
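The arithmetic above checks out as an order-of-magnitude estimate. A sketch using the gas-phase specific heat quoted in the comment (the true heat capacity of a plasma would differ, so treat the result as rough):

```python
# Rough thermal-energy estimate for hot hydrogen, using the ~20 J/(g*C)
# specific heat quoted above. Ignores ionization energy and the fact that
# a plasma's heat capacity differs from the gas value: order of magnitude
# only.

SPECIFIC_HEAT_H2 = 20.0  # J/(g*C), hydrogen gas at high temperature
TON_TNT = 4.184e9        # J, standard definition of a ton of TNT

def thermal_energy_joules(grams, delta_t_celsius):
    return grams * SPECIFIC_HEAT_H2 * delta_t_celsius

e_one_gram = thermal_energy_joules(1, 100e6)   # 2e9 J = 2 GJ
e_two_grams = thermal_energy_joules(2, 100e6)  # 4e9 J, ~1 ton of TNT

print(f"1 g: {e_one_gram/1e9:.1f} GJ, 2 g: {e_two_grams/TON_TNT:.2f} tons TNT")
```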
That plasma is at less than atmospheric pressure, and is contained in a massive steel vessel. Even if it expands and melts through the vessel, atmospheric pressure will collapse it. I do not see a plausible scenario where anyone is injured.
Relevant discussion on HN: https://news.ycombinator.com/item?id=20782346
So, forgive me for not feeling any excitement anymore in my life about fusion. For me, fusion is just as meaningless as a flying skyscraper. I've yet to see anyone present a good rebuttal of that 1983 paper.
There is also this paper from 2018 which shows that fusion will have higher capital cost than anything else.
(The abstract reads like it was produced by fusion enthusiasts. They were using historical figures for PV and for on-shore and off-shore wind, and using the LCOE method which IIRC uses "overnight construction" assumptions.)
There is also the point that a higher investment hurdle rate is appropriate for fusion as it's an untried risky technology.
Another paper points out that fusion economics are particularly sensitive to unplanned outages. (More modular and dispersed generation methods are of course less sensitive than gas or coal to outages, e.g. plane crash.)
So, fusion had its chance. Now we have cheaper alternatives, which are getting cheaper still.
1. https://www.sciencedirect.com/science/article/pii/S036054421... Approximation of the economy of fusion energy
From the abstract of "Capacitor bank for Fast Discharge Unit of ITER facility":
> In case of failure, in particular quench in a superconducting coil, rather fast and safe energy discharge from the magnetic system is provided by FDU, which breaks the coil supply loop and provides energy dissipation in high energy resistors. The ITER FDU should interrupt currents up to 70 kA with a voltage up to 10 kV.
There are also some slides on the web from CERN about quench protection at the LHC, with photos and diagrams.
The podcast also has several episodes on fusion: "157 – Fusion at ITER" and "312 – The Wendelstein 7-X Fusion Experiment" (and more linked in the descriptions of those episodes).
On a very separate topic, it also has a good one about weather forecasting: "326 – Weather Forecasting at the ECMWF".
This sentence is incorrect; I guess they meant at 100M degrees, because some older tokamaks did run for longer than 10 seconds at lower temperatures.
Like the Chinese one that ran for 100 seconds at 50M degrees, but 100M degrees for a short time. https://en.m.wikipedia.org/wiki/Experimental_Advanced_Superc...
2025 is a surprise target as it is only 5 years away.
Also, if fusion reactors become more viable than fission reactors, then we could put a global ban on all fissile materials. They would serve no purpose except for weapons.
The largest non-thermonuclear bomb was only 500 kilotons yield.
The largest thermonuclear bomb was 50 megatons. 100x.
It has analogies with static electricity. Static electricity might have a voltage of tens of thousands of volts, but it does little damage when you touch someone as the number of electrons (current) is so small that not much energy is involved.
Similarly with a plasma. The temperature (energy per particle) is very high but the number of particles is small, so the total energy is low. Touching something will cause the energy to leak away (cf. static electricity discharge) converting the hot plasma back to cool gas.
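The analogy can be made quantitative with the ideal-gas estimate E = (3/2)·N·k_B·T. A sketch for an assumed 1 g of fully ionized deuterium at 100 million K (real tokamaks hold even less fuel at a time, so this is an upper-end illustration, not a figure for any specific machine):

```python
# Back-of-envelope total energy of a small amount of plasma, treated as
# an ideal gas: E = (3/2) * N * k_B * T. Assumed: 1 g of deuterium,
# fully ionized (each atom gives one ion plus one electron), at 1e8 K.

K_B = 1.380649e-23   # J/K, Boltzmann constant
N_A = 6.02214076e23  # particles per mole, Avogadro's number

grams = 1.0
molar_mass_d = 2.014          # g/mol for deuterium
ions = grams / molar_mass_d * N_A
particles = 2 * ions          # ions plus electrons

energy = 1.5 * particles * K_B * 1e8  # joules
print(f"{energy/1e9:.1f} GJ")         # roughly 1 GJ for a single gram
```

So the per-particle energy is enormous, but with so few particles the total is comparable to a tank of gasoline, not a bomb; and it leaks away the instant the plasma touches anything.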
And details about the magnets: https://www.osti.gov/etdeweb/servlets/purl/20261539
> The vertical load on coil system is estimated to be 330 tons for the coil weight and 320 tons for vertical disruption load.
With some incredible forces that can be generated by the plasma.
> The lateral loads in tokamak could be generated by plasma disruptions and localized halo current flows through the plasma facing components (PFC) or vacuum vessel. The estimated peak lateral load in KSTAR is about 1.3 MN
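For a sense of scale, that 1.3 MN peak load converts to an equivalent weight of roughly 130 tonnes (a one-line sketch, taking g ≈ 9.81 m/s²):

```python
# Convert the quoted 1.3 MN peak lateral load into an equivalent weight,
# purely for intuition about the scale of the force.

G = 9.81         # m/s^2, standard gravity
force_n = 1.3e6  # newtons, peak lateral load quoted for KSTAR

equivalent_tonnes = force_n / G / 1000
print(f"~{equivalent_tonnes:.0f} tonnes-force")  # about 133 tonnes
```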
If it didn't (hypothetically), I imagine it would do some pretty severe damage to the reactor parts specifically. At some point, the vacuum would be broken, air would rush in, and the reaction would _definitely_ stop.
Most of the larger fusors (as far as I'm aware) seem to be refinements of the basic Farnsworth-Hirsch fusor design (https://en.wikipedia.org/wiki/Fusor), which is a relatively simple device. If you have the motivation, you could build one at home with common-ish tools.
We don't really know how to build this part of the reactor right now, but some metals have the ability to self-heal, and people think it should be possible to build such a shield with a decent lifespan. This is a part of fusion reactors that could be researched right now, in parallel with other problems, but there has not yet been funding to build a high-level neutron source facility to do the research. This is one fact of many that shows, I think, that national governments are not really interested in fusion energy succeeding.
I work on a research machine that uses copper coils. It uses 16x 1000 HP train traction motors with 2000 lb flywheels for energy storage to power the confinement field for one second. :)
JET and DIII-D are the largest magnetic confinement fusion research devices made so far, and they both use copper coils. They have confining fields of 3.45 T and 2.2 T, respectively. My machine operates at a modest 1 T (for now) and is much smaller. I'm sure the electricity bill that the confining coils on those large machines rack up is substantial.
Besides creating the pressure, the purpose of the magnets is to keep the _few grams_ of plasma away from the walls of the otherwise evacuated chamber.
Look up magnetic levitation, where you keep an object up using just a magnet. Now imagine huge magnets that push against an object from all sides, creating a ton of pressure. (Don't forget to include a whole bunch of fancy math to prevent the object from slipping away like a wet bar of soap you squeeze with your hands.) At no point should the object touch any of the magnets.
Also keep in mind that when you (don't) want to heat something else up, it's not the temperature that matters but the total energy of the object. And the total energy is basically mass × specific heat × temperature difference. So, for the same material, 1 gram at a million degrees carries about as much heat as 1 kg at a thousand degrees, which is a lot less scary.