Hacker News
Helion Energy achieves fusion milestone (eetimes.com)
71 points by danboarder 12 days ago | 80 comments

The only fusion project with a chance of producing excess heat at Q>2 within a decade is MIT's SPARC, which uses proven plasma physics and scales size and cost down dramatically with high-field HTS magnets (think ITER, but sooner and >10x cheaper). Why this definite solution to the climate apocalypse is being developed within the framework of an MIT startup accelerator rather than a Manhattan Project, while most of the publicity goes to unproven designs that are orders of magnitude away from Q>1, is beyond me.

MIT SPARC overview presentation/recent progress: https://www.youtube.com/watch?v=h8uYNhevRtk

Journal of Plasma Physics issue with several papers about it: https://www.cambridge.org/core/journals/journal-of-plasma-ph...

Definite solution? It's a definite non-solution, even if the plasma physics is more nailed down. The ARC reactor (fully scaled up SPARC with tritium breeding blanket) would have a power density 40x worse than a PWR primary reactor vessel, and supplying the world's primary energy demand with them would require 100x more beryllium than the USGS estimated resource (not reserve) of that element.

Could they replace the beryllium with lead? It multiplies neutrons the same way; last I saw, that's what General Fusion was planning to use.

Molten metal flowing past metal structures in their high magnetic field would be a non-starter, I think, due to induced currents and JxB forces.

Does the metal have to flow? Let it sit there and run cooling pipes through it. Every now and then turn off the fusion when you need to fire up the pumps and swap in new lead/lithium.

(Also, I'm dumb but beryllium is also a metal, how does it differ from lead in this respect?)

The ARC design immerses the vacuum vessel in a bath of molten salt (which is where the Be is, in lithium beryllium fluoride (FLiBe) salt). That salt is where the neutrons deposit their heat. Replacing the Be with lead means the heat is getting deposited in that lead (or, more likely, molten lead-lithium alloy).

Even though ARC uses salt, it would also have to worry about voltages induced by flow across magnetic field lines -- not because of currents, but because if the voltage becomes high enough it can induce electrochemical reactions, like production of elemental fluorine (or corrosion of metal where the fluorine would have been evolved.) I think they keep the velocity x coolant channel diameter low enough to avoid that, but it's still a consideration they have to address.
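Order of magnitude, the motional EMF here is just V = u*B*d across a channel. A quick sketch, with all numbers being illustrative assumptions rather than ARC design values:

```python
# Motional EMF across a coolant channel: a conductor (here, molten
# salt) moving at speed u through field B develops roughly V = u*B*d
# across a channel of width d. All numbers below are illustrative
# assumptions, not ARC design values.

u = 0.2   # flow speed, m/s (assumed)
B = 9.0   # field strength, T (assumed, high-field tokamak scale)
d = 0.02  # coolant channel diameter, m (assumed)

v_induced = u * B * d
print(f"induced voltage ~ {v_induced * 1000:.0f} mV")
```

At these assumed values the result is tens of millivolts, comfortably below the volt-scale potentials needed to drive fluoride electrochemistry, which is presumably why keeping velocity times channel diameter small works.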

Not sure why lower power density might be a show-stopper here. Beryllium angle is interesting, haven't thought about that

The cost of the reactor will be proportional to its size, so the cost/power will be inversely proportional to the power density. Lawrence Lidsky (who was also at MIT) (and also a similar argument from Pfirsch and Schmitter in Germany) famously pointed out back in the 1980s that DT fusion reactors will inherently have terrible power density compared to fission reactors, and this will render them noncompetitive. Despite putative rebuttals at the time, nothing we've seen since contradicts their devastating argument.
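Lidsky's cost argument in miniature: if reactor cost tracks volume, then for a fixed output the cost per watt tracks 1/(power density). A toy calculation where every number is illustrative, not taken from any design study:

```python
# If reactor cost scales with reactor volume, cost per watt scales
# inversely with volumetric power density. All numbers are made-up
# illustrations, not real design-study values.

def cost_per_watt(power_density_mw_m3, cost_per_m3):
    """$/W for a reactor whose volume costs cost_per_m3 dollars."""
    return cost_per_m3 / (power_density_mw_m3 * 1e6)

cost_m3 = 1e6        # assumed $/m^3 of reactor volume
pwr_fission = 20.0   # MW/m^3, assumed fission-vessel-scale density
pwr_fusion = 0.5     # MW/m^3, assumed DT-tokamak-scale density

ratio = cost_per_watt(pwr_fusion, cost_m3) / cost_per_watt(pwr_fission, cost_m3)
print(f"fusion $/W is {ratio:.0f}x the fission $/W under these assumptions")
```

The point is structural: whatever the absolute costs, a 40x power-density deficit becomes a 40x cost-per-watt deficit under the cost-proportional-to-volume assumption.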



Note that Helion wants to go with D-3He, and use direct conversion for at least some of the energy recovery. This might be the only hope for making fusion compete. But of course you need 3He; making it with DD fusion requires even more aggressive plasma physics.
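For reference, the per-reaction energetics behind this point (standard nuclear data): D-3He puts essentially all of its energy into charged products, which is what makes direct conversion attractive, while the D-D branch that breeds 3He sends most of its energy out in a neutron.

```python
# Energy release per fusion reaction in MeV (standard nuclear data).
# "charged" is the energy carried by charged products, which direct
# conversion can recover; the remainder goes to neutrons.
reactions = {
    "D+T -> He4+n":   {"total": 17.6, "charged": 3.5},   # neutron: 14.1 MeV
    "D+He3 -> He4+p": {"total": 18.3, "charged": 18.3},  # fully charged
    "D+D -> T+p":     {"total": 4.03, "charged": 4.03},  # fully charged
    "D+D -> He3+n":   {"total": 3.27, "charged": 0.82},  # neutron: 2.45 MeV
}

charged_fraction = {k: v["charged"] / v["total"] for k, v in reactions.items()}
for name, frac in charged_fraction.items():
    print(f"{name}: {frac:.0%} of energy in charged products")
```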

ARC at least isn't quite as absurd as a tokamak the size of ITER, which has a gross fusion power density another order of magnitude lower.

From Lidsky:

>Fusion will almost certainly have a lower power density than fission and therefore will require a larger plant to produce the same output. Suppose a fusion plant had to be ten times as big -- and therefore likely ten times as costly -- as a present-day fission plant to produce the same amount of power.

Fission is currently not cost-competitive due to the expense of ensuring that fission reactors do not pose an unacceptable risk of radioactive contamination in their vicinity. However, fusion is not subject to this constraint, or at least suffers from it much less. There are no long-lived radioactive byproducts, and judicious selection of the construction materials (already implemented) can ensure that neutron activation of the walls is not a problem either. Furthermore, the inherently unfavorable nature of fusion reactions means that criticality accidents ('meltdowns', à la Chernobyl) are not possible.

From Pfirsch and Schmitter:

>It is shown that the claims made therein for the economic prospects of pure fusion with tokamaks, when discussed on the basis of the present-day technology, do not stand up to critical examination.

The analysis in the full text relies on a variety of plasma parameters estimated from technology available in 1987. I cannot immediately determine whether it generalizes to designs using HTS, but the comments on pp. 1473-4 about achievable B field strengths and the corresponding betas suggest that it does not. Cf. this paragraph:

>Another possibility is to use higher magnetic fields: 6 T instead of 5 T would increase f_w to values between 1.3 and 2.0 MW/m2, which are still very low. The latter comes close to the value of 3 MW/m2 obtained in Sec. IV.A.1 from thermal wall load constraints. Higher fields would, of course, again increase the cost.

Overall I don't think that these links provide nearly as strong an argument as you suggest they do.

Fission plants are expensive for a couple of reasons. One is that they need additional layers of heat exchangers. Another is that their parts must be very reliable, to reduce the probability of serious accidents.

Fusion reactors will also require layers of heat exchangers, to isolate the tritium. They will also require very reliable parts: not because of public safety, but because fusion reactors will have so many parts in the hot area where hands-on maintenance is impossible. And this reliability will be expensive, even though the requirement for it is more to avoid a financial meltdown rather than a physical one.

> The analysis in the fulltext relies on a variety of plasma parameters estimated based on technology available in 1987.

The point of these arguments is that beyond a certain power, the plasma parameters become irrelevant. The limit is imposed by what the first wall can withstand, not what the plasma can put out.

If you look at areal power densities of fusion reactor concepts, the older studies had HIGHER areal power densities. But those higher power densities were found to be unrealistic.

Lidsky concluded DT fusion reactors would be an order of magnitude worse (in volumetric power density) compared to fission reactors. In this, he was being too generous: ARC is 40x worse than a PWR; ITER is 400x worse (and DEMO almost as bad).

The arguments there were farseeing, and experience since then has buttressed them, not contradicted them.
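For a sense of where ratios like 40x and 400x come from, here's a rough sketch using ITER's published headline numbers and an assumed PWR vessel volume. Note that it divides by plasma volume only, so it comes out less lopsided than ratios based on whole-device volume:

```python
# Gross volumetric power density from headline numbers. The ITER
# figures are published design values; the PWR vessel volume is my
# rough assumption. Whole-device-volume ratios are far more lopsided.

iter_fusion_mw = 500.0   # ITER design fusion power, MW
iter_plasma_m3 = 840.0   # ITER plasma volume, m^3
pwr_thermal_mw = 3400.0  # typical large PWR thermal power, MW
pwr_vessel_m3 = 200.0    # assumed primary vessel volume, m^3

iter_density = iter_fusion_mw / iter_plasma_m3  # ~0.6 MW/m^3
pwr_density = pwr_thermal_mw / pwr_vessel_m3    # ~17 MW/m^3
print(f"PWR/ITER power-density ratio ~ {pwr_density / iter_density:.0f}x")
```

Even on the plasma-volume basis most favorable to fusion, the fission vessel comes out more than an order of magnitude denser.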

>Fusion reactors will also require layers of heat exchangers, to isolate the tritium. They will also require very reliable parts: not because of public safety, but because fusion reactors will have so many parts in the hot area where hands-on maintenance is impossible. And this reliability will be expensive, even though the requirement for it is more to avoid a financial meltdown rather than a physical one.

All of this applies to fission. Radiation equipment is expensive, period. We pay four figures for a block of plastic. A very, very accurate piece of plastic. The components in a fission reactor are not easy to replace either; the cost of a fusion meltdown is the reactor, while the cost of a fission meltdown is the reactor + up to several square miles of the area around it, the latter being so large that we typically ignore the very expensive reactor cost!

But more simply, you're underestimating concrete. Fission facilities are critically dependent on the stuff, wall after wall, being the only material that can be assembled thick enough to guarantee the safety of radiation workers who sit in the plant all day. Lowering the intrinsic radiation burden reduces the use of concrete, which is one of the most expensive parts of nuclear plant construction.


While some of this applies to fusion reactors, it doesn't seem appropriate to compare only the power-generating components of fusion vs. fission reactors while ignoring the safety components when the primary advantage of fusion is safety. Regardless, I've made a note to read more about it.

>Lidsky concluded DT fusion reactors would be an order of magnitude worse (in volumetric power density) compared to fission reactors. In this, he was being too generous: ARC is 40x worse than a PWR; ITER is 400x worse (and DEMO almost as bad).

If the real power densities are available, arguments about the theoretical power density are irrelevant. I can probably build a warehouse ten times the size of a nuclear reactor for a tenth the cost of said reactor. ARC's true power density -- or that of any other reactor -- must obviously be factored into any cost projections. The power density of a particular design is not usually something you need to read a paper about!

A fission plant can be extremely cheap in certain environments. For example, in the cloud tops of Venus, Saturn, Uranus, Neptune, Ganymede, Titan, or Triton, a fission reactor is as simple as a big fabric tube suspended from a balloon, with a naked atomic pile hanging near the bottom, and a wind turbine at the top. Radiation is absorbed by the air inside the (sufficiently broad) tube, which rises through the turbine at the top. You could dispense with the balloon if you constrict the exit aperture just right.

Oops, not Ganymede or Triton. Not enough atmosphere.

But the planets beyond Jupiter have surprisingly gentle gravity.

The assumption that "a fusion plant had to be ten times as big -- and therefore likely ten times as costly -- as a present-day fission plant to produce the same amount of power" is unreasonable.

Fusion has a lower power density in the reactor itself than fission, but in terms of size, fission reactors are extremely tiny. Nuclear power plants are big because of the massive containment building around the reactor, plus the infrastructure for handling radioactive materials and generating power. Fusion plants don't need the giant containment building, and the various additional facilities would be nearly identical for the same level of power production.

Further, costs are not a simple function of size - things much bigger than fission reactors can be built for much cheaper; the problem is that the combination of safety regulations, delays due to public pushback, a lack of standardization, and the loss of a skilled workforce have skyrocketed the price of fission reactors far beyond what the simple engineering considerations would predict. Fusion does not carry fission's stigma, so it should suffer less from excessive regulations and NIMBYism, and engineers can take lessons learned from the history of fission reactors to design fusion reactors that are easily replicated.

If a fusion plant is just like a fission plant, but replaces the fission reactor with a fusion reactor, then it is entirely reasonable to compare the cost of the reactors.

That a fission reactor itself is a small part of the cost of a fission plant doesn't mean the same must be true of a fusion plant. And indeed, if you look at the cost of conceptual DT fusion power plants the reactor is a significant part of the total cost of the plant.

You are right that cost is not JUST a function of size. It's also a function of how exotic the materials are and how intricate the device is. By those metrics, fusion will do even worse. A fission reactor is a rather simple thing, in comparison.

Fusion's costs will be further increased by reliability concerns. The part of a nuclear plant that's too radioactive for hands-on maintenance must be extremely reliable. In a fission power plant, this part is rather small and simple. Multiple fuel rods in a fission plant can leak without necessarily shutting down the plant; a single leak of coolant into the vacuum vessel of a fusion plant will likely prevent it from operating.

Fusion power plants will almost certainly need containment buildings. The cryogens of ITER, for example, would (if fully vaporized) present a larger pressure x volume load than the steam from a fission reactor meltdown. Containing this gas is not cheap. In any case, tritium must be kept from leaking, which will imply expensive hermetically sealed buildings and seals (tritium will permeate through polymer seals.) Tritium will be everywhere inside the fusion reactor building. The tritium that cycles through a 1 GW(e) DT fusion reactor in 1 year is enough to contaminate two months of the flow of the Mississippi River above the legal limit for drinking water. Even small levels of leakage will be extremely vexing.

For ARC specifically, the magnets are shielded by titanium hydride. This material will fully decompose to titanium and hydrogen at the temperature of the molten salt, so it must be assumed that in a serious accident it will all decompose.

It's not unreasonable to compare them, but that comparison must be made in context: we're saying that something that makes up a very tiny part of the cost will be more expensive, while something that makes up a huge part of the cost will be dramatically less expensive.

Instead of a $100 million reactor, you're looking at $1 billion in reactor spending; but instead of a $4 billion plant for that reactor to go into, you're looking at a $2 billion plant.

A fusion plant would be comparable to a very expensive fission reactor in a world where people weren't afraid of fission plants, but in that world a fission plant would be dirt cheap. In the real world, fission is way more expensive than the engineering challenges would imply. It's not the materials or the containment that is expensive, it's having all of your assets sit idle for years on end while yet another environmental impact study is conducted.

Also, some of your assumptions are unreasonable. For example, the reason you need a containment building around a fission reactor is that you can't just vent to atmosphere, because the water contains large amounts of tritium. It's perfectly fine to vent helium to atmosphere in an emergency, as it's not radioactive. While a fusion reactor would use a lot of tritium over time, the amount present at any given moment is minuscule; it is actively generated on site, and if anything the major technical issue is not having enough. A reactor the size of ITER would have approximately 0.6 g of tritium in it at any given time, and losing all of that to atmosphere would be equivalent to approximately 2% of the annual tritium release from the La Hague reprocessing plant. Decomposition of titanium hydride at the temperature of the molten salt is slow; while obviously undesirable, there is no danger from the shielding decomposing. The real issue is quenching, which is one of the few genuine safety concerns of a fusion reactor.
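The 2% figure checks out on the back of an envelope. The specific activity of tritium is a standard value; the La Hague annual discharge number here is my assumed order of magnitude:

```python
# Activity of the in-vessel tritium inventory vs. an assumed
# order-of-magnitude figure for La Hague's annual tritium discharge.
# Specific activity of tritium is a standard value; the discharge
# figure is my assumption.

SPECIFIC_ACTIVITY_T3 = 3.57e14  # Bq per gram of tritium (~9,650 Ci/g)

inventory_g = 0.6               # in-vessel inventory from the comment
inventory_bq = inventory_g * SPECIFIC_ACTIVITY_T3
la_hague_annual_bq = 1e16       # assumed annual discharge, Bq

fraction = inventory_bq / la_hague_annual_bq
print(f"{inventory_bq:.2e} Bq, ~{fraction:.0%} of the assumed annual discharge")
```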

Wait, the idea that the cost of a power plant structure is proportional to its size seems very remarkable to me, even within a single technology like light water reactors or gas turbines. My understanding is that there's generally a most efficient size for a reactor or turbine because of non-linearities, and if you want to increase a plant's power generation, you replicate these most efficiently sized units rather than scaling them up.

The idea that you could apply the same linear scale to both fusion and fission reactors seems frankly incredible on the face of it. Do you have any details on why this should be so? I don't seem to have access to the second source you listed and the first just made this assertion without explanation. All this isn't to say that I'm sure fusion reactors would have to be less expensive per cubic meter than fission reactors, the opposite seems like it could be a possibility. It's just the idea that we should expect the price to be the same in both cases that I'm finding hard to swallow.
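One common non-linear rule of thumb is the "six-tenths rule" from process-plant cost estimation: cost scales roughly as size to the 0.6 power. The exponent varies by technology, so this is purely illustrative:

```python
# "Six-tenths rule" of process-plant cost estimation: cost scales
# roughly as size^0.6 rather than linearly. The exponent varies by
# technology, so this is illustrative only.

def scaled_cost(base_cost, size_ratio, exponent=0.6):
    """Cost of a plant size_ratio times bigger than the base plant."""
    return base_cost * size_ratio ** exponent

linear = scaled_cost(1.0, 10, exponent=1.0)  # linear scaling: 10x
rule = scaled_cost(1.0, 10)                  # six-tenths rule: ~4x

print(f"10x the size: {linear:.1f}x cost if linear, {rule:.1f}x by the rule")
```

Under that rule, a plant ten times as big costs about four times as much, not ten; which scaling applies to fusion vs. fission reactor structures is exactly the question under debate here.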

I think high-field HTS tokamaks' projected costs are in a reasonable range despite the raw power density being lower. There could be additional savings in radioactive waste processing, since fusion should generate less waste, and in containment/security for similar reasons.

Helion and the other new fusion projects are certainly interesting; however, tokamaks are much more ready and, with high-field magnets, likely economical.

I don't believe cost projections for fusion. If you look at them, they're filled with assumptions that aren't supported by much of anything(*), but magically make the technology just competitive. As the competition has improved, the assumptions have gotten more desperate. They're less "this is what the technology will cost" and more "this is the least ridiculous set of assumptions we could find that would let our technology not be dead."

If you apply the same level of assumptions to, say, light water fission reactors, I'm sure you'd get cost estimates vastly lower than what they actually cost in practice.

(*) For example, one paper assuming the efficiency of converting thermal energy to power in the fusion reactor is 60%, a level that combined cycle power plants achieve by expanding combustion gas that starts at a temperature that would soften or even melt the turbine blades.
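For context, the Carnot limits at the relevant temperatures. Both hot-side temperatures below are my assumptions, not figures from the paper in question:

```python
# Carnot efficiency at the temperatures in play. Both hot-side
# temperatures are illustrative assumptions.

def carnot(t_hot_k, t_cold_k=300.0):
    """Upper bound on thermal-to-work conversion efficiency."""
    return 1.0 - t_cold_k / t_hot_k

flibe_outlet_k = 900.0    # assumed molten-salt blanket outlet temperature
turbine_inlet_k = 1700.0  # assumed combined-cycle turbine inlet temperature

print(f"Carnot at {flibe_outlet_k:.0f} K:  {carnot(flibe_outlet_k):.0%}")
print(f"Carnot at {turbine_inlet_k:.0f} K: {carnot(turbine_inlet_k):.0%}")
```

Real plants achieve perhaps half to two-thirds of Carnot; reaching 60% from a 900 K source would mean operating at roughly 90% of the Carnot limit, which no practical heat engine approaches.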

I've seen some claims that even a magically costless heat-producing device connected to a steam turbine won't be competitive with solar PV, so I'd guess fusion would also fail that test if it's basically being used to generate heat. (Note: Helion claims to have new tech that avoids this problem, but we don't know whether it actually does.)

The real question is whether it'd be competitive with solar PV on cloudy winter days with enough battery to get through windless nights. If so, there's probably a place on the grid for it.

(In any case, Helion is planning direct energy conversion, without a heat cycle.)

At that point you're competing against energy storage, not renewables. No way you can afford to run any such power system just a few days a year.

It could all pencil out if you had some massive incentive for production on those specific days and some legally mandated quota for other times, but at that point you'd probably be beaten by a few factories saying they'll shut down for a few days if you gave them the same cash and/or renewables producers using the money to add storage.

Practically, it needs to run full out constantly and compete with an average of other power producer prices to be viable.

Energy storage is not all that cheap.

There's a reason I mentioned "cloudy winter days." The cheapest way to address those is probably to overbuild PV. That overcapacity probably won't be used on bright summer days, raising the capital cost of PV across the board.

I don't think you're going to find many factories that can be economically shut down for entire seasons. None of this is an issue right now because we use natural gas for backup, but we need to stop doing that.

The cheapest way to generate power on cloudy winter days is wind power, not overbuilt solar. Though, yes, overbuilding wind and solar is generally more cost effective than most other alternatives and the overbuilt wind and solar will both contribute power even when not working at their seasonal peak and provide abundant cheap power for storage when overproducing near their peaks.

I had assumed we were both taking that as the baseline alternative, since practically every country in the world is building that out right now; hence my suggestion that demand response would be a better choice than any plausible nuclear option to cover gaps in that provision due to unseasonal weather that is both less windy and more cloudy than predicted.

But we appear to be starting from radically different assumptions about what power grids will look like (and already look like today)

Yes, we have a lot of wind power, but that too is backed by fossil fuels.

Hydrogen burned in combustion turbines would likely be cheaper. For this use case minimizing capital cost is all important; the cost of the hydrogen itself much less so.

SPARC is great, but I'm under the impression that a gen-1 DT MCF reactor will almost certainly need to be a stellarator, to work around the engineering challenges and pulsed nature of tokamaks. Optimization, HTS magnets, and clever coil winding make stellarators feasible. Tokamaks are easier, though, so SPARC should certainly be built and make its splash.

The real problem is that DoE's Office of Science has been reducing funding for Fusion Energy Sciences in relative terms. It's barely enough to meet the US's ITER contribution. The very few existing projects are running on fumes. No one in the US is building a new machine, and hasn't been for over a decade.


People are making machines, it's just with private funding. This includes SPARC, which MIT spun off into Commonwealth Fusion Systems. As of a year ago they'd raised over $200M.


The context is US publicly funded projects.

The reason I used this context is because fusion is not profitable, won't be for at least 30 years in the optimistic estimates, and may very well never be profitable. It is exactly the kind of thing that should be public works.

Maybe it should, but since the government is not doing it, we're lucky that investors disagree with your assessment. CE, Tokamak Energy, Tri Alpha, General Fusion, and Helion have all gotten substantial private funding. Tri Alpha was over $700M last I checked. One of Helion's investors is YCombinator.

It's a matter of perspective and wager. I wager that the public image cost of failed startups leads to a reduced likelihood that fusion will be properly funded in the next 100 years. Fusion already has a public image deficit to overcome.

One could be optimistic and say the few potential successful startups such as SPARC or potentially successful moonshots such as Helion will lead to more private investors and/or public funding, but it's a community betting its public image when it's already down. I don't have a safer alternative to suggest.

Not just a public-image deficit. The $billions already poured down that rathole would take decades for the first fusion plant to pay back, if it had to, before ever achieving the break-even that actually counts. Especially so, when running it only at night after cloudy days when the much cheaper wind, solar, and storage flag.

If there was a definite solution it would be funded unless all the scientists involved lack communication skills to raise capital. VCs spend millions on apps that say ‘yo’…

I just finished reading The Wright Brothers by David McCullough. A fascinating aspect of the book is how no one expected two unknowns to solve a problem governments had been pouring money into [1] with no success. Every one of the high-profile players of the time was a failure.

You may be quick to point out that fusion is different from airplanes, requiring vastly more money and resources. Except that's also the argument people thought was true about airplanes in 1900! Anyway, all of this to say: if history is our guide, then I have my doubts about ITER and the rest of the high-profile projects. If fusion is ever to make headway, I think it will come from some little-known cheap operation in a way no one expects.

[1] https://en.wikipedia.org/wiki/Samuel_Langley#Aviation_work

Here are a couple of small fusion projects that get a decent amount of peer-reviewed publications and respect from other fusion researchers. They're both attempting boron fusion, which is almost completely aneutronic.

https://lppfusion.com/: literally a garage-size experiment. The reactor core is the size of a coffee can, and that's big enough for the production reactor. It's a "plasma focus," in which the plasma pinches itself into a tiny ball. The team is half a dozen people, and they have endless engineering issues that a bigger team would probably knock out a lot faster, but they keep at it and put out a detailed progress report every month or so. It's been pretty fun to watch over the years. They've gotten the plasma to boron fusion temperatures (a couple billion degrees C), but density is still a challenge.

https://hb11.energy/: A terawatt nanosecond laser hits a target that generates a 4000-tesla magnetic field. Then a petawatt picosecond laser blasts the fuel pellet. It's not like normal laser fusion that compresses a target; it just hits from one side, so hard that it shoves the nuclei into their neighbors behind them. In theory this can kick off boron fusion, and according to some experiments (not entirely by HB11's team) this actually works a lot better than expected due to a cascading reaction.

What's great about that one is that the petawatt lasers have been improving exponentially, at a faster rate than Moore's Law. They're just now getting to the point where theoretically they'd produce net power from boron fusion, with a large energy gain. The lasers aren't even that big or expensive, they're basically tabletop devices in a medium-size room, costing a few tens of millions.

HB11 Energy, the company, just got seed funding, but their lead researcher (Hora) has been talking about this for decades and wrote a book about it, and experiments have been done by various independent researchers around the world.

Thank you, I had not seen these before.

The simple "airplane" they flew was enormously less of a technical challenge than a fusion reactor.

The days of easy, made-in-the-garage inventions are way over. Nuclear fusion reactors will not be created in some garage.

We need serious government money funding many labs and groups or a massive Manhattan project.

We're talking about making a miniature star in a box.

Says someone who will never create an easy, made-in-the-garage invention.

Attitude is everything

Attitude is not everything. It matters, but there are material factors in the world that are hard constraints no matter how you approach the problem.

For example, you cannot simply create a fusion reactor by “really wanting it” and “being smart” and “thinking outside the box”. You have to have all of those traits, plus extensive knowledge of physics and access to rare, expensive materials/tools in order to make a solution due to the tremendous temperatures and complex reactions happening.

The Wright Brothers had comparatively few obstacles in terms of physics. They did not need to acquire a material able to withstand 100 million degrees of heat, for instance. They did not need to understand the mechanics of a fusion reaction, which is orders of magnitude more complex than the mechanics of the Wright Brothers’ aircraft in flight.

Many useful inventions can be created with few resources. Creativity matters. But acting like attitude and elbow grease is “everything” is naive, especially in the context of nuclear fusion research.

In fairness, resources have also scaled. What's a few hundred million when that much only buys a tenth of a Duolingo?

This is exactly why moonshot projects such as this exist. Even though toroidal MCF machines clearly are the most conservative choice and funding levels aren't nearly as high as is necessary to build a reactor in the near future, society says "what's a few million dollars to keep on turning over rocks? There might be a shiny coin under one."

The article uses degrees Celsius, feet, watts, nanometers, square feet, kiloelectronvolts, miles per hour, teslas ...

Aside from temperature, for which one could state the thermal energy instead, I would only object to the imperial units. All the others are needed to describe different physical properties and are part of, or derived from, the standard unit system ("SI units").

What about neither?

> 1 glass of [fuel] is enough to power a house for 826 years…

We talking 8 ounces? What if I'm extra thirsty?

To some people I know, a "glass of water" means a small glass about 200 ml, to others it's 500+ ml. Also, how much power does one "house" draw?

But it's moot, the point is not that you should take away "826 years" from the article, just "a very long time". Perhaps they should have written "hundreds of years" instead of creating a false sense of precision.

Also: we're talking imperial or US customary fluid ounces?

African or European swallows?

Then you can power a bigger house.

Strange idea: perhaps fusion reactors are not only a possible future energy source for Earth, but also a possible future method for transmuting elements into other elements.

Put a little bit of Element X (which is not one of the main, energy-generating reactants) into the fusion mix -- get Element Y out of the mix, Z minutes later...

Of course, some of the future challenges to this would include: How to get Element Y, which is now 100 million degrees Celsius, out of the mix, and cool it down...

But, we'll leave those challenges to future Scientists! <g>

Currently, the five main branches of Chemistry are: physical, analytical, biochemical, organic, and inorganic.

Well why not a sixth, future one?

Why not -- (wait for it!) -- "Fusion Reactor Chemistry" <g>


You know, dealing with all kinds of elemental transmutation (and other Chemistry/Chemical Reaction related subjects!) -- that in the future become possible in Fusion Reactors...

Solar power is already cheaper than coal. The sun is the ultimate fusion power source.

"We directly convert fusion energy into electricity" — how does that work?

Various ideas; I don't know what Helion is planning: https://en.wikipedia.org/wiki/Direct_energy_conversion

Touched on here: https://www.euro-fusion.org/news/alternative-fusion-concepts...

The diagram from Helion on that page makes it look basically like a two-stroke petrol engine: a magnetic field compresses the plasma, which then fuses, and the energy pushes back against the field, doing work. I'm not sure my intuitions about lawnmower engines really carry over to fusion reactors like that, though.

Well, of course they do. If 20th century SciFi literature has taught me anything then that there is nothing in high-energy physics that cannot be explained by a metaphor to some good old combustion device or, in exceptional cases, a hydraulic system.

Back when I was looking at fusors (I still have ideas, but no time), there was a suggestion of converting the high-energy charged particles from the fusion into electricity by decelerating them in an electric field and collecting the charge as the speed approached zero; the exact reverse of a particle accelerator putting energy in.
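The energy bookkeeping for that scheme is simple: a product of charge q climbing a retarding potential V gives up qV of kinetic energy. A sketch using the D-3He product energies (the gap voltage requirements are what they are; everything else about the geometry is left open):

```python
# A charged fusion product climbing a retarding potential V gives up
# q*V of kinetic energy, so the potential that just stops it (banking
# its energy electrostatically) is its kinetic energy in eV divided
# by its charge number.

def stopping_potential_mv(energy_mev, charge_number=1):
    """Retarding potential in megavolts that just stops the particle."""
    return energy_mev / charge_number

proton_mv = stopping_potential_mv(14.7)                  # D-He3 proton
alpha_mv = stopping_potential_mv(3.6, charge_number=2)   # D-He3 alpha

print(f"14.7 MeV proton: {proton_mv:.1f} MV")
print(f"3.6 MeV alpha:   {alpha_mv:.1f} MV")
```

So a single-stage converter for the D-3He proton needs to hold off nearly 15 MV, which is why multi-stage or time-varying-field schemes get proposed instead.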

Directly coupling the two processes like that would certainly have some interesting failure modes with regard to potential and phase. If this results in DC power distribution... elephants everywhere will rejoice as Westinghouse's reign of terror ends.

Power transfer from large-scale renewables projects has already precipitated HVDC [0].

There isn't an absolute AC/DC dichotomy as there was in Edison's feud. It all comes down to using the right solution for the right problem.

[0] https://en.wikipedia.org/wiki/High-voltage_direct_current#/m...

Well feeding an HVDC line with something that doesn't already naturally generate AC (like a wind turbine) would be a better fit. But it would be a little strange to not fully exploit the other thing that distinguished the DC system from AC: distributed generation vs centralized generation. What was once necessary due to transmission inefficiency is now desirable due to transmission inefficiency :) (among other things). I guess I'll have to wait a little longer for the breadbox sized fusion reactor, though I wouldn't turn up my nose to a residential fission option.

Yeah. This is something you can do only when the outputs of a fusion reaction are mostly charged particles moving quickly. If most of the energy comes out in neutrons, as in deuterium-tritium fusion which most fusion reactor attempts are using, you have to run the resulting energy through a heat engine to produce electricity.

Which is why Helion is going for direct conversion, given their mostly aneutronic design.

I don't think that idea can be made to practically work, due to space charge limits.

I don’t doubt something will cause problems, given this is 1-20 megavolts at (for a gigawatt reactor) 50-1000 amps, but I’m not sure what issue you’re describing here. Can you elaborate?
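
Those voltage and current figures are consistent with each other via P = VI; a quick check (my own arithmetic, assuming the 1 GW output mentioned above):

```python
P = 1e9  # assumed reactor output, watts (1 GW)

# For a fixed output power, choosing the deceleration potential
# fixes the collected beam current: I = P / V.
for V in (1e6, 20e6):   # deceleration potential, volts
    I = P / V           # required current, amps
    print(f"{V/1e6:4.0f} MV -> {I:6.0f} A")
```

Which reproduces the 50-1000 A range quoted for the 1-20 MV span.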

Space charge is the accumulated effect of the charge of ions or electrons altering the electric field. It limits current flow in vacuum tubes, and also limits the current density (and hence thrust density) of ion engines.
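
For a sense of scale, the Child-Langmuir law gives the space-charge-limited current density between parallel plates. The numbers below are my own illustration for protons (not figures from the thread, and a real converter geometry would differ):

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
Q_P = 1.602e-19    # proton charge, C
M_P = 1.673e-27    # proton mass, kg

def child_langmuir(V, d, q=Q_P, m=M_P):
    """Space-charge-limited current density (A/m^2) across a
    planar gap of d metres at potential difference V volts."""
    return (4 * EPS0 / 9) * math.sqrt(2 * q / m) * V**1.5 / d**2

# Protons across a 1 m gap at 1 MV: only tens of A/m^2 can flow
# before the accumulated charge collapses the decelerating field.
print(f"{child_langmuir(1e6, 1.0):.0f} A/m^2")
```

Tens of amps per square metre against the hundreds of amps a gigawatt-scale converter would need is the crux of the objection.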


> (I still have ideas, but no time),

You do have the money?

Easily; fusors are high-school-project cheap.

That's the headline figure, but you'd need about $10k of kit to put one together, and that's if you're handy at digging through trash.

> This releases energy, and the plasma expands, pushing back on the magnetic field. The change in field induces current and thus electricity to power electrical loads (Figure 1).

As I understand it, they inject two plasmoids into the central solenoid, then compress the merged plasma by increasing the magnetic field there. After a short burn, the magnetic field is reduced, with the (now hotter) plasma pushing against it. In this last step, energy can be withdrawn from the magnet and recovered.

The last step is the one I'm wondering about. You can't just put an electric cable near a 7T magnetic field.

I think you use an opening switch to divert the current in the magnet into an external load. And yes, all the conductors in the magnet experience strong forces; some 60% of the mass of ARC is the stainless-steel support needed to resist JxB forces.
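
Those forces are easy to ballpark from the magnetic energy density u = B²/2μ₀, which is numerically the same as the magnetic pressure on the conductors (my own arithmetic, illustrating the 7 T figure mentioned above):

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m
B = 7.0                    # field strength, tesla

# Energy density in J/m^3 equals pressure in Pa for a magnetic field.
u = B**2 / (2 * MU0)
print(f"{u/1e6:.1f} MJ/m^3, i.e. roughly {u/101325:.0f} atm of pressure")
```

Around 200 atmospheres pushing on every structure in the field goes a long way toward explaining why so much of the machine is steel.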

I hadn't heard about this company before. On one hand, it sounds more authentic than the outright cold-fusion scams infesting the "trendy tech" scene.

On the other hand, there are other things screaming fraud:

1. “We directly convert fusion energy into electricity, which means that we don’t require “ignition” and can produce net electricity at much lower net energy [Q] values. Our challenges now are primarily engineering challenges rather than science challenges.”

Bullshit; if they get net energy gain, then there must be self-sustaining fusion, even if only for a microsecond, even if they are using pulsed operation.

Second, a near-99%-certain "engineering solution" to fusion is ITER: the bigger you build a tokamak, the more stable the fusion plasma becomes. ITER is that "build it big enough" solution.

They don't address the unsolved science challenges facing every fusion reactor: materials science to withstand the neutron flux, removal of "nuclear ash," and a working long-term cooling solution that will not require an overhaul or coolant replacement every few days.

2. One of the key obstacles for many fusion power proposals is that using them will require upgrades to the existing power grid. Helion claims it is side-stepping this issue.

“Helion’s fusion electricity generators are compact, use small amounts of fuel, and can run 24/7. Therefore, one of the key benefits of Helion’s power facilities is that they can directly plug into existing transmission infrastructure and replace current fossil-fuel-based power generation without significant investment in additional infrastructure. Grid-level transmission infrastructure is a requirement of large-scale, gigawatt-class power associated with traditional fusion approaches,”

Bullshit; what would make it harder for a fusion reactor outputting steam to turn the same turbine as a fossil-fuel power plant?

On the other hand, they will require far more power-conversion equipment to produce the high-voltage three-phase AC that comes from regular generators.

If a company like this is challenged by such a triviality as electric power conversion, you really should question their engineering acumen.

4. “Fusion is an abundant source of zero-carbon baseload power, but unlike fission, fusion cannot produce a runaway chain reaction. If something goes wrong, fusion simply stops. Fusion also does not produce any long-term waste and cannot be weaponized,” said the Helion rep.

A thermal explosion of the reactor casing if the cooling channels are breached (or rot, or rust), with ejection of neutron-activated isotopes, may well be possible. Short-lived isotopes of light elements are more radioactive than heavier elements, predominantly non-metals, more volatile, and have higher biological affinity. Short-term waste may not be killing you over 100 years, but it may well kill you in a few days. This could be worse than a conventional reactor meltdown.

5. FRC devices confine plasma on closed magnetic field lines in the form of a self-stable torus. Together with the spheromak they are considered part of the compact toroid class of fusion devices. FRC devices normally have a plasma that is more elongated than spheromaks.

“An FRC is a stable, self-contained plasma which can be accelerated and super-heated to 100 M°C+. Further, FRCs are high Beta, which enables direct electricity recapture and require no particle beams, lasers, superconductors, or antimatter,” said the Helion rep.

Spheromaks were a research subject of the USA's and the USSR's national thermonuclear research programs for decades, and were independently concluded to be a dead-end development by both sides, despite the great enthusiasm about their inherent advantages.

High or low Beta will make little difference to the fact that they will invariably need to dissipate a double-digit percentage of the power output as heat. Having a few coils for direct energy capture would probably mean only a few percent difference to the net energy output, and I believe it will just make the whole thing more complicated and harder to control.

6. “Our facilities can operate continuously no matter the weather or time of day, which is ideal for baseload power. Moreover, they are compact and can produce 50 MW of power in a 20,000 square foot space.

Bullshit; try to put a conventional power plant capable of producing 50 MW of electricity into half an acre. Even if you remove the biggest part, the boiler, and the support equipment to run it, there is no chance you can physically fit the cooling stacks to dissipate around 150 MW of waste heat, plus the turbines, generators, transformers, offices, and water-treatment plant, into 0.5 acres.
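
The arithmetic behind that objection (my assumptions: the quoted 20,000 sq ft, and ~25% net conversion efficiency, which is what a 150 MW waste-heat figure for 50 MW of electricity implies):

```python
SQFT_PER_ACRE = 43_560

# The claimed footprint really is about half an acre.
area_acres = 20_000 / SQFT_PER_ACRE
print(f"{area_acres:.2f} acres")

p_electric = 50e6     # claimed electrical output, W
efficiency = 0.25     # assumed net thermal-to-electric efficiency
p_thermal = p_electric / efficiency
p_waste = p_thermal - p_electric
print(f"waste heat to dissipate: {p_waste/1e6:.0f} MW")
```

Note that the 150 MW figure only holds if the plant is a conventional ~25%-efficient thermal cycle; Helion's whole pitch is that direct conversion avoids most of that waste heat.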

7. “Modern gigahertz-class fiber optic triggering, monitoring, and field programmable gate array (FPGA) processing allow the reliable, synchronous, and efficient operation of Helion’s fusion system. At the generator scale, the ability to provide reliable and throttle-able power allows Helion to load-follow existing renewable generation, eliminating the need for new power management or storage technologies,”

This is pure technobabble. This man either doesn't know what he is talking about, or knows but pretends that he doesn't.

They do address the neutron flux, by not using D-T. Instead they plan a hybrid D-D/D-He3 reaction. Only 6% of the energy output will be neutron radiation, mostly the lower-energy neutrons of D-D reactions. Combined with direct energy conversion, I think that addresses all your objections other than the feasibility of the plasma physics, for which I guess they disagree with the government. Regarding your seventh point, I thought it was perfectly clear and not "technobabble" at all.

Most frauds don't take the trouble to actually build half a dozen reactors. Helion might fail, but they're clearly making a real attempt.

> Only 6% of the energy output will be neutron radiation,

Then you still have a few megawatts of neutrons! More than in the most power-dense fission reactors, but without water to protect the reactor parts.

> which will mostly be the lower-energy neutrons of D-D reactions.

2.45 MeV. Which may be better or worse from the capture and transmutation side.
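
That 2.45 MeV follows from momentum conservation in the D(d,n)He-3 branch: the reaction's Q value (~3.27 MeV) splits between the neutron and the He-3 in inverse proportion to their masses. A quick check with approximate mass numbers (my own sketch):

```python
Q_DD = 3.27   # MeV released in D + D -> He-3 + n
M_N = 1       # neutron mass number (approximate)
M_HE3 = 3     # helium-3 mass number (approximate)

# The lighter product carries the larger share of the energy.
e_neutron = Q_DD * M_HE3 / (M_N + M_HE3)
print(f"neutron energy: {e_neutron:.2f} MeV")
```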

On the bonus side, it's just enough to fission U-238! It may make sense just to run it to burn U-238, or possibly Th-232.

How fortunate they are that they do not need to impress you.

Considering that they have no need to dissipate 150 MW of waste heat, and no need for turbines, generators, offices or water treatment, what else do you imagine they need to fill up their land purchase with?

People investing millions of their own money are generally also able to afford to hire competent people to evaluate the wisdom of their choices. Assuming your off-the-cuff judgment is more competent than that of their employees, who spend months studying the plans in depth, is a losing game. Better, instead, to figure out where you are mistaken.

Contrast this with ITER, where everyone involved gains most by it never working, instead dragging out the proof over decades. With any luck, they can retire before the gravy train runs out of track.

> People investing millions of their own money are generally able to afford, also, to hire competent people to evaluate the wisdom of their choices.

I'd say they don't. I have never seen as many "technical analysts" failing at phys-101- and eng-101-level matters anywhere outside of the so-called "VC fund community."

Somebody who made as many trivial mistakes as you did should maybe think twice about disparaging others.

Anyway, the smart money buys smart analysis. Money that can't tell the difference does better aping the smart money; the losers are the ones who only think they can.

> Bullshit, what will make it harder for a fusion reactor outputting steam to just turn the same turbine as a fossil fuel power plant?

Nothing but economics, which means both are being replaced with cheap, distributed renewables. If your nuclear reactor produces steam, then it's already basically useless in terms of power generation, as it will cost more to run the steam turbine alone than to build renewables of the same capacity.

This idea is aiming to re-use the existing transmission lines of fossil generators, but not the steam generator itself, something that many renewable and battery projects already do to lower costs.

> If your nuclear reactor produces steam then it's already basically useless in terms of power generation as it will cost more to run the steam turbine alone than to build renewables of the same capacity.

Can you provide _any_ source for this?! You're saying geothermal and hydro (without the environmental issues) aren't competitive with other renewables? They turn turbines as well.

Geothermal isn't competitive with PV and wind. Neither is coal or nuclear. (Hydro can be used as storage, so it's a bit more complicated, but it's still basically not competitive.)

See any recent Lazard's Levelized Cost of Energy Analysis for full details.

For my specific claim about the cost of running just the steam part, look for the golden diamonds on the first chart here:


They mark the marginal cost of running a fully depreciated nuclear, coal, or gas plant. Note how you can build new wind and PV for the same price as running the existing turbine-based plants.

Those costs still include fuel but don't include building the structures. So even a magical, fuel-less heat source powering steam is still going to struggle as PV and wind continue to drop in price.

Geothermal steam turbines cost more to build and operate, but may operate (mainly) at times when solar and wind don't. Hydro turbines are much cheaper to maintain than steam turbines, although dams are expensive to build. Both compete with turbines driven by NG, likewise operated mostly when the much cheaper solar and wind are not producing enough. As storage is built out, demand for geo and NG declines. Their marginal cost grows as the fraction of time they spend idle increases, because their turbine maintenance cost is not strongly dependent on actual usage.
