Successful second round of fusion experiments with Wendelstein 7-X (mpg.de)
531 points by mrfusion 58 days ago | 215 comments

Experiments like Wendelstein remind me that the only thing holding back fusion power is money for research like this. Every year, we spend billions on fossil fuel exploration because the return on investment is quick - just think about all the wells going in to extract shale oil.

What if we collectively had the will to invest this much into fusion?

If the Stellarator, or any of its descendants, turn out to be the answer to practical fusion, it will have been contingent on vast quantities of computational power being available cheaply. It does not matter how much money you poured into your Stellarator project in 1970; you could never have built the 7-X, because there would have been no way to produce that design in the first place.

Personally, I think one of the answers to "tech seems to have stalled" is that it's easy to underestimate how important cheap computational power is, even without "AI", to moving forward. For another example, as fantastic as the 1960s space projects may have been, I think they just weren't economically sustainable, so it isn't that amazing that they didn't become a self-sustaining industry right away. I think it's easy to look at SpaceX and say "Gee, there's nothing that we couldn't have done there fifty years ago", but again, I suspect you miss just how much of the Falcon rocket is a result of extensive cheap computation abilities. Even if you could use modern computers to produce a design that could have worked fifty years ago, there's still not necessarily a practical path to that design using only tech from fifty years ago. (And of course the Falcon is full of stuff that couldn't exist fifty years ago.)

Modern tokamak designs are of course run through all kinds of computations nowadays too (because everything is), but the stellarator design, in a deep and fundamental way, simply isn't possible without massive computational power, whereas we've been building tokamaks since before massive cheap computational power existed.

Pouring in money advances computation. Microprocessors were a direct result of the space race and space weapons systems. No other industry would have justified the enormous investment required to fit advanced computers into a small space when mainframe systems did the job just fine. Missiles and spaceships required systems that could calculate trajectories and be small enough to launch into space. Had we not invested as heavily back then, our computers would not be as advanced today.

Fusion would be similar.

>> when mainframe systems did the job just fine.

Didn't people see the value that cheaper/more compute power would give them back then?

This can be seen as a consequence of https://en.wikipedia.org/wiki/Jevons_paradox.

It is called a paradox because seeing it in action usually catches people by surprise. And this is true whether you're talking about the consumption of coal to power factories, or the consumption of electricity to power computing (and in each case the myriad of new uses that efficiency promoted).

In Science Fiction over and over again the trope was of a giant computer that acted like an oracle. You see that in Asimov's work, in Heinlein's The Moon is a Harsh Mistress and so on. Basically nobody anticipated ubiquitous computing. For instance in the Foundation series you see that a computer run by the Second Foundation can predict the future course of history...and people are calculating their courses with slide rules.

As for the killer app of computing, email, I'm only aware of one pre-1970 work correctly anticipating what it would be actually like. (James H. Schmitz has a memorable scene in one of his Telzy stories where she catches up on her messages at a terminal. He doesn't say "email", but the scene is notable for unobtrusively getting it right.)

Good example.

Now compare with Franchise, Jokester, All the Troubles of the World, The Last Question and many more stories that feature as a plot device a very powerful centralized computer named Multivac. (See https://en.wikipedia.org/wiki/Multivac for a more complete list.)

How the same mind could have come up with the idea of androids with positron brains following the three laws of robotics, and yet missed pocket calculators baffles me.

Heinlein predicted electronic calculators shortly after WWII in Space Jockey. The space ship has a little calculator, while the big computer is on a space station. That's more or less in line with late 60s or early 70s tech. I think a lot of science fiction has predicted 20 years out pretty well. It's past that that it usually goes off the rails, only because of the inherent difficulty of predicting the future.

> In Science Fiction over and over again the trope was of a giant computer that acted like an oracle.

In many ways we're doing that exact thing. The computer at my desk is powerful but somewhat useless; the grand oracle in the form of Google et al. does the real heavy lifting of giving me useful information. Whether it's a single computer or a bunch of separate ones acting as a whole is pretty immaterial in that sense.

"Desk Set" has a job-stealing computer with a conversational UI answering reference questions. Ahead of its time.

People saw plenty of value in more compute - what they didn’t see the need for was paying an astronomical amount more for the same amount of compute to be miniaturized.

For example, in 1964, the ATLAS computer went fully online in Manchester, England. It was the most powerful computer in the world, took up a floor of a university, and the word “supercomputer” was invented to describe it.

The requirements for the Apollo Guidance Computer were to make something with those approximate specs, but take up only 24×12.5×6.5 inches (61×32×17 cm), use 55 watts of power and be ready to fly in 1967. It was a crazy, impossible task.

I’m convinced the AGC was the biggest computing advance after the move to general-purpose computers.

It was only really used for big expensive projects where the expense of the mainframe computer was a small part of the budget, so no.

People from the future will look back at us and ask:

"Didn't people see the value that cheaper/more electric power would give them back then?"

Not that long ago (1980's?) it was said that no one really needed a personal computer, and that no one would buy one.

I programmed microcomputers in the 70s as a kid.

In the late 70's the rich kid of the neighborhood got a ZX Spectrum clone (a rich kid's toy in Brazil at the time), but I was the only one of the bunch who thought programming in BASIC was fun - all of my friends just wanted to load Kung Fu Master[1] from the tape recorder. Crazy, games were stored on analog audio k7[2] tapes back then.

[1] https://www.youtube.com/watch?v=EgO4HtWIJjQ [2] https://en.wikipedia.org/wiki/Cassette_tape

The Farnsworth fusor was very well within the realm of affordability. Unfortunately, it was not a design that could break even, which destroyed the future of its creator.


Uhh... are you aware that the 7-X design was computed in the late 80s?

It was just an enormous challenge to build this thing. Things like building and validating the field coils required a lot of effort and some novel approaches that they had to come up with along the way.

> 7-X design was computed in the late 80s?

Source for that?

Looking at the German Wikipedia entry, the HELIAS method was invented in the late 80s. I don't see anything about the 7-X design actually having been computed back then. In fact, even the 2002 experiment Wendelstein 7-AS wasn't fully optimised.

Also, Tokamaks were invented in the 1950s. JET started operating in 1983. ITER was proposed in 1987. Given even just a little bit of path-dependency, there really wasn't any realistic overlap.

Another factor in W7-X being possible today and not a few decades ago, from my understanding, is the proliferation of and improvements in precision of CNC tools. All the design computations in the world go to waste if it can't be built to spec. Which it can be, today, at a reasonable cost.

1980's CNC tools did the job just fine. Most improvements since then have been in materials science. Five axis milling machines have been around since the 80's.


This paper, published in '92, mentions that the 7-X was being designed at that time. I remember having seen images of the proposed field coil configuration in '91 or '92. So I might have been off by a few years. But the plans must have been complete for the project start in 1994.

Here's a source, unfortunately paywalled, from 1989: https://www.tandfonline.com/doi/abs/10.13182/FST90-A29178

I do research at W7-X, so I can confirm that to the best of my knowledge, the design was computed a long time ago. Building the actual device took a while.

Thank you! That's why I love HN :-)

Would you happen to know what kind of compute power was required? Was it run on a colleague's PC or workstation or did it require supercomputer time?

I'm not the expert on this, but I was curious so I looked into it a bit. I don't believe it required significant resources or crazy algorithms, but a clever selection of physics criteria to optimize. It was a ~20-dimensional Neumann boundary value problem [1], and a code named NESCOIL was used to figure out the shape of the coils that would produce the required magnetic field [2], which it did using Fourier series.

By the way, the unusual shapes of the coils can be understood intuitively from this picture: https://imgur.com/a/Bq3ABfQ. A plasma needs to be confined with a magnetic field in order to be heated to extreme temperatures, and a toroidal field (produced by the currents in the red coils) is unstable due to particle orbit drifts. You need to add a twist to the field for it to be stable (using the green coils). But if you unroll the surface of the torus, you can approximate the currents in both green and red coils using the discrete blue coils, and they're easier to build.

[1] https://aip.scitation.org/doi/abs/10.1063/1.860481 [2] http://iopscience.iop.org/article/10.1088/0029-5515/27/5/018...
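For intuition, the representation NESCOIL works with can be sketched in a few lines. The Fourier coefficients below are made up purely for illustration - the real code solves for them so that the surface current reproduces the target magnetic field:

```python
import numpy as np

# Toy illustration of the NESCOIL idea (simplified, not the real code):
# a surface current potential Phi(theta, zeta) on the winding surface is
# written as a truncated double Fourier series; discrete coils are then
# taken as equally spaced level curves of Phi.
ntheta, nzeta = 64, 64
theta = np.linspace(0, 2 * np.pi, ntheta, endpoint=False)  # poloidal angle
zeta = np.linspace(0, 2 * np.pi, nzeta, endpoint=False)    # toroidal angle
T, Z = np.meshgrid(theta, zeta, indexing="ij")

# The secular term carries the net poloidal current (toroidal field);
# the harmonic coefficients (hypothetical values here) shape the twist.
I_pol = 1.0
coeffs = {(1, 1): 0.15, (2, 1): 0.05}  # hypothetical (m, n): Phi_mn values
Phi = I_pol * Z / (2 * np.pi)
for (m, n), amp in coeffs.items():
    Phi += amp * np.sin(m * T - n * Z)

# Choosing equally spaced levels of Phi yields discrete coil centerlines
# that approximate the continuous surface current.
n_coils = 10
levels = np.linspace(Phi.min(), Phi.max(), n_coils + 2)[1:-1]
print(levels.round(3))
```

The point is that the optimization lives in a modest coefficient space, which is why it was tractable on 1980s hardware.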

Do you have any links to a high level description of how this would be hooked up to a powerplant? Would this be used to heat water as per a typical fission plant?

"The Helias (Helical Advanced Stellarator) reactor is based on the Wendelstein stellarator line and takes into account the design criteria of a power reactor."


Alas, it "just" seems to talk about extending the coil design for a reactor, not the rest. I am guessing most of that would be similar as for other fusion reactors, see for example MIT's Pathway to Fusion Energy:


Thanks for the links. Had a bit of a google for Helias. Looks like the Wendelstein team are exploring using something called a "Helium Cooled Pebble Bed Blanket", which was designed for ITER. To me it looks like a closed loop heat exchanger using helium. This would then be fed into a typical steam system.

Though I don't really understand the significance of the "pebbles" in this system.

The pebbles will most likely be made of Lithium because it is a solid/liquid at the operating temperatures of the cooling system and therefore has a high density. It is also an element with a very low atomic mass, which makes the cross-sections for interactions with fast neutrons comparatively high. So the Lithium will collide with the fast neutrons from the D-T fusion that leave the plasma confinement with most of the energy from the reaction. These collisions heat up the pebbles, and from time to time a Lithium-6 nucleus will capture a neutron, splitting into a Helium nucleus and a Tritium nucleus. Tritium is one of the two fuels required for fusion and has a very low natural abundance due to its rather short half-life, so breeding it seems to be the best option to obtain it in sufficient quantities.
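A quick mass-balance sketch shows that the breeding reaction itself also deposits energy in the blanket. The atomic masses below are standard values as I recall them; treat the result as approximate:

```python
# Li-6 + n -> He-4 + T: compute the energy released (Q-value) from the
# mass defect. Atomic masses in atomic mass units (approximate).
u_to_MeV = 931.494  # MeV per atomic mass unit

m_Li6 = 6.015122
m_n   = 1.008665
m_He4 = 4.002602
m_T   = 3.016049

Q = (m_Li6 + m_n - m_He4 - m_T) * u_to_MeV
print(f"Q = {Q:.2f} MeV")  # roughly 4.8 MeV added to the blanket per capture
```

So each neutron capture both produces a tritium nucleus and adds a few MeV of heat on top of the neutron's kinetic energy.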

The W7-X isn't a fusion power plant, so such a description doesn't exist. It's the chemistry equivalent of a test tube, not an internal combustion engine.

The GP's comment

> the stellerator design, in a deep and fundamental way, simply isn't possible without massive computational power, whereas we've been building tokamaks since before massive cheap computational power.

does not square with Wikipedia:

> Stellarator ... The first Model A started operation in 1953 ...

An exact replica of Falcon 9 was impossible 40 years ago, of course. But a replica with 50% lower thrust, 10% lower ISP and 60% of its payload capacity, for a similar inflation-adjusted price, was certainly possible.

It wasn't the computational revolution that made SpaceX possible; the basic technology is more or less the same as it was in the 60s. The Merlin engine has its roots in the '90s NASA Fastrac design, evolved for performance and reusability.

SpaceX's success relates to things like market incentives, lean business practices, a fail early and iterate quickly attitude as opposed to "too big to fail" public projects, and so on.

One technology that F9 manufacture uses extensively wasn't even invented until the 1990s.

I refer to Friction Stir Welding, the wonderful technique that welds aluminum alloys by plastic deformation below their actual melting points.

The landing algorithms for the F9 first stage make use of advances in optimization algorithms that weren't available 40 years ago as well (as well as exploiting faster processors).
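The appeal of those convex formulations is that a flight computer can solve them quickly with a guaranteed global optimum. A toy 1-D version of powered descent can even be posed as a linear program; every number below is made up, and this is only a sketch of the idea, not any real guidance code:

```python
import numpy as np
from scipy.optimize import linprog

# Toy 1-D soft landing: choose thrust accelerations u_k >= 0 so that
# altitude and velocity both reach zero at the final time, while
# minimizing the peak thrust. With linear dynamics this is an LP.
g, dt, N = 9.81, 0.1, 50      # gravity, time step, horizon (made-up values)
z0, v0 = 100.0, -20.0         # start 100 m up, descending at 20 m/s

# Dynamics integrated in closed form (both constraints linear in u):
#   v_N = v0 + dt*sum(u) - N*g*dt = 0
#   z_N = z0 + N*dt*v0 - g*dt^2*N*(N-1)/2 + dt^2*sum((N-1-k)*u_k) = 0
A_eq = np.hstack([np.vstack([np.ones(N), np.arange(N - 1, -1, -1.0)]),
                  np.zeros((2, 1))])        # last column: peak variable s
b_eq = [N * g - v0 / dt,
        (-z0 - N * dt * v0 + g * dt**2 * N * (N - 1) / 2) / dt**2]

# Minimize s subject to u_k <= s: a minimax linear program.
c = np.zeros(N + 1); c[-1] = 1.0
A_ub = np.hstack([np.eye(N), -np.ones((N, 1))])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(N), A_eq=A_eq, b_eq=b_eq)
u, peak = res.x[:N], res.x[-1]
print(res.status, round(peak, 2))
```

The real algorithms (e.g. lossless convexification of the thrust-magnitude constraint) are far more involved, but the payoff is the same: a solver that cannot get stuck in a local minimum on the way down.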

Both you and I (and hopefully most of HN) know that fusion is safe. But I can't imagine the public perception being great if "fail early and iterate quickly" was attached to anything with "nuclear" in the name.

The main problem that fission has isn't safety, it's cost. Fusion promises to make this main problem worse, not better.

> Personally, I think one of the answers to "tech seems to have stalled" is that it's easy to underestimate how important cheap computational power is, even without "AI", to moving forward. For another example, as fantastic as the 1960s space projects may have been, I think they just weren't economically sustainable, so it isn't that amazing that they didn't become a self-sustaining industry right away.

I think it's also possible that technological process happens in fits and starts instead of at a continuous or even a continuously changing rate. Often there's an individual discovery that leads to a spurt of innovation that consists mostly of applying that singular discovery to different problems.

For example:

* The late 19th and early 20th century saw very fast innovation in vaccines and antibiotics, because the discovery of effective vaccines and antibiotics was general enough to solve entire classes of problems all at once. It seems like a lot of innovations all at once when you manage to find extremely effective treatments against polio, smallpox, measles, syphilis, chlamydia, typhus, typhoid, tuberculosis, etc., etc., etc., but really that was just from a couple of extremely general inventions playing themselves out.

* Steam engines led to large-scale mechanization while electrical power led to small-scale mechanization in two relatively fast and concentrated chunks. In both cases, the invention of a general technique for powering machines led to lots and lots and lots of machines.

* Marconi's wireless telegraph was invented in 1896, leading to different varieties of radio, television, and radar.

* Airplanes might be another example. I don't want to dismiss the huge difference between a Wright Flyer and a Boeing 737, but "make the airplane out of aluminum" and "have a tube that the crew and passengers can be inside of" are both pretty obvious improvements once you've figured out large-scale manufacturing and the Bayer process, both of which we had ahead of time. I guess jet engines were another breakthrough, although we kind of had them already and they're basically just rockets except they consume oxidizer from the air instead of carrying their own.

* Rockets. There were fewer than three decades between the V-2 and the moon because it turns out "building a big rocket" is the hard part. OK, there are a lot of other hard parts to landing on the moon, but they're hard in the way that they're achievable by any sufficiently well-funded and motivated group of people with rockets, but nearly impossible without rockets. As you point out, Falcon 9 is significantly more advanced in certain ways than Saturn V, but the low-hanging fruit was reaped by Wernher von Braun.

* Computers: just like the Saturn V is basically a very, very, very large V-2 and so is the Falcon 9, we're basically using very, very, very large (logically large; physically small) versions of the computers we had in the 1970's. The fundamental discoveries in computing are all basically done; the rest is just playing itself out. The reason it seemed to take longer to play itself out is because computers are a self-compounding invention: you can use computers to design computers, and you can use better computers to design better computers. Also, having computers makes a lot of other things easier, so you get a lot of innovation from that, too.

Fusion, promisingly enough, also seems to be one of those technologies that will just unlock a bunch of really, really powerful innovations, seemingly all at once, because of how many problems we could solve by just throwing lots and lots and lots of energy at them. (Some discussion here: https://www.youtube.com/watch?v=8Pmgr6FtYcY . Also, the excellent book "Sustainable Energy - without the hot air" develops rough upper bounds on the amount of sustainable energy that the UK could produce and the amount of energy the UK consumes, ultimately leading to the bar graph Figure 18.1 here: https://www.withouthotair.com/c18/page_103.shtml. Later discussion of nuclear fusion places the same bar graph, to scale, next to the sustainable level of fusion power production on Figure 24.17 here: https://www.withouthotair.com/c24/page_173.shtml.)

Awesome post! 1870-1970 is an era of amazing core discoveries!

At the same time, it's amazing how computing is helping every single field there. Even if the core discoveries are already made, the efficiency gains in the recent decades are tremendous:

* Classic windmills vs modern aerogenerators

* Analog transmissions vs highly multiplexed, high bandwidth digital transmissions

* First modern passenger planes vs more aerodynamic, more efficient planes

* Straight jet engines vs turbofans

* Saturn V vs Falcon 9 & Falcon Heavy

* Mainframes vs modern computing devices

We live in an era of refinement, where the old discoveries are being improved in ways that enable new uses that were in the realm of science-fiction not so long ago.

Excellent Post!

You should do a Netflix series, Phil!

Plus: GPS, Combustion Engine, The Web / Networking, Plastics/Chem revolution, and maybe soon AI?

There was an old BBC series called Connections that followed seemingly unrelated chains of inventions to their modern day endpoints. Worth finding on YouTube.

Edit: found Wikipedia link: https://en.wikipedia.org/wiki/Connections_(TV_series)

I think you should expand this absolutely excellent post into an article.

A company like Apple has a quarter trillion in cash laying around, they could jump start fusion and control near-limitless energy in several decades. I wish there was more of a visionary drive among institutions for intellectual curiosity and pushing the boundaries of humankind. And yes, I know why they don't... shareholders business plans risk blah blah blah. Reality is just so BORING. Maybe one day we'll find a way to make life's ultimate pursuit something more interesting than an optimization function on some integer in a bank server.

It is odd that you mention this, unless your name is Karl and you were at lunch last Wednesday :-), because it came up as something that companies with big cash hoards might actually be thinking about. Consider the notion that at some point a feasible fusion power plant design becomes "known". Then it's a question of money to build one or two or 10. And a company like Google or Apple, with a large cash hoard, could jump in, build some plants, power their infrastructure on their own plants and sell excess power to the grid. We even gave the company a name: "Google Power and Packet" :-).

The conversation came up because I was trying to fathom why a company would sit on billions of dollars when they could make more billions with it. This idea, that the future will have sudden insights that will take tremendous capital to exploit to create unassailable leadership positions, is not one I had considered. Fusion power, space opportunities, Etc, might be areas where things suddenly create an entirely new need.

Wouldn't the idea be easier if they continued doing what they already do, being the largest solar energy producer? Apple Energy should keep investing in and selling renewable energy where it makes sense, and sell any excess power to the grid, pulling all of their partners along: mobile networks (e.g. all Apple iPhone contract subscribers' energy usage on renewables), Foxconn, DHL, TSMC, memory and NAND fabs, materials, LCD/OLED screens, cameras, etc. The whole Apple ecosystem - from design, manufacturing, assembly, shipping and delivery, to iCloud usage, campuses, back office and retail - would run on renewable energy, at least in the form of credits. That should be the 100% renewable energy goal Apple tries to achieve. At the scale of Apple's manufacturing supply chain, and at current solar pricing, buying energy from Apple "should" actually be cheaper than non-renewable energy. Imagine the tagline: every bit of Apple is renewable.

But like everything Apple these days, they are just slow. Their datacenter expansion already started late, and with all the setbacks they are now even further behind; their solar energy build-out depends on and is linked to that datacenter expansion. And instead of spending money on their own CDN (which they finally got around to building) or a worldwide WiFi network (so iPhone users could enjoy free WiFi access like in an Apple Store, but in far more places), they decided to spend BILLIONS on making TV series and dramas.

Robert Bussard shopped his wiffleball fusor design to Google a few years back in a talk that's somewhere on YouTube. Fascinating stuff.

> The conversation came up because I was trying to fathom why a company would sit on billions of dollars when they could make more billions with it

They already are, in the cheapest, riskless way possible: those massive cash stashes are sustaining, or even boosting, their valuations.

When a company grows too big, it's in its best interest, as in investors best interest, not to spend a few billions on an initiative that could lead nowhere.

Ah but cash is dangerous, in a proxy fight it can be used to fund a leveraged buyout. Harder to do with various share classes restricting voting rights but even so.

Consider SpaceX, its a growing business that took what, 10 billion in capital? That is below the 'material' threshold for some of these companies holding 100 to 300 billion dollars in cash equivalents.

I think the point was not that a company like Apple ought to invest their billions into ongoing fusion research, indeed the exact opposite: it might be more rational for a company like Apple to hide their billions under a mattress, so that they can make a single, economy-controlling investment on the very same day that profitable fusion power becomes merely a problem of capital.

> on the very same day that profitable fusion power becomes merely a problem of capital.

Wouldn't that be too late? Or, put it this way: it is already a problem of capital. If you have enough, you can explore all technology tree branches till finding the one that leads you to it, even in parallel.

There's no guarantee that any of the technology tree branches have profitable fusion power in them. I'm reasonably confident that controlled fusion with net energy gain is technologically possible. Determining whether it can be profitable is a lot harder. Profitability depends on what every other competitor can do, as well as what your own technology can do. That's why extremely well-proven technologies like coal-fired electrical generators can become unprofitable when competing technologies improve.

If there were a global carbon tax, it would give fusion an advantage against fossil fuels, but that would also be to the advantage of fission and renewables. In the absence of a global carbon tax, is fusion going to be cheap enough to displace fossil fueled power? We've had the technology for emissions-free electricity for decades, but it's difficult to profitably compete with fossil-based generators enjoying unpriced externalities.
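To make the externality point concrete, here's a toy comparison. The rule of thumb that coal emits roughly 1 tCO2 per MWh is about right; the dollar figures, including the fusion one, are purely illustrative assumptions:

```python
# How a carbon price shifts the cost comparison, with made-up round numbers.
lcoe = {"coal": 60.0, "gas": 50.0, "fusion": 90.0}        # $/MWh, hypothetical
emissions = {"coal": 1.0, "gas": 0.4, "fusion": 0.0}      # tCO2/MWh, approximate

for carbon_price in (0, 50, 100):  # $/tCO2
    costs = {k: lcoe[k] + carbon_price * emissions[k] for k in lcoe}
    print(carbon_price, costs)
```

Under these assumed numbers a hypothetical $90/MWh fusion plant only matches gas once the carbon price reaches $100/tCO2 - which is exactly why unpriced externalities make the profitability question so hard.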

Maybe for companies, to aim for fusion is asking too much.

But what about a commercial "DARPA"?

I'm guessing that the innovations created by DARPA are a very good ROI in the value sense. If not, what is?

But like Xerox PARC, the fear is that extracting money from that won't go well.

So one solution is creating a monopoly. But that's not ideal.

But what about some sort of insurance? How would history look if Xerox PARC had been guaranteed to at least break even?

Your comment made me think of Bell Labs.

From Wikipedia:

> Bell Laboratories was, and is, regarded by many as the premier research facility of its type, developing a wide range of revolutionary technologies, including radio astronomy, the transistor, the laser, information theory, the operating system Unix, the programming languages C and C++, solar cells, the CCD, floating-gate MOSFET, and a whole host of optical, wireless and wired communications technologies and systems. Nine Nobel Prizes have been awarded for work completed at Bell Laboratories.

Or Xerox PARC. There have been other attempts (like Interval Research) as well.

Always the question of money and focus though. IBM has a huge research arm which is doing amazing things, but their impact on the bottom line is perhaps not as strong as Ginny (IBM CEO) would like? At Sun, when Sun Labs was a thing, there were jokes about "Where good ideas go to die." or something along those lines. Commercial interests have a really hard time seeing value in these things when they are done internally.

For some companies asking for fusion isn't too much. Fusion startups include (among others) Commonwealth, Tokamak Energy, General Fusion, Helion, and TAE (formerly Tri Alpha). The latter has over $500 million invested.

And for a company that's not a startup, Lockheed Martin has a fusion project.

Google is investing in fusion energy: https://ai.google/stories/applied-science/

Unfortunately, the irony is that the only way companies get a quarter trillion in cash laying around is by making investments that they understand. I suspect Apple has no way of making good investment decisions in relation to fusion energy.

This ignores the "disruptive technology" aspect. Consider that most of the largest corporations are very young. Even Apple is only half a century old. Two decades ago, they were doing completely new things that no one had ever done before. No one "understood" mobile computing before it took off.

Companies get a quarter trillion in cash by doing something really well that others aren't doing. Nuclear fusion powered perpetual motion machines certainly fit the bill...

They just made the Macbook 3mm thinner. That should count for something.

> That should count for something.

It does. You jest, but competition is fierce and little things matter; also, sometimes making even a small improvement requires a technological breakthrough.

They have all that money because they make good investments. In the pursuit of fusion, a quarter trillion spent by Elon Musk goes a lot farther than a quarter trillion spent by a lottery winner. It's not just about how much money you have, but who you give it to and for what that determines its value. Just because we are used to indistinguishable standardized consumer products that cost X doesn't mean investing works that way, especially when the certainty of the venture is not assured and is partially the responsibility of the investor to make sure that the money is being directed properly.

I think they would better spend their money on Elysium-style refuge. Fewer people means they wouldn't even need fusion, solar, wind and hydro would be enough.

The problem might be that patents are only valid for about 20 years. That might be the time required to perfect the technology and reach the break-even point.

During those twenty years you'd certainly continue to easily produce more innovation and refined designs that would result in more related patents that would provide a healthy buffer between what you can do and what your competitors are capable of.

There's no time limit on trade secrets and those might be more applicable.

Why not print the money required for this kind of research? why does it have to be tax payer money? or private money?

Err no, money is not "the only thing" keeping us from having fusion power. I can think of one other thing which is that it's damn hard to create stellar-core conditions inside of a small containment vessel on Earth. It's not obvious that throwing more money at the problem is going to change this. After all, fusion research has been funded in some form or another since the 1950s. I'll bet if you added up all the research dollars that have been expended on fusion, it would come to quite a tidy sum indeed. Despite this, we still have little to show for it.

We are spending a relatively pitiful amount of money on fusion power research. The US has spent $29B TOTAL in 2010 dollars from the 1950s through 2010. (I couldn't easily find global numbers, but I can't imagine it's more than an order of magnitude higher)

Some other numbers for comparison:

NASA's current ANNUAL budget is $19B. [1] The BP oil spill cost $40B. [2] "A 2016 IMF study estimated that global fossil fuel subsidies were $5.3 trillion in 2015, which represents 6.5% of global GDP." [3] The US alone spent $600B on fossil fuel subsidies in 2015. [3]

>It's not obvious that throwing more money at the problem is going to change this.

This is true, but we haven't funded it enough. The potential upside is insane. We aren't funding research accordingly.

[1] https://www.quora.com/How-much-money-is-being-spent-on-resea...

[2] http://www.fusionenergyleague.org/index.php/blog/article/hav...

[3] https://en.wikipedia.org/wiki/Energy_subsidies

>>It's not obvious that throwing more money at the problem is going to change this.

>This is true, but we haven't funded it enough. The potential upside is insane. We aren't funding research accordingly.

How much money do you propose we throw at it, then? Even fission isn't profitable these days, and that's something we know how to do. While I agree that it is worth pursuing, wind/solar + batteries + a smart grid are a way better investment these days, since we cannot even produce a fusion reactor that is net positive in energy, let alone economical.

Just for the record, you don't re-create stellar-core conditions with earth-bound fusion devices: We don't have the means, and the yield would suck.

Now, it is true that there's still fundamental research to be done until energy production via nuclear fusion becomes a reality, but we're reasonably confident that we could make the conventional approaches work.

However, that research is Big Science, and there's no political will to fund it properly (the graph that people like to cite is https://commons.wikimedia.org/wiki/File:U.S._historical_fusi... ). ITER suffered from this as well, and W7-X basically only exists because German reunification happened, and the German government was looking for a big science project - any project - they could leverage to funnel money into East Germany. The plans for W7-X just happened to be ready at the right time.

>>> However, that research is Big Science,

Not all of it. The big-science approach is a function of "pure" fusion research done from the assumption that containment/ignition only comes from magnetic fields and particle collisions. Some groups are approaching it from different directions, even using mechanical force to trigger ignition (i.e. slamming the hydrogen with a big hammer). This isn't crazy stuff, just a more practical approach from people who see the problems from a different perspective.



"At the centre will be a sphere, three metres in diameter, inside which molten lead swirls at high speed creating a vacuum, or vortex, in the middle. Arrayed around it will be 200 to 300 pistons, each the size of a cannon. Firing in perfect harmony, they will create an acoustic wave that collapses the vortex at the very moment a plasma injector shoots hydrogen isotopes, the nuclear fuel, into it. If General Fusion has its physics right, the heat and pressure will ignite a fusion reaction that spins off countless neutrons which will heat the lead even more. Pumped through a heat exchanger, that hot lead will help generate steam just like a conventional thermal power plant."

General Fusion changed their plan last year. The acoustic compression scheme, and the use of a spheromak plasma, are out. They discovered they have to limit the magnetic field to about 100T or else the induced currents start vaporizing the surface of the imploding liquid metal. This would be a disaster, as the metal vapor would have poor electrical conductivity and would penetrate into the plasma.

Confinement in the spheromak was also unacceptable; fast electrons were lost too easily from the edge, causing too much cooling.

Instead, the new scheme compresses more slowly, and compresses a spherical tokamak plasma. This scheme will have a solid post running down the center of the plasma, and the metal will implode from the sides at lower speed. The burn will take about 1 millisecond.

I am skeptical of this approach, since the center post will be exposed to extreme conditions (average neutron fluence two orders of magnitude higher than in conventional fusion reactor designs) without a thick layer of liquid metal to shield it. The non-acoustic compression scheme also means the reactor vessel will have to withstand extreme pressure.

Hence my qualifier 'conventional approaches'. I wish those working on alternative approaches the best of luck, and some of the experiments look like a lot of fun (they get to blow things up - I've seen videos where plaster was raining from the ceiling of the control room ;)).

I'd say that the "conventional" approach for triggering fusion is physical. We've been triggering fusion in nuclear weapons for many years, and the physics of it is very well understood. What General Fusion is doing is essentially a non-destructive nuclear secondary. Triggering fusion with magnets, without moving parts, is imho the unconventional way of getting to fusion. It is only conventional in comparison to fusion research. The practical users of fusion (the armed forces) don't use magnets.

>> and W7-X basically only exists because German reunification happened, and the German government was looking for a big science project - any project - they could leverage to funnel money into East Germany

That's the first time I've heard that. Brilliant!

It's wrong, though.

Well, it's wrong insofar as the federal government wasn't actively looking for projects to fund and then miraculously discovered W7-X in a drawer. Rather, the W7-X people were looking for funding and shopping for locations, but failed to secure it until they stumbled onto this particular angle to push. It's no coincidence that a new branch of the Max Planck Institute was established in Greifswald, whereas all previous work was done in Garching.

At least, that's the story I've been told by one of the current top W7-X people over a glass of beer, though the main topic at the time was that the movers and shakers behind such big projects might not see them to completion due to retirement or even death.

> However, that research is Big Science

Some of the progress in recent years is from the development of ideas that can be tried at much smaller scales, like the ST40:


This is something that spherical tokamak research has been hammering away at for years:


>Just for the record, you don't re-create stellar-core conditions with earth-bound fusion devices: We don't have the means, and the yield would suck.

See Inertial Confinement Fusion, The NIF, High-Energy-Density Physics, etc.

> It's not obvious that throwing more money at the problem is going to change this. After all, fusion research has been funded in some form or another since the 1950s.

That's not very long. The first liquid-fueled rocket didn't fly until 1926, and the moon landing wasn't until 1969, and even that was only possible with very high levels of funding, far beyond what's invested in fusion power today. Babbage began building the Difference Engine in 1822 and conceived of the Analytical Engine in 1837, leaving skeptics dismissing the potential of computers for 150 years. Leonardo da Vinci sketched a helicopter around 1480, almost 500 years before one could actually be constructed. Gunpowder was invented probably before the year 1000, and it took over 700 years after that to render pikes obsolete.

Now, it's entirely possible that there's some other limitation holding us back from fusion, just as the lack of internal combustion engines or diodes held us back from helicopters or computers. But I think we've done enough work so far that we'd be able to tell if we were running into something like that. As it stands, I think we're roughly in the place rocketry was before WWII--we have an idea of how to do it, and we don't know if it'll go to the Moon or not, but the only way to find out for sure is to try, and that's pretty expensive and might take a while.

We do know there are inherent problems with fusion, particularly DT fusion. In fact, we've known for 35 years or more. Even if the plasma physics issues are magically solved, fusion has grave engineering obstacles.



It's interesting examining the recent efforts, public and private, in light of these venerable critiques.

The total since research started is somewhere between $25bn and $50bn.

In the US alone, oil subsidies are >$400bn a year.

Here’s an article - about the same amount, annualised, is spent on Halloween costumes for pets.


Not to mention the total expenditure on fusion weapons.

In 1976 a plan for how much funding over how much time was made to get fusion ready. This graph shows the possible paths, as well as the actual level of funding, and is a great explanation for why fusion seems to always be x decades away: https://commons.wikimedia.org/wiki/File:U.S._historical_fusi...

This graph precisely illustrates my point. Let's conservatively posit that a workable fusion reactor is worth $2 trillion; the world spends many multiples of this on energy each year. The cost of acquiring fusion, according to its advocates, is equal to the integral of any one of the curves on this graph, which looks to be on the order of ~$100b. This is well within the range of investment attainable by private sources, just look at pharma. In other words, if fusion advocates are to be believed, the market is failing to the tune of few trillion dollars, roughly the GDP of the United Kingdom, and has been for decades. Now, markets are not necessarily efficient, but is this a believable scenario? I claim no.

Fusion research is chronically underfunded. We actually have made surprising progress despite basically not funding research at all.

I'm a physicist by training working as a software engineer at one of the large tech companies. I have multiple physics friends doing the same.

We'd gladly be working on fusion (or other physics stuff) but our present jobs pay much better.

More money, and better pay for hard science careers, actually would make a difference.

If fusion research funding got 1'000 dollars a year, we wouldn't have 1'000'000 dollars worth of research after 1000 years. Just as there are diminishing returns after throwing so much money at a problem, throwing too little means that progress will halt and we get stuck in a local minimum because we need a bigger cash infusion to get over the hump.

It sounds like you think fusion would be a great alternative source of energy since you're comparing it to fossil fuel exploration.

However, this is not necessarily true. It not only has to work, it also has to be economical and not have significant drawbacks.

I think that's very much up in the air. Compare it to fission - we know how to build those power plants, but it's just so damn hard that it's too expensive, compared to alternatives like photovoltaics and wind turbines.

And those are getting cheaper and cheaper at an incredible rate at the moment. So it's a moving target.

And yes, solar and wind rely on intermittent sources of energy, so you need to combine them with some kind of dispatchable source/storage to match the demand curve, but you have that too with a fission plant because of the high capital cost, and I don't see why it wouldn't be true for fusion too, unless somebody comes up with a way of building a really cheap fusion plant.

I agree. I hear the comments about a lack of funding, but to me it isn't necessarily a lack of vision from politicians and businessmen. I have a Ph.D. in physics and even worked for 6 months full-time at the Princeton Plasma Physics Lab as an undergraduate, and I still have mixed feelings about fusion power, particularly vs. other non-fossil-fuel options.

Among other things, there's a perception of fusion as this clean, infinite energy source. I haven't followed fusion carefully for a good many years, but back when I was studying materials science, radiation embrittlement of fusion reactor shielding was an area of active research. It wasn't clear that you didn't end up with just as much radioactive waste from fusion as you did with fission. I don't know what the current thinking on that is, but it was one practical challenge in the past.

Yes, neutron activation is an issue that needs to be handled safely. But the isotopes produced this way tend to be short-lived, with half lives around a decade. Not the same challenge of storing radioactive waste on geological timescales.

Sure, we might, but the "issue" of management of macro-scale solid radioactive waste has never been anything but nonsense. It is a completely manufactured "problem." Mix it with concrete and cast it into large bricks so that it's hard to lose and infeasible to steal, vacuum seal it in plastic wrap as a precaution against outgassing, drive it to the seaside in lead lined semi trucks and toss it off the continental shelf.

That kind of thinking in the past has led to toxic consumer fish close to where I live - can't remember if it's mercury or dioxin.

It is actually pretty hard to come up with a solution that's going to be safe for thousands of years.

I might be wrong, but I think all countries so far have opted for the solution of putting the stuff somewhere they can keep an eye on it, and possibly get it up again in case somebody figures out a use for it.

On top of what everybody else has said, I also don't think it's a given, even if/once we crack the technology, that anybody will actually use it to generate power. I think there's a totally plausible outcome where somebody builds a fusion reactor that produces more energy than it consumes and said reactor has essentially infinite, essentially free fuel, but the design they come up with costs $20 billion a pop to build (or whatever) with no clear path to making it cheaper, and just the cost of capital to build new plants makes those plants more expensive per unit of energy generated than coal/gas/solar/etc. even though the fuel costs nothing. At that point, even if it works, maybe the one demonstrator plant they end up building continues to tick away as a singular novelty and the rest of the world carries on building gas plants and solar.

Fission is already in not too different a place than that: fuel is nowhere near the dominant cost per unit of energy produced. I hope that ends up not being the case with fusion, but it's hard to predict at this point.
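To make the capital-cost point concrete, here's a back-of-the-envelope levelized-cost sketch (all numbers are invented assumptions for illustration, not real fusion economics): even with free fuel, the capital charge alone can exceed the all-in price of competing sources.

```python
# Rough levelized-cost sketch: even with free fuel, the capital charge can
# dominate. All numbers are invented for illustration.

capex = 20e9          # plant cost in $ (the figure floated above)
rate = 0.07           # assumed cost of capital
life = 40             # assumed plant life, years
power_mw = 1000       # assumed net electrical output, MW
cap_factor = 0.85     # assumed capacity factor

# standard annuity payment on the capital
annual_capital = capex * rate / (1 - (1 + rate) ** -life)
mwh_per_year = power_mw * 8760 * cap_factor
lcoe_capital_only = annual_capital / mwh_per_year

print(f"capital charge alone: ${lcoe_capital_only:.0f}/MWh")
```

With these assumed figures, the capital charge alone comes to roughly $200/MWh, several times typical wholesale electricity prices, before paying for a single gram of fuel.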

Even coal power plants cost several billion. I don't think an order of magnitude reduction in cost is foolish to count on.

The issue is the design of extremely specialized components, not necessarily the raw materials (except when they're novel superconducting magnets designed for that purpose).

I think schedule delays are what is making ITER so expensive.

Also Helion, TAE (formerly Tri Alpha), First Light Fusion, Tokamak Energy, General Fusion, and others. TAE is the biggest, with over $500M invested so far.

I have to shake my head at TAE, since their approach was brutally critiqued 20 years ago. It turns out being given $500M doesn't let you evade the laws of physics.


I'm not a fusion scientist so I can't really judge, but in general I'm not sure it makes sense to say "it turns out" unless you know that experiments have proven it correct. From some quick googling it appears that TAE people responded to the critique and other people thought they made some valid points; see page 4 in this paper:


Things seem to be going pretty well for TAE, at least as far as plasma confinement goes. Whether they can get net power from boron is another question, of course. I definitely see them as a dark horse compared to the tokamak companies.

TAE has recently begun pivoting, for example trying to use their technology for cancer treatment(!). Their name change was part of this. Generally a company doesn't do something like that unless they are forced to.

TAE's response was not at all strong, IMO. And that was not the only critique.

Plasma confinement is not the issue for TAE. The problem is even with perfect confinement, their non-Maxwellian scheme doesn't work.

If they had a clear path to fusion and money were the only thing in the way, money would flow into it from every direction.

The problem is that nobody knows how to build a functioning reactor yet or if it's even possible to build a functioning reactor. It's very possible that we could throw trillions of dollars at this and still not figure it out.

It's very possible that we could throw trillions of dollars at this and still not figure it out.

Possible, but unlikely (imo). It's just that private business doesn't really do large-scale multi-decade pure research efforts with far-off ROI, and governments have chosen to spend money elsewhere. Hence, progress is steady, but slow.

Yes. It’s a classic market failure. We see examples of this type of thing throughout history. For instance, without the Manhattan Project, civil nuclear power would not have developed, because the enormous investment in risky technology development would not have flowed from the private sector.

> For instance, without the Manhattan Project, civil nuclear power would not have developed, because the enormous investment in risky technology development would not have flowed from the private sector.

I don't buy this. Generating electricity from fission is stupid simple once you figure out that piling a bunch of uranium produces heat (literally what Fermi did in Chicago in '42).

The safety features to keep the whole thing regulated, not melting, and to avoid poisoning everyone around it obviously aren't simple, but the process of creating and capturing energy as electricity is.

By contrast, the process of capturing the energy and converting it is a huge part of the challenge with fusion.

I am glad that private business is not getting into this. I would hate it if fusion got protected by a bunch of patents.

Fusion power has the potential to be a serious game changer and everybody should have access to it.

Depending on your definition of "reactor", it was either done decades ago, about to be done (or at least the scientific community is very confident they know how to do so), or several decades away.

> What if we collectively had the will to invest this much into fusion?

The big problem is that fusion has reached its end-game in military applications: we have the h-bomb. Sure, there's the hypothetical fusion submarines and carriers, but what do those really provide beyond their fission equivalent?

If we could find some novel thing that fusion power solves for the military, then fusion would be solved.

Fusion in ships is a ridiculous idea. Fission has much better power density, and volume in ships is limited.

> What if we collectively had the will to invest this much into fusion?

We should invest much more heavily into fission, and keep deuterium/tritium fusion investment at around the current levels.

Why? Because the deuterium/tritium fusion that stellarators and tokamaks use produces a lot of (energetic) neutrons, along with heating the plasma. That is very bad for two reasons. First, essentially unrecoverable energy is carried away by the neutrons. Second, the neutron exposure causes the reactor itself to become radioactive. So the promise of "clean" fusion energy with no radioactive waste is not realized.

Aneutronic fusion is possible, but requires higher energy and different fuel. This is not yet being widely researched, but some interesting work is being done at LPP Fusion: https://lppfusion.com/

That is an effort I'd like to see funded much more fully.

At any rate, next-generation fission reactors are just as safe as deuterium/tritium fusion, are well understood, and many designs produce less and shorter-lived waste than the dinosaur PWR designs. It is beyond silly to not fully leverage clean, cheap fuel with a million times the energy density of fossil fuel while we work out the kinks with fusion. There is a necessity for reliable grid power alongside unreliable "renewables", and fission is the only viable CO2 free approach for the foreseeable future.

Don't forget about the advances in superconductor tech that have happened over the last few decades. In particular, high-field superconductors made possible by new materials like REBCO tapes have been a great enabler; so much so that ITER is design-locked to the older tech and cannot easily upgrade to the latest materials. Prof. Dennis Whyte of the MIT ARC project actually had a slide explaining the impact of maximum magnet strength on the containment design and overall cost of the project.

My key takeaway from this is that markets/capitalism are really good at finding the optimal solution to an engineering problem using currently available tech, but not so good for cases where the enabling tech also needs to be invented. IMHO this is where government needs to step up and bring all the necessary pieces into the realm of possibility; after that, private business can pick it up and optimize.

On a different note, it's really amazing that they can maintain superconductors near absolute zero just a short distance from a plasma at 100M F.

>What if we collectively had the will to invest this much into fusion?

If it were to turn out that fusion power isn’t technologically feasible within useful parameters (currently achievable technology, useful scales, economic feasibility of recovering the costs in the lifetime of a reactor, etc), we’d have squandered vast resources pointlessly.

For a start there are dozens of different approaches currently being pursued. Which do we bless with the billions of extra funding? All of them? Suppose none of them pan out?

It’s not like fusion isn’t being actively investigated and invested in. It is, to the tune of tens of billions of dollars. How about we see how those projects pan out in practice and then progress from there?

> If it were to turn out that fusion power isn’t technologically feasible within useful parameters (currently achievable technology, useful scales, economic feasibility of recovering the costs in the lifetime of a reactor, etc), we’d have squandered vast resources pointlessly.

Not necessarily, because discovering a large blocker would give us an idea of what to invest in next.

The economic impact of fusion power is great enough that even if there's a 10-25% probability of it working out, the expected value implied by that probability would justify significantly higher levels of funding than fusion currently receives.
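That expected-value argument can be sketched in a few lines (all figures are illustrative assumptions; the $2T payoff number is borrowed from a comment upthread):

```python
# Back-of-the-envelope expected-value estimate. All figures are
# illustrative assumptions, not data.

p_success = 0.10          # assumed probability that fusion pans out
payoff = 2_000e9          # assumed value of workable fusion (~$2T, per upthread)
annual_funding = 2e9      # assumed global fusion spend per year
horizon_years = 30        # assumed development horizon

expected_value = p_success * payoff
total_cost = annual_funding * horizon_years

print(f"expected value: ${expected_value / 1e9:.0f}B")
print(f"total cost:     ${total_cost / 1e9:.0f}B")
print(f"EV / cost:      {expected_value / total_cost:.1f}x")
```

Even at the low end of the probability range, the expected value is a multiple of the total spend, which is the sense in which current funding looks low.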

> It’s not like fusion isn’t being actively investigated and invested in. It is, to the tune of tens of billions of dollars. How about we see how those projects pan out in practice and then progress from there?

ITER, the largest multinational fusion project, is projected to have a total cost of over 20 billion euros, which is about 22 billion USD. 22 billion USD is less than NASA's budget, adjusted for inflation, for the single years of 1963 through 1970 or 1990 through 1993. And NASA's work is built on top of the rocketry work of Nazi Germany, which cost about 40 billion inflation-adjusted dollars even with, to be euphemistic, artificially low labor costs.

For another point of comparison, the Persian Gulf War, which was primarily motivated by the attempt to secure part of the global supply of petroleum, cost $61 billion. The Iraq War cost at least an order of magnitude more, but (a) much of that was due to cost overruns and (b) that war was partially motivated by issues other than securing part of the world's petroleum supply.

In terms of improving long term human living conditions and enabling future economic growth, developing fusion power would provide tremendous benefits.

Discovering a dead end is not pointless from a science point of view. It's bad return on investment, that's for sure.

Fully agree but the same could be applied to solar and energy storage as well.

We have invested billions into fusion. And we get results like this, which are ... basically useless.

If this ends up being the right design for fusion, it may be the most weirdly shaped human artifact, where the shape is fundamental to its operation and not decorative. It's far more complex than a mere turbine blade or venturi. Or can anyone suggest a weirder one? (Define weirdness as the Kolmogorov complexity of the shape, within the precision needed to work properly.)

David Gelernter has a book on (mostly? not sure if I remember correctly) industrial and software design, in which he brings up the bakelite telephone as an example of such an object. Weird shape - check, functional - check (the handset's size and placement matches the distance between the ears and the mouth). It became a literal icon of itself.


https://en.m.wikipedia.org/wiki/Evolved_antenna looks weird. Source 1 has another image of a more complex antenna.

Both images for lazy: https://i.imgur.com/ROhfMXb.jpg

> Define weirdness as the Kolmogorov complexity of the shape, within the precision needed to work properly.

Well, if the shape is the output of an optimization program, as long as the program itself and its inputs can be simply specified, the shape actually has low Kolmogorov complexity.

True, but you can still formalize the sense in which the object is weird in terms of how long it takes the optimization routine to run. See Bennett's notion of Logical Depth: https://en.wikipedia.org/wiki/Logical_depth
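A toy sketch of that distinction (objective and parameters are arbitrary, purely for illustration): the output of a short, seeded optimization looks irregular, so its Kolmogorov complexity is low even though regenerating it costs the full optimization run, which is the logical-depth part.

```python
import random

# Toy illustration: this "evolved" bit string looks irregular, yet it is
# fully determined by a short program plus a seed (low Kolmogorov
# complexity). Reproducing it requires re-running the whole optimization,
# which is Bennett's logical depth.

def fitness(bits):
    # arbitrary bumpy objective: count positions differing from the bit
    # three places later
    return sum(1 for i in range(len(bits) - 3) if bits[i] != bits[i + 3])

def evolve(n=64, steps=5000, seed=42):
    rng = random.Random(seed)            # fixed seed => deterministic output
    best = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        cand = best[:]
        cand[rng.randrange(n)] ^= 1      # flip one random bit
        if fitness(cand) >= fitness(best):
            best = cand
    return best

shape = "".join(map(str, evolve()))
print(shape)   # irregular-looking, but these ~15 lines regenerate it exactly
```

The same applies to an evolved antenna or a stellarator coil set: the generating program plus its inputs can be short, while the compute burned to get the shape is what makes it "deep".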

Does a CPU count? The functional part is a very precise, complex and intricate shape.

CPUs also have some iterative weirdness in that successive generations do, to a certain degree, depend upon the computation of the previous iteration to calculate the layout.

I nominate the latest Ubuntu Desktop or Windows install DVD for title of the most complex item in the known universe.

(For comparison, the human genome is under a GiB.)

The weird design makes me as a layman skeptical of the tech's potential, but I'd love to hear why I'm wrong:

- The randomness and lack of symmetry suggest that we are lacking insight into the problem, like an inelegant physics explanation or an over-fitted model. It's not just that it's complex, but it doesn't even seem to have "parts" that work together. Conceptually fusion seems pretty simple: why in principle does harnessing it require super-human complexity?

- If such extreme contortions and micro-optimizations are required to get this far, how much potential can there still be on this path? In software, if you are replacing divisions with bit shifts, maybe you really need a different algorithm. I understand this is just a research project, but if everything is so finely tuned already, how do you make improvements?

I don't know anything about fusion power, but those are the intuitions I feel when I see the plasma vessel photo and read how it was designed.

> The randomness and lack of symmetry suggest that we are lacking insight into the problem

No, it doesn't.

> Conceptually fusion seems pretty simple: why in principle does harnessing it require super-human complexity?

Because having 2e30 kg ball of hydrogen within a powerplant is extremely unwieldy here on Earth.

> In software, if you are replacing divisions with bit shifts, maybe you really need a different algorithm.

Huh? If your algorithm says "divide by 4", why would you need a different algorithm.

Plasma dynamics isn't simple. You're trying to confine fast-moving charged particles with a magnetic field. Those charged particles generate their own magnetic field when they move, and that field creates forces on all the other moving charges. Most configurations are unstable, because tiny asymmetries in the shape get amplified.
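The "tiny asymmetries get amplified" point is just linear instability: a perturbation with a positive growth rate grows exponentially until nonlinear effects take over. A toy sketch with entirely made-up numbers:

```python
import math

# Linearized instability: a perturbation u with growth rate gamma obeys
# du/dt = gamma * u, so u(t) = u0 * exp(gamma * t). Even a microscopic
# asymmetry becomes macroscopic within a few e-folding times.

u0 = 1e-9        # initial relative asymmetry (made up, tiny)
gamma = 1e5      # growth rate in 1/s (made up; "fast" on reactor timescales)

for t_us in (0, 50, 100, 150, 200):
    t = t_us * 1e-6                  # microseconds -> seconds
    u = u0 * math.exp(gamma * t)
    print(f"t = {t_us:3d} us -> relative perturbation {u:.3e}")
```

With these made-up numbers, a part-per-billion asymmetry reaches order unity in 200 microseconds; real growth rates vary enormously by mode and machine, but the exponential character is the point.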

Why state "I don't know anything about fusion power" but expect fusion to be simple?

Nothing is simple, nothing is elegant. Everything is hideously complex; we are just standing on the shoulders of giants. E.g., try making common things that we take for granted, like building a plane or glider, or even writing a text editor from scratch (including the font rasterization code and display driver).

It's extremely symmetrical!

> Conceptually fusion seems pretty simple: why in principle does harnessing it require super-human complexity?

All your useful materials melt at 2000 degrees.

The fusion takes place at 20000000 degrees.

Mystery solved.

(Yes I know these numbers are grossly oversimplified.)

With fusion devices having trouble confining the plasma for long times, I wonder if a massively parallel fusion plant would be feasible.

Let's assume that plasma destabilisation does not damage the device, and is a mundane event.

Build 10 or even 20 fusion devices (economy of scale!) feeding the common heat buffer, e.g. a large reservoir of a molten salt or metal. Feed conventional turbines off the heat of the heat tank. The tank evens out the input power jumps.

Now we can restart the fusion in every fusion device every so often, provided that restarting it is made a mundane operation, too. It, of course, takes a lot of electricity to pump into the magnets. Conveniently, we have a mighty power plant right here. Dumping the magnetic/electric energy from the magnets requires a huge sink. Luckily, we already have such a sink co-located.

Building the plant takes a massive investment. Luckily, the architecture allows to build it piecemeal, feeding the next added unit with the power of the already built units.

BTW the waste heat could be directly reused in some kind of chemical processing, like smelting, or maybe even synthesis of hydrocarbon fuels from ambient carbon dioxide and water.
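The buffering idea can be sketched in a toy simulation (all numbers invented): if the units' restart cycles are staggered, the combined heat input to the buffer is steady even though each individual unit is pulsed.

```python
# Toy simulation of a shared thermal buffer fed by pulsed fusion units.
# Each unit dumps a burst of heat, then spends the rest of its cycle
# restarting; the buffer supplies the turbines at a constant rate.
# All numbers are invented for illustration.

N_UNITS = 10
BURST_HEAT = 50.0                         # heat per burst, arbitrary units
CYCLE = 10                                # timesteps per restart cycle
DRAW = N_UNITS * BURST_HEAT / CYCLE       # steady turbine draw = average input

buffer = 500.0                            # initial thermal reserve
levels = []
for t in range(200):
    for unit in range(N_UNITS):
        if (t + unit) % CYCLE == 0:       # stagger the units' firing times
            buffer += BURST_HEAT
    buffer -= DRAW
    levels.append(buffer)

print(f"steady draw per step: {DRAW}")
print(f"buffer min/max: {min(levels):.0f} / {max(levels):.0f}")
```

In this idealized version the staggering makes the combined input perfectly steady; with irregular restarts (the realistic case) the buffer level would wander, and the required reservoir size is set by the worst-case shortfall.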

Ok, several issues with this. First, restarting these devices takes a lot of electricity, which drastically reduces your net efficiency. So you still really want to solve this problem.

Second, fusion devices scale very well, so one device at 10x the scale is vastly better than 10 different devices at 1x size. Third, storing and pumping heat involves losses, whereas there is a huge range of great options for storing electricity. Fourth, it takes massive turbines to turn heat into electricity, so you need to scale several things on both sides of your merged heat system in your modular design.

Finally, X independent fusion devices don't have single points of failures shared between them. Your combined design would.

> there is a huge range of great options for storing electricity

Producers of wind and solar power would love to hear about them!

Re SPOF: since the fusion devices are much less reliable, as of now, than reservoirs of hot liquids, I suppose the reservoir is much less of a concern.

I understand that fusion efficiency grows with size; this is why we are surrounded by colossal self-initiated fusion reactions, and have one nearby! But before we can scale, maybe we could still turn net-positive with smaller, less reliable devices. Remember how unreliable the first cars were.

> Maybe we could still

Early computers were huge and broke down all the time. They were built that way because it was the easiest option at the time. Nobody wants to build multi-billion-dollar devices; if people could get away with spending 1/10th as much, or even test several designs at the same time, they would.

PS: The grid absorbs solar and wind's intermittent nature just fine without much in the way of storage. That might suggest something about large-scale energy storage. Building something that can store GWs' worth of heat for minutes at a time is going to be huge and expensive.

> The grid absorbs solar and wind's intermittent nature just fine without much in the way of storage.

I was under the impression that this was because we have a bunch of natural-gas power plants that are turned on when necessary to meet peak load.

This is regional; the general factor is that oversupply is required to deal with failures. So there should always be excess capacity, which can be routed around at 1/3 the speed of light. Demand can be adjusted in response to increased prices. Relatively tiny amounts of storage allow time to react.

Large scale hydro for example can act as storage ramping up and down in minutes to cover demand spikes.

PS: Sure in the US it’s a lot of cheap natural gas right now. But many places don’t and still need to deal with the same issues.

> The grid absorbs solar and wind's intermittent nature just fine without much in the way of storage

Not so fine, judging by [0] (page 57): "Ontario has committed to install about 2,500 MW of solar capacity in both its 2010 and 2013 Long Term Energy Plan.70 However, Ontario also has committed to install 7,500 MW of wind generation. These two commitments combined create a serious energy management problem for Ontario’s power system engineers and operators."

And they have hydroelectric, nuclear, and natural-gas power plants, and they sell electricity to neighboring states to compensate for the intermittent power supply.

[0]: https://www.ospe.on.ca/downloads/March-2016-Research-Report

Confining the plasma is easier in a larger reactor, so you might be better off building one big reactor instead of 20 small ones.

If confinement is easier in a large machine, why are not very large machines being built?

If this is economically infeasible, then again, building piecemeal may be better, in the same way that taking out a loan and paying interest on it might be better if you can't secure the upfront sum anyway.

If confinement is easier in a large machine, why are not very large machines being built?

Money and politics. You might want to take a look at the original logo of the ITER project: https://www.iter.org/img/resize-900-90/www/content/com/Lists...

The Soviet Union collapsed, and the US pulled out of the project in 2000, rejoining only in 2006. This necessitated a down-sizing of the original design.

Cost. Building huge magnets and a huge vacuum vessel is simply expensive.

Fusion scales very well with two parameters: size and magnetic field strength.

Size works great, but it's just too expensive. Even if you could justify the costs in theory, ITER is demonstration that the political will to invest that much capital with such a long payoff time just isn't there.

Magnetic field strength is the new hope of the fusion industry: 'Cheap' high temperature superconductors have been commercialized and could make smaller fusion designs practical and economical.

This is kind of similar to how we use rockets instead of Lofstrom loops. The cost per kilowatt-hour of One Giant Fusion Reactor may be infinitesimal, but if you can't actually realize lower unit costs without a huge upfront investment, and there's no way to incrementally get some of the benefit for only some of the cost, it's a very hard sell.

Rockets are an easy sell because you can design a rocket that launches from Holland and lands on London and you can find someone to fund that. Later, based on that track record, you can design a rocket that launches from Florida and reaches low earth orbit, and you can find someone to fund that. After you pull that off, you can make a rocket that launches from Florida and reaches the moon, and you can find someone to fund that based on your track record.

Conversely, if you wanted to build a Lofstrom loop, you would have to tell someone, "hey, there's no track record of something like this working, but give me tens of billions of dollars and you'll make it back by not blowing as much money on rockets". Nobody's gonna pay for that. And if you start building it piecemeal, you're still not going to get anything to space, and then people will point and laugh and talk about how Lofstrom loops are a stupid technology that will never work out because they're always another 20 years away, so you never get the necessary funding to actually finish the thing, which means people will just continue to point and laugh, and in allegorical form, this is the history of fusion power.

Please have a look at how frigging huge ITER is. There is no other way to put it. This thing has its own campus. It is also delayed because it is the first of its kind, underfunded, and with some mismanagement on top. But

> If confinement is easier in a large machine, why are not very large machines being built?

They are. ITER was designed to be as large as the engineering allows. But that is Expensive.

The groups at MIT are trying to take advantage of the fact that newer superconductors allow you to get to ITER levels with smaller designs. However, those are Not Cheap either.

Not cheap, but about an order of magnitude cheaper than ITER.

"If confinement is easier in a large machine, why are not very large machines being built? "

That's the plan. But first you have to figure out a lot of details in smaller machines.

This doesn’t change the fact that a single fusion pulse (all existing reactors are pulsed) currently consumes more energy than it produces.

I remember hearing about break-even events for a few years. E.g. from 2014: https://www.nature.com/news/laser-fusion-experiment-extracts...

A design that produces e.g. 2x the energy it consumes for the startup could be viable, even if pulsed (running minutes, not months).

IIRC they were able to get net energy from the fuel versus the energy of the x-rays used to ignite it. But not in terms of the energy of the lasers used to create the x-rays by vaporizing the fuel pellet.

The idea of having many small fusion reactors is quite interesting. I never thought about it and I would normally dismiss it, since building a big one is cheaper than building many little ones. However, SpaceX and their 9-engine Falcon rocket teach us a different story. Sometimes building many smaller yet complicated units is cheaper than building fewer, bigger ones.

Would be great to know more about the economics behind a multi reactor power plant.

The real issue is the surface to volume ratio of the contained plasma. This ratio needs to be as small as possible for the containment to be effective (reduced energy losses through the surface area). So you want your reactors to be as big as possible.

From an engineering point of view, though, you want the reactor to be as small as possible. That's because output is limited by the power/area that the first wall can withstand. So, the volumetric power density will be inversely proportional to the linear dimensions of the reactor.
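The tension between these two comments is just the square-cube law, which is easy to sketch numerically (the cube geometry here is purely illustrative; a torus differs only by constant factors):

```python
# Square-cube law: scale a reactor's linear dimension L and watch the
# two figures of merit pull in opposite directions.

def surface_to_volume(L):
    """Surface/volume ratio scales as 1/L (units: 1/m).
    Lower is better for confinement (less relative surface loss)."""
    return 6.0 / L  # exact for a cube of side L; other shapes differ by a constant

def power_density(L, wall_load=1.0):
    """If output is capped by wall loading (MW/m^2), total power goes as
    L^2 while volume goes as L^3, so power density falls as 1/L."""
    area, volume = 6.0 * L**2, L**3   # cube of side L, for illustration
    return wall_load * area / volume  # MW/m^3

for L in (1.0, 2.0, 10.0):
    print(f"L={L:5.1f} m  S/V={surface_to_volume(L):.2f} /m  "
          f"P/V={power_density(L):.2f} MW/m^3")
```

Doubling L halves both numbers: confinement gets easier, but volumetric power density drops with it, which is the engineering complaint above.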

>BTW the waste heat could be directly reused

Yes, for district heating.

Not really. For power generation, as in a coal or nuclear fusion or fission power plant, you run a multi-stage steam turbine cycle to get down to as low a temperature as possible after the final LP turbine. Because that maximizes the Carnot efficiency of the (idealised) Rankine cycle, it also maximizes real world efficiency.

In practice, going down to ~40 C (~100 F) is not uncommon. And then the waste heat is basically unusable, even for district heating.

OTOH, if you need lots of heat (maybe you have a chemical plant or something nearby), you can design your power plant for cogeneration of heat and power, yielding higher total system efficiency. But you don't do that "just" for district heating.
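The efficiency penalty for rejecting heat at a useful temperature can be checked against the ideal Carnot bound (the 550 C steam temperature below is an illustrative assumption, not plant data):

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Ideal Carnot efficiency between two temperatures given in Celsius."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15  # convert to kelvin
    return 1.0 - t_cold / t_hot

# Condensing at ~40 C vs. rejecting heat at ~100 C (hot enough for
# district heating), with steam at an assumed ~550 C:
print(f"condense at  40 C: {carnot_efficiency(550, 40):.1%}")
print(f"reject  at  100 C: {carnot_efficiency(550, 100):.1%}")
```

The ideal bound drops by several percentage points when you reject heat hot enough to be useful, which is why cogeneration is a deliberate design decision rather than a free lunch.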

You can (and at least one nuclear plant does!) run greenhouses off 40-celsius air. Maybe some tropical fish aquaculture.

But no, nothing at infrastructure scale.

What is it with physicists and molten salt?

Every battery-tech, pro-nuclear, fusion related news have to have somebody chiming in "molten salt".

Just an observation.

Molten salts are hot!

SCNR. Seriously though, molten salts are great heat transfer media, a bit like water, just at higher temperature and still low pressure. They are also completely immune to radiation damage, which is a big deal if you have 14MeV neutrons flying around and something has to stop them. Chemistry in molten salts is also interesting. Things can be extracted (say, into a molten metal phase) or precipitated, and some reactions just happen at 600 degrees while they need platinum and palladium catalysts at room temperature. What's not to love?

Can you teach me more about the radiation properties of molten salts? Didn't know molten salt is immune to radiation; why would that be? That's cool, though.

Because they (or, at least, the ones being discussed) have no covalent chemical bonds or other static arrangements of atoms that can be disrupted. The ions are all just single atoms with more or fewer electrons than the neutral versions of those elements. The atoms can be ionized by radiation, but the resulting weird oxidation states just fall back to chemical equilibrium as the electrons redistribute.

Molten salt has a high heat capacity and a higher boiling point than water, so you can get higher temperature without worrying about steam.

No pressure vessel required. Meaning no explosion possible. Intrinsic safety by design.

The confinement building of a light water reactor is large because it has to contain large volumes of steam in an accident.

In a molten salt reactor, there is very little volatile material inside the containment building. The salt itself does not have high vapor pressure, even in accident conditions. As a result, the size (and cost) of the containment building can be radically reduced. Moltex's design, for example, reduces the cost (per unit of power output) of the containment building by a factor of 5 vs. LWRs.

With fusion bigger is better.

Why is this upvoted? It shows a severe lack of physics knowledge.

The whole goal of fusion is to get the energy amplification, Q, to be > 1, which requires the triple product n T t to be above some threshold. n is density, T is temperature, t is confinement duration. This is the Lawson criterion; you should look it up to educate yourself.

Your proposal doesn't help n, T, or t, hence doesn't help Q.

Also, there are economic and physical gains from concentrating your efforts into making your fusion reactor larger, rather than trying to make it 1/10 the size and have 10x of them.

The economies of scale will happen once fusion ignition (Q>1) is achieved and we start mass producing dozens of fusion plants around the world. But at this point we're likely 10+ years away from achieving ignition.
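The triple product test described above can be sketched in a few lines (the D-T threshold below is the commonly quoted ballpark figure, used here as an assumption, and the K-to-keV conversion is ~11.6 million K per keV):

```python
# Lawson criterion sketch for D-T fusion: ignition roughly requires
# n * T * tau above ~3e21 keV*s/m^3 (commonly quoted ballpark).

DT_THRESHOLD = 3e21  # keV * s / m^3, assumed ballpark for D-T ignition

def triple_product(n, T_keV, tau):
    """n: density [1/m^3], T_keV: ion temperature [keV], tau: confinement [s]."""
    return n * T_keV * tau

def ignites(n, T_keV, tau):
    return triple_product(n, T_keV, tau) > DT_THRESHOLD

# Numbers loosely inspired by the article: n = 2e20 /m^3 and T = 20
# million K (~1.7 keV). Even granting a generous 1 s confinement time,
# the product falls well short of the threshold:
tp = triple_product(2e20, 1.7, 1.0)
print(f"triple product = {tp:.2e} keV*s/m^3, ignition: {ignites(2e20, 1.7, 1.0)}")
```

This also shows why "20 million of 100 million degrees" is not "20% of the way there": all three factors multiply, so a shortfall in temperature cannot be read off as a linear fraction of progress.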

DT fusion reactors have inherent diseconomies of scale. That's because their power output is limited by what the first wall can withstand. By the square-cube law, the volumetric power density of a DT reactor must go down as its linear dimensions increase.

This is not just quibbling, since bad power density is a huge problem with fusion.

Look at ITER. The volumetric power density (gross fusion power/volume of reactor proper, not including the building or auxiliary equipment) is 0.05 MW/m^3. Compare that to a PWR primary reactor vessel, which has a power density around 20 MW/m^3. The smaller designs like ARC or Lockheed's concept have power density around 0.5 MW/m^3.
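Taking the two figures quoted above at face value, the implied difference in machine size for equal output is easy to work out:

```python
# A fission PWR vessel runs ~20 MW/m^3; ITER's figure is ~0.05 MW/m^3
# (numbers from the comment above). For equal output, required volume
# scales inversely with power density:
pwr_density, iter_density = 20.0, 0.05   # MW/m^3

volume_ratio = pwr_density / iter_density   # 400x the volume for the same power
linear_ratio = volume_ratio ** (1 / 3)      # ~7.4x every linear dimension

print(f"volume ratio: {volume_ratio:.0f}x, linear ratio: {linear_ratio:.1f}x")
```

A machine 400x the volume (roughly 7x larger in every dimension) for the same gross power is the "bad power density" problem in concrete terms.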

I love these guys, steadily working through the issues, making steady progress. I think with a slightly different diverter design they could claim to have the worlds first plasma cannon :-).

Pretty insanely complex machine, and really awesome results. Congrats!

This is what I find impressive as well.

They have this absurd shape, because the algorithm says it works, and they say it contains plasma. It does.

They say they will test this to hold 10 - 12 seconds of plasma. It does.

They say they will work on this and expand this to hold 30 seconds of plasma. It does.

This just feels like really solid engineering or practical science. Those 30 minutes will happen without much delay.

Does anyone have any idea how they take pictures of the plasma?

That seems quite hard considering the temperatures and the strong magnetic field.

Good question - I worked at the ASDEX Upgrade (AUG), also operated by the IPP but a tokamak instead of a stellarator: At AUG similar images and videos [0] are recorded by IR cameras. These cameras require strong cooling and a magnetic / electric shielding as described in [1]. Per plasma discharge (~10s) around ~8GB of video data is recorded. There are also fast frame cameras for dust tracking [2] and similar things. I assume the same techniques are applied at Wendelstein.

[0] https://www-ncbi-nlm-nih-gov.eaccess.ub.tum.de/pubmed/266281... [1] https://www.youtube.com/watch?v=QCK51vqWunU [2] http://iopscience.iop.org/article/10.1088/1741-4326/aa4e56

Wow...that's pretty cool!

I operate a fast camera at W7-X. It's a normal high-speed camera that records light in the humanly visible range, and the plasma emits in that range and reflects off the walls. IR cameras are most often used to gauge the temperatures of vessel wall components.

Our camera looks through a pinhole in the vessel wall, but it sits a few meters away from the machine and gets that view through a bundle of optical fibers. There wouldn't be enough space to place the camera right at the pinhole because of the magnets and their cooling systems, and the magnetic fields would be pretty high. The camera needs to be shielded from the fields for its electronics to work properly, and the shielding box perturbs the magnets' field, so moving the camera far away is a good idea. We don't worry about neutrons, because W7-X plasmas are fueled with stable helium and hydrogen (no deuterium or tritium so far, mainly due to onerous nuclear regulations in Germany), and these fuels don't produce many neutrons at all.

That's just awesome, thanks for the explanation!

As a side note, you gotta love HN... where you ask a question about some obscure thing and often get an answer straight from the source.

> Our camera looks through a pinhole in the vessel wall, but it sits a few meters away from the machine and gets that view through a bundle of optical fibers.

Wait, so does this mean that you have a camera obscura with an array of optical fibres at its back, and then you have an ordinary CCD camera imaging the other end of the fibre array?!?!

I guess the word pinhole is misleading here, I think it's a few cm in diameter. Behind it there's an array of lenses that projects the view onto a fiber bundle, then a lens at the other end of the bundle projects that view out. That light goes through a beam splitter which shares it between our camera and another one in the shielding box.

Most imaging in fusion is done like this because of space constraints, magnetic fields, and neutron fluxes.

There is a video available here as well:


Those videos show surprising 'hotspot' lines - I would expect the plasma to be far more evenly distributed.

Because really hot plasma doesn't emit in the visible range, it's all x-ray, so what you are seeing is its (relatively) cool edge. It might have colder regions, but that doesn't necessarily mean that the main volume is unstable.

I can imagine that this is just the neutron flux picture. The neutron streams should pass through the walls pretty well (unfortunately), so registering them should not be hard.

I doubt that. High energy neutrons are difficult to diffract (so no lenses), reflect more like light on fog than light on tin foil (so no mirrors) and pass through material too easily for a pinhole lens to be plausible.

I think you have to go to quite low energies (per neutron, not flux) for neutron optics to be a thing. Neutron cameras use collimated neutron sources to get around this, but that option isn’t available here: https://en.m.wikipedia.org/wiki/Neutron_imaging#Neutron_came...

You don't need to diffract the neutron flux. Instead, you can wrap (low-res) neutron-sensitive material around the stellarator tube. IDK how well a conventional photodiode or CCD would work under a high magnetic field, though.

That would just record the total neutron flux. It won't form an image unless you have a lens, aperture, or something similar.

It's almost as if you posit that the flux through every piece of surface is exactly the same. I suspect it's not.

No, total flux at a given location, not total flux in the whole device. The point is that it doesn't identify which direction the neutrons are coming from. That's why lens or apertures are needed.

I am obviously very uninformed about this, but in the picture I'm referring to [0] you can see the walls in pretty good detail, so I assumed it was an optical picture.

No idea if a neutron flux picture would provide these details or not.

[0] https://www.ipp.mpg.de/4550362/original-1543230147.jpg?t=eyJ...

I thought the main problem with nuclear fusion is neutron capture transmuting the structural materials until they are no longer fit for purpose.

I'm quite keen on aneutronic fusion: a proton + boron-11, which fuse into three alpha particles that can lose their energy to magnetic fields, generating electricity directly, with no need for any Victorian steam engines.

The huge downside is that it works at much, much higher temperatures. But if you can't solve the neutron capture issue...

> The result was high plasma densities of up to 2 x 10^20 particles per cubic meter – values that are sufficient for a future power station. At the same time, the ions and electrons of the hydrogen plasma reached an impressive temperature of 20 million degrees Celsius.

> Since the fusion fire only ignites at temperatures of over 100 million degrees, the fuel [...]

Would it be accurate to say that this gets us 20% of the way there, or is that overly simplistic?

Unfortunately overly simplistic. The challenge with higher plasma temperatures is that the internal shielding needs to be better than the current graphite tiles. They will upgrade those next and go with more complicated water cooled carbon fiber strengthened carbon tiles.

Note that Wendelstein 7x will never operate to produce actual fusion with DT fuel, because that is (a) outside its mission goal, which is to research plasma behaviour at conditions close to what is needed in a power plant, and (b) dealing with the neutron bombardment creates all sorts of complexities in terms of the blanket. Not really solved here either, as Wendelstein 7x is a nightmare to disassemble and upgrade. For a power plant, maintenance and serviceability need to be built into the design. And (c) they don't have the government permission to deal with nuclear material and nuclear waste.

So no, we are not 20% there. But we have proven that (a) the Stellarator actually has wings, and (b) plasma physics so far behaves mostly as predicted. Especially the latter is something to be celebrated, because plasma is nasty in terms of physical properties, really complex to model, etc.

I found this panorama shot on their home page: https://www2.ipp.mpg.de/ippcms/eng/externe_daten_en/panorama...

In the words of Peter Griffin, why aren’t we funding this more?

The name is really scifi.

I love it!

Awesome progress, but I feel like the stellarator design won't be scalable/feasible :/

"Although Wendelstein 7-X is not designed to generate energy, the device is intended to prove that stellarators are suitable for use in power stations. With Wendelstein 7-X the intention is to achieve for the first time in a stellarator the quality of confinement afforded by competing devices of the tokamak type."

This was at the very end.

While competition is certainly a positive, this doesn't sound like they're interested. Therefore, given the importance of fusion (read: there's a massive immediate need on the order of saving the planet), shouldn't the time and effort being put into stellarators be devoted to something that's important in the immediate?

I would say the opposite. Achieving fusion power is so critical to the long term sustainability of human technological civilization that we cannot afford to put all of our eggs in one basket, and hope that we picked the right one.

Fusion is only necessary for long-term sustainability if you're talking about interstellar travel (and even then, fission may be fine).

There's a real physical ceiling to how much power can be consumed on Earth's surface, regardless of its source.

Fusion power is not critical to long term sustainability. There are perfectly adequate, even superior, alternatives. Fusion has inherent problems that, in my strong opinion, are going to render it uncompetitive against these alternatives.

Fair enough. And I agree. However, to be clear, that's __not__ what this article says. Per the quote I pulled.

I think you're misinterpreting the quote, and that's probably why you were downvoted.

Nobody knows if it's easier to build a commercial fusion power plant with the tokamak design or with the stellarator design or with another of less known competing designs or with another design that nobody has imagined yet.

Moreover, perhaps some of these designs (including the tokamak and stellarator) are nice for a lab demo or a model plant that doesn't break even or a tiny scale plant, but they may not scale to a full power plant that can produce energy at a useful level. (That's why this is called "research": you don't know if it will work.)

So they are following the usual path of allocating some money to a few competing groups with different ideas and hoping that one of them will get something useful. The problem is how to distribute the money between the alternative approaches, and the method is a mix of analysis of the proposals, the intermediate results, grant-baiting, and politics.

Agreed — the problem is that, from an energy perspective, it "doesn't break even".

Zach Hartwig of MIT has an excellent video [0] on how to evaluate announcements of nuclear fusion advances. The big problem is breaking even. There are other problems, too: plasma containment, etc.

[0] https://www.youtube.com/watch?v=L0KuAx1COEk

Neither tokamaks nor stellarators appear promising for reaching commercial viability. Both are very complex, very expensive, and will have very poor power density compared to fission reactors (making them much larger for a given power output). It's difficult to see how they could approach the levelized cost of energy that solar and wind have already achieved, never mind the costs of these decades hence.

60 years ago, when nobody knew anything about fusion (so not a very different situation from today...), the stellarator was the favored design. The advantages are obvious on paper (which is all they had back then): continuous operation, more stable plasma. But then the Russians built a small tokamak and it worked surprisingly well. So the whole world went

"given the importance of fusion, shouldn't the time and effort being put into stellarators be devoted to something that's important in the immediate?"

and started working on tokamaks. 60 years later, tokamaks still haven't been turned into power plants. Things were harder than imagined, and much more expensive, too. It doesn't look as if ITER will ever yield something practical. Maybe the MIT guys can make the Arc Reactor a reality; but all they have right now is... exactly, paper.

(The whole world? No! A tiny village outside Munich built the Wendelstein series of stellarators...)

Wendelstein is clearly targeted at research leading to fusion. It is just that the Wendelstein 7-X experiment is purely limited to researching plasma containment. Once this experiment is concluded successfully - and all the results are very encouraging - implementing actual fusion would be the purpose of a successor to 7-X.

There's still ways to go. If they manage the same confinement as a tokamak they (hope to) have the advantage of continuous operation.

Why is this not important? Tokamaks are "better" right now mostly just because they have had a lot more time and energy invested in them. We need stellarator experiments to see if stellarators like this one might be valid.

> something that's important in the immediate

That's not easy to find in fusion research.

I am not an expert, but from what I have read there are still a lot of unknowns in plasma physics and fusion, so it makes sense to tackle fusion from a lot of different angles and not pick a winner already. In the end, what's needed is more money for research. It's a little sad that people are complaining about the high cost of ITER while spending multiples of that on the military. Fusion may be a big contributor to peace in the future if it reduces the need to fight over access to oil.

Given the importance of transportation for society, shouldn't this effort going into "automobiles" go towards better wooden wheels for horse-drawn carriages?

You have not adequately argued that a Depth-First Search is superior to a Breadth-First Search here (or that the balance here is wrong). The value you describe is the goal value, which does not influence the search strategy very much.

From my classes some years ago IIRC the main benefit of the Stellarator was the lack of a plasma current which is no longer needed for plasma heating due to advances in neutral beam injection technologies.

The plasma current seems to create some instabilities so we may be able to achieve better confinement with a device that doesn't have one.

I would also agree with the other comments that we can't afford to put all our eggs into one basket - we spend a pitifully small amount on fusion research.

The promise of stellarators is continuous operation, whereas tokamaks traditionally rely on inductive pulses for the two-fold purpose of confinement and heating.

The big advantage of stellarators vs. tokamaks is their lack of disruptions. Tokamak disruptions are a very serious problem that, if not well-solved, promise to put tokamaks into the technology garbage can. A disruption can, in the worst case, generate runaway relativistic electron beams that can melt holes through the first wall.

The first and primary goal of ITER is to show that the disruption problem can be and has been solved. If they don't do this, ITER will never be allowed to operate with a DT plasma. And they are still scrambling to solve the problem.

Are we talking about ELMs? Isn't it a basically solved problem then? I was under the impression that between KSTAR's active field disruption coils and pellet injection they are more or less controlled (and we'll have confirmation from MAST Upgrade quite soon). Or am I wrong?

No, disruptions are catastrophic events where confinement is lost very rapidly. As the plasma cools an electric field builds up from the decaying current, and this field can accelerate runaway electrons in narrow beams to relativistic energies. Worst case >70% of the stored magnetic energy from the current in the plasma gets dumped into these electrons. If they hit the wall they can explode a hole in it.


Ah, I see. It looks like they did something about it, though: https://www.iter.org/newsline/-/3183 Are you skeptical that that can work?

Some disruptions occur almost instantly. The system will have very little time to react. And how are they going to train or test it? A single unmitigated disruption will damage the reactor.
