What if we collectively had the will to invest this much into fusion?
Personally, I think one of the answers to "tech seems to have stalled" is that it's easy to underestimate how important cheap computational power is, even without "AI", to moving forward. For another example, as fantastic as the 1960s space projects may have been, I think they just weren't economically sustainable, so it isn't that amazing that they didn't become a self-sustaining industry right away. I think it's easy to look at SpaceX and say "Gee, there's nothing that we couldn't have done there fifty years ago", but again, I suspect you miss just how much of the Falcon rocket is a result of extensive cheap computation abilities. Even if you could use modern computers to produce a design that could have worked fifty years ago, there's still not necessarily a practical path to that design using only tech from fifty years ago. (And of course the Falcon is full of stuff that couldn't exist fifty years ago.)
Modern tokamak designs are of course run through all kinds of computations nowadays too (because everything is), but the stellarator design, in a deep and fundamental way, simply isn't possible without massive computational power, whereas we've been building tokamaks since before massive cheap computational power.
Fusion would be similar.
Didn't people see the value that cheaper/more compute power would give them back then?
It is called a paradox because seeing it in action usually catches people by surprise. And this is true whether you're talking about the consumption of coal to power factories, or the consumption of electricity to power computing (and in each case the myriad of new uses that efficiency promoted).
In Science Fiction over and over again the trope was of a giant computer that acted like an oracle. You see that in Asimov's work, in Heinlein's The Moon is a Harsh Mistress and so on. Basically nobody anticipated ubiquitous computing. For instance in the Foundation series you see that a computer run by the Second Foundation can predict the future course of history...and people are calculating their courses with slide rules.
As for the killer app of computing, email, I'm only aware of one pre-1970 work correctly anticipating what it would be actually like. (James H. Schmitz has a memorable scene in one of his Telzy stories where she catches up on her messages at a terminal. He doesn't say "email", but the scene is notable for unobtrusively getting it right.)
Now compare with Franchise, Jokester, All the Troubles of the World, The Last Question and many more stories that feature as a plot device a very powerful centralized computer named Multivac. (See https://en.wikipedia.org/wiki/Multivac for a more complete list.)
How the same mind could have come up with the idea of androids with positron brains following the three laws of robotics, and yet missed pocket calculators baffles me.
In many ways we're doing that exact thing. The computer at my desk is powerful but somewhat useless; the grand oracle in the form of Google et al. does the real heavy lifting of giving me useful information. Whether it's a single computer or a bunch of separate ones acting as a whole is pretty immaterial in that sense.
For example, in 1964, the ATLAS computer went fully online in Manchester, England. It was the most powerful computer in the world, took up a floor of a university, and the word “supercomputer” was invented to describe it.
The requirements for the Apollo Guidance Computer were to make something with those approximate specs, but take up only 24×12.5×6.5 inches (61×32×17 cm), use 55 watts of power and be ready to fly in 1967. It was a crazy, impossible task.
I’m convinced the AGC was the biggest computing advance after the move to general-purpose computers.
"Didn't people see the value that cheaper/more electric power would give them back then?"
It was just an enormous challenge to build this thing. Things like building and validating the field coils required a lot of effort and some novel approaches that they had to come up with along the way.
Source for that?
Looking at the German Wikipedia entry, the HELIAS method was invented in the late 80s. I don't see anything about the 7-X design actually having been computed back then. In fact, even the 2002 experiment Wendelstein 7-AS wasn't fully optimised.
Also, Tokamaks were invented in the 1950s. JET started operating in 1983. ITER was proposed in 1987. Given even just a little bit of path-dependency, there really wasn't any realistic overlap.
This paper, published in '92, mentions that the 7-X was being designed at that time. I remember having seen images of the proposed field coil configuration in '91 or '92. So I might have been off by a few years. But the plans must have been complete for the project start in 1994.
I do research at W7-X, so I can confirm that to the best of my knowledge, the design was computed a long time ago. Building the actual device took a while.
Would you happen to know what kind of compute power was required? Was it run on a colleague's PC or workstation or did it require supercomputer time?
By the way, the unusual shapes of the coils can be understood intuitively from this picture: https://imgur.com/a/Bq3ABfQ. A plasma needs to be confined with a magnetic field in order to be heated to extreme temperatures, and a toroidal field (produced by the currents in the red coils) is unstable due to particle orbit drifts. You need to add a twist to the field for it to be stable (using the green coils). But if you unroll the surface of the torus, you can approximate the currents in both green and red coils using the discrete blue coils, and they're easier to build.
Alas, it "just" seems to talk about extending the coil design for a reactor, not the rest. I am guessing most of that would be similar to other fusion reactors; see for example MIT's Pathway to Fusion Energy:
Though I don't really understand the significance of the "pebbles" in this system.
> the stellarator design, in a deep and fundamental way, simply isn't possible without massive computational power, whereas we've been building tokamaks since before massive cheap computational power.
does not square with Wikipedia:
> Stellarator ... The first Model A started operation in 1953 ...
It wasn't the computational revolution that made SpaceX possible; the basic technology is more or less the same as it was in the 60s. The Merlin engine has its roots in the '90s NASA Fastrac design, evolved for performance and reusability.
SpaceX's success relates to things like market incentives, lean business practices, a fail early and iterate quickly attitude as opposed to "too big to fail" public projects, and so on.
I refer to Friction Stir Welding, the wonderful technique that welds aluminum alloys by plastic deformation below their actual melting points.
The landing algorithms for the F9 first stage make use of advances in optimization algorithms that weren't available 40 years ago as well (as well as exploiting faster processors).
I think it's also possible that technological progress happens in fits and starts instead of at a continuous or even a continuously changing rate. Often there's an individual discovery that leads to a spurt of innovation that consists mostly of applying that singular discovery to different problems.
* The late 19th and early 20th century saw very fast innovation in vaccines and antibiotics, because the discovery of effective vaccines and antibiotics was general enough to solve entire classes of problems all at once. It seems like a lot of innovations all at once when you manage to find extremely effective treatments against polio, smallpox, measles, syphilis, chlamydia, typhus, typhoid, tuberculosis, etc., etc., etc., but really that was just from a couple of extremely general inventions playing themselves out.
* Steam engines led to large-scale mechanization while electrical power led to small-scale mechanization in two relatively fast and concentrated chunks. In both cases, the invention of a general technique for powering machines led to lots and lots and lots of machines.
* Marconi's wireless telegraph was invented in 1896, leading to different varieties of radio, television, and radar.
* Airplanes might be another example. I don't want to dismiss the huge difference between a Wright Flyer and a Boeing 737, but "make the airplane out of aluminum" and "have a tube that the crew and passengers can be inside of" are both pretty obvious improvements once you've figured out large-scale manufacturing and the Bayer process, both of which we had ahead of time. I guess jet engines were another breakthrough, although we kind of had them already and they're basically just rockets except they consume oxidizer from the air instead of carrying their own.
* Rockets. There were less than three decades between the V-2 and the moon because it turns out "building a big rocket" is the hard part. OK, there are a lot of other hard parts to landing on the moon, but they're hard in the way that they're achievable by any sufficiently well-funded and motivated group of people with rockets, but nearly impossible without rockets. As you point out, Falcon 9 is significantly more advanced in certain ways than Saturn V, but the low-hanging fruit was reaped by Wernher von Braun.
* Computers: just like the Saturn V is basically a very, very, very large V-2 and so is the Falcon 9, we're basically using very, very, very large (logically large; physically small) versions of the computers we had in the 1970's. The fundamental discoveries in computing are all basically done; the rest is just playing itself out. The reason it seemed to take longer to play itself out is because computers are a self-compounding invention: you can use computers to design computers, and you can use better computers to design better computers. Also, having computers makes a lot of other things easier, so you get a lot of innovation from that, too.
Fusion, promisingly enough, also seems to be one of those technologies that will just unlock a bunch of really, really powerful innovations, seemingly all at once, because of how many problems we could solve by just throwing lots and lots and lots of energy at them. (Some discussion here: https://www.youtube.com/watch?v=8Pmgr6FtYcY . Also, the excellent book "Sustainable Energy - without the hot air" develops rough upper bounds on the amount of sustainable energy that the UK could produce and the amount of energy the UK consumes, ultimately leading to the bar graph Figure 18.1 here: https://www.withouthotair.com/c18/page_103.shtml. Later discussion of nuclear fusion places the same bar graph, to scale, next to the sustainable level of fusion power production on Figure 24.17 here: https://www.withouthotair.com/c24/page_173.shtml.)
At the same time, it's amazing how computing is helping every single field there. Even if the core discoveries are already made, the efficiency gains in the recent decades are tremendous:
* Classic windmills vs modern aerogenerators
* Analog transmissions vs highly multiplexed, high bandwidth digital transmissions
* First modern passenger planes vs more aerodynamic, more efficient planes
* Straight jet engines vs turbofans
* Saturn V vs Falcon 9 & Falcon Heavy
* Mainframes vs modern computing devices
We live in an era of refinement, where the old discoveries are being improved in ways that enable new uses that were in the realm of science-fiction not so long ago.
You should do a Netflix series, Phil!
Plus: GPS, Combustion Engine, The Web / Networking, Plastics/Chem revolution, and maybe soon AI?
Edit: found Wikipedia link: https://en.wikipedia.org/wiki/Connections_(TV_series)
The conversation came up because I was trying to fathom why a company would sit on billions of dollars when they could make more billions with it. This idea, that the future will have sudden insights that will take tremendous capital to exploit to create unassailable leadership positions, is not one I had considered. Fusion power, space opportunities, Etc, might be areas where things suddenly create an entirely new need.
But like everything Apple these days, they are just slow. Their datacenter expansion already started late, and with all the setbacks they are now much further behind. The same goes for their solar energy projects, which are tied to that datacenter expansion. And instead of spending money on their CDN (which they finally got around to doing), or building a worldwide WiFi network (so iPhone users could enjoy free WiFi access like in an Apple Store, but in far more places), they decided to spend BILLIONS making TV series and drama.
They already are, in the cheapest, riskless way possible: those massive cash stashes are sustaining, or even boosting, their valuations.
When a company grows too big, it's in its best interest, as in its investors' best interest, not to spend a few billion on an initiative that could lead nowhere.
Consider SpaceX: it's a growing business that took, what, 10 billion in capital? That is below the 'material' threshold for some of these companies holding 100 to 300 billion dollars in cash equivalents.
Wouldn't that be too late? Or, to put it this way: it is already a problem of capital. If you have enough, you can explore all the technology tree branches until you find the one that leads you there, even in parallel.
If there were a global carbon tax, it would give fusion an advantage against fossil fuels, but that would also be to the advantage of fission and renewables. In the absence of a global carbon tax, is fusion going to be cheap enough to displace fossil fueled power? We've had the technology for emissions-free electricity for decades, but it's difficult to profitably compete with fossil-based generators enjoying unpriced externalities.
But what about a commercial "DARPA" ?
I'm guessing that the innovations created by DARPA are a very good ROI in the value sense. If not, what is?
But like Xerox PARC, the fear is that extracting money from that won't go well.
So one solution is creating a monopoly. But that's not ideal.
But what about some sort of insurance? How would history look if Xerox PARC was guaranteed to at least break even?
>Bell Laboratories was, and is, regarded by many as the premier research facility of its type, developing a wide range of revolutionary technologies, including radio astronomy, the transistor, the laser, information theory, the operating system Unix, the programming languages C and C++, solar cells, the CCD, floating-gate MOSFET, and a whole host of optical, wireless and wired communications technologies and systems. Nine Nobel Prizes have been awarded for work completed at Bell Laboratories.
Always the question of money and focus, though. IBM has a huge research arm which is doing amazing things, but their impact on the bottom line is perhaps not as strong as Ginni (IBM's CEO) would like? At Sun, when Sun Labs was a thing, there were jokes about it being "where good ideas go to die," or something along those lines. Commercial interests have a really hard time seeing value in these things when they are done internally.
And for a company that's not a startup, Lockheed Martin has a fusion project.
Companies get a quarter trillion in cash by doing something really well that others aren't doing. Nuclear fusion powered perpetual motion machines certainly fit the bill...
It does. You jest, but competition is fierce and little things matter; also, sometimes making even a small improvement requires a technological breakthrough.
Some other numbers for comparison:
NASA's current ANNUAL budget is $19B. 
The BP oil spill cost $40B. 
"A 2016 IMF study estimated that global fossil fuel subsidies were $5.3 trillion in 2015, which represents 6.5% of global GDP." 
The US alone spent $600B on fossil fuel subsidies in 2015. 
>It's not obvious that throwing more money at the problem is going to change this.
This is true, but we haven't funded it enough. The potential upside is insane. We aren't funding research accordingly.
> This is true, but we haven't funded it enough. The potential upside is insane. We aren't funding research accordingly.
How much money do you propose we throw at it, then? Even fission isn't profitable these days, and that is something we know how to do. While I agree that it is worth pursuing, wind/solar + batteries + a smart grid are a way better investment these days, since we cannot even produce a fusion reactor that is net positive in energy, let alone economical.
Now, it is true that there's still fundamental research to be done until energy production via nuclear fusion becomes a reality, but we're reasonably confident that we could make the conventional approaches work.
However, that research is Big Science, and there's no political will to fund it properly (the graph that people like to cite is https://commons.wikimedia.org/wiki/File:U.S._historical_fusi... ). ITER suffered from this as well, and W7-X basically only exists because German reunification happened, and the German government was looking for a big science project - any project - they could leverage to funnel money into East Germany. The plans for W7-X just happened to be ready at the right time.
Not all of it. The big science approach is a function of "pure" fusion research done from the assumption that containment/ignition only comes from magnetic fields and particle collisions. Some are approaching it from different directions, even using physical forces to trigger ignition (i.e. slamming the hydrogen with a big hammer). This isn't crazy stuff, just a more practical approach from people who see the problems from a different perspective.
"At the centre will be a sphere, three metres in diameter, inside which molten lead swirls at high speed creating a vacuum, or vortex, in the middle. Arrayed around it will be 200 to 300 pistons, each the size of a cannon. Firing in perfect harmony, they will create an acoustic wave that collapses the vortex at the very moment a plasma injector shoots hydrogen isotopes, the nuclear fuel, into it. If General Fusion has its physics right, the heat and pressure will ignite a fusion reaction that spins off countless neutrons which will heat the lead even more. Pumped through a heat exchanger, that hot lead will help generate steam just like a conventional thermal power plant."
Confinement in the spheromak was also unacceptable; fast electrons were lost too easily from the edge, causing too much cooling.
Instead, the new scheme compresses more slowly, and compresses a spherical tokamak plasma. This scheme will have a solid post running down the center of the plasma, and the metal will implode from the sides at lower speed. The burn will take about 1 millisecond.
I am skeptical of this approach, since the center post will be exposed to extreme conditions (average neutron fluence two order of magnitude higher than in conventional fusion reactor designs) without a thick layer of liquid metal to shield it. The non-acoustic compression scheme also means the reactor vessel will have to withstand extreme pressure.
That's the first time I've heard that. Brilliant!
At least, that's the story I've been told by one of the current top W7-X people over a glass of beer, though the main topic at the time was that the movers and shakers behind such big projects might not see them to completion due to retirement or even death.
Some of the progress in recent years is from the development of ideas that can be tried at much smaller scales, like the ST40:
This is something that spherical tokamak research has been hammering away at for years:
See Inertial Confinement Fusion, The NIF, High-Energy-Density Physics, etc.
That's not very long. The first liquid-fueled rocket flew in 1926, and the moon landing wasn't until 1969, and even that was only possible with very high levels of funding, far beyond what's invested in fusion power today. Babbage built the Difference Engine in 1822 and conceived of the Analytical Engine in 1837, leaving skeptics dismissing the potential of computers for 150 years. Leonardo da Vinci sketched a helicopter in 1480, almost 500 years before one could actually be constructed. Gunpowder was invented probably before the year 1000 and took over 700 years after that to render pikes obsolete.
Now, it's entirely possible that there's some other limitation holding us back from fusion, just as the lack of internal combustion engines or diodes held us back from helicopters or computers. But I think we've done enough work so far that we'd be able to tell if we were running into something like that. As it stands, I think we're roughly in the place rocketry was before WWII: we have an idea of how to do it, and we don't know if it'll go to the Moon or not, but the only way to find out for sure is to try, and that's pretty expensive and might take a while.
It's interesting examining the recent efforts, public and private, in light of these venerable critiques.
In the US alone, oil subsidies are >$400bn a year.
Here’s an article: about the same amount, annualised, is spent on Halloween costumes for pets.
We'd gladly be working on fusion (or other physics stuff) but our present jobs pay much better.
More money, and better pay for hard science careers, actually would make a difference.
However, this is not necessarily true. It not only has to work, it also has to be economical and not have significant drawbacks.
I think that's very much up in the air. Compare it to fission - we know how to build those power plants, but it's just so damn hard that it's too expensive, compared to alternatives like photovoltaics and wind turbines.
And those are getting cheaper and cheaper at an incredible rate at the moment. So it's a moving target.
And yes, solar and wind rely on intermittent sources of energy, so you need to combine them with some kind of dispatchable source/storage to match the demand curve, but you have that too with a fission plant because of the high capital cost, and I don't see why it wouldn't be true for fusion too, unless somebody comes up with a way of building a really cheap fusion plant.
It is actually pretty hard to come up with a solution that's going to be safe for thousands of years.
I might be wrong, but I think all countries so far have opted for the solution of putting the stuff somewhere they can keep an eye on it, and possibly retrieve it in case somebody figures out a use for it.
Fission is already in not too different a place than that: fuel is nowhere near the dominant cost per unit of energy produced. I hope that ends up not being the case with fusion, but it's hard to predict at this point.
The issue is the design of extremely specialized components, not necessarily the raw materials (except when they're novel superconducting magnets designed for that purpose).
I think schedule delays are what is making ITER so expensive.
Things seem to be going pretty well for TAE, at least as far as plasma confinement goes. Whether they can get net power from boron is another question, of course. I definitely see them as a dark horse compared to the tokamak companies.
TAE's response was not at all strong, IMO. And that was not the only critique.
Plasma confinement is not the issue for TAE. The problem is even with perfect confinement, their non-Maxwellian scheme doesn't work.
The problem is that nobody knows how to build a functioning reactor yet or if it's even possible to build a functioning reactor. It's very possible that we could throw trillions of dollars at this and still not figure it out.
Possible, but unlikely (imo). It's just that private business doesn't really do large-scale multi-decade pure research efforts with far-off ROI, and governments have chosen to spend money elsewhere. Hence, progress is steady, but slow.
I don't buy this. Generating electricity from fission is stupid simple once you figure out that piling a bunch of uranium produces heat (literally what Fermi did in Chicago in '42).
The safety features to keep the whole thing regulated, not melting, and to avoid poisoning everyone around it obviously aren't simple, but the process of creating and capturing energy as electricity is.
By contrast, the process of capturing the energy and converting it is a huge part of the challenge with fusion.
Fusion power has the potential to be a serious game changer and everybody should have access to it.
The big problem is that fusion has reached its end-game in military applications: we have the h-bomb. Sure, there's the hypothetical fusion submarines and carriers, but what do those really provide beyond their fission equivalent?
If we could find some novel thing that fusion power solves for the military, then fusion would be solved.
We should invest much more heavily into fission, and keep deuterium/tritium fusion investment at around the current levels.
Why? Because deuterium/tritium fusion, as used in the stellarators and tokamaks, produces a lot of (energetic) neutrons along with heating the plasma. That is very bad for two reasons. First, essentially unrecoverable energy is wasted by the neutrons. Second, the neutron exposure causes the reactor to become radioactive. So the promise of "clean" fusion energy with no radioactive waste is not realized.
Aneutronic fusion is possible, but requires higher energy and different fuel. This is not yet being widely researched, but some interesting work is being done at LPP Fusion: https://lppfusion.com/
That is an effort I'd like to see funded much more fully.
At any rate, next-generation fission reactors are just as safe as deuterium/tritium fusion, are well understood, and many designs produce less and shorter-lived waste than the dinosaur PWR designs. It is beyond silly to not fully leverage clean, cheap fuel with a million times the energy density of fossil fuel while we work out the kinks with fusion. There is a necessity for reliable grid power alongside unreliable "renewables", and fission is the only viable CO2 free approach for the foreseeable future.
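The "million times the energy density" claim checks out to order of magnitude with rough textbook figures (the values below are ballpark, not exact):

```python
# Approximate specific energies (order-of-magnitude textbook values):
fission_u235 = 8.0e13   # J/kg, complete fission of U-235
coal = 3.0e7            # J/kg, typical hard coal

ratio = fission_u235 / coal
print(f"Fission/coal energy density ratio: ~{ratio:.1e}")  # ~2.7e6
```

So "a million times" is, if anything, slightly understated for pure U-235 versus coal; real reactor fuel (low-enriched, partial burnup) lands closer to the stated figure.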
My key takeaway from this is that markets/capitalism are really good at finding the optimal solution to an engineering problem using currently available tech; however, they're not so good for cases where the enabling tech also needs to be invented. IMHO this is where government needs to step up and bring all the necessary pieces into the realm of possibility. After that, private business can pick it up and optimize.
On a different note, it's really amazing that they can maintain superconductors at -100 F just a few inches from a plasma at 100M F.
If it were to turn out that fusion power isn’t technologically feasible within useful parameters (currently achievable technology, useful scales, economic feasibility of recovering the costs in the lifetime of a reactor, etc), we’d have squandered vast resources pointlessly.
For a start there are dozens of different approaches currently being pursued. Which do we bless with the billions of extra funding? All of them? Suppose none of them pan out?
It’s not like fusion isn’t being actively investigated and invested in. It is, to the tune of tens of billions of dollars. How about we see how those projects pan out in practice and then progress from there?
Not necessarily, because discovering a large blocker would give us an idea of what to invest in next.
The economic impact of fusion power is great enough that even if there's a 10-25% probability of it working out, the expected value implied by that probability would justify significantly higher levels of funding than fusion currently receives.
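The expected-value argument can be made concrete with a back-of-the-envelope sketch; the payoff figure below is a made-up illustrative number, not a sourced estimate:

```python
# Hedged expected-value sketch with invented numbers.
# Assume a successful fusion program is worth $10 trillion in net present
# value (hypothetical figure chosen only to illustrate the argument).
success_probability = 0.10   # the low end of the 10-25% range above
payoff_npv = 10e12           # assumed NPV if fusion works out, USD

expected_value = success_probability * payoff_npv
print(f"Expected value: ${expected_value / 1e9:.0f}B")  # Expected value: $1000B
```

Even under that pessimistic 10% probability, the expected value is tens of times larger than the total projected cost of ITER.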
> It’s not like fusion isn’t being actively investigated and invested in. It is, to the tune of tens of billions of dollars. How about we see how those projects pan out in practice and then progress from there?
ITER, the largest multinational fusion project, is projected to have a total cost of over 20 billion euros, which is about 22 billion USD. 22 billion USD is less than NASA's budget, adjusted for inflation, for the single years of 1963 through 1970 or 1990 through 1993. And NASA's work is built on top of the rocketry work of Nazi Germany, which cost about 40 billion inflation-adjusted dollars even with, to be euphemistic, artificially low labor costs.
For another point of comparison, the Persian Gulf War, which was primarily motivated by the attempt to secure part of the global supply of petroleum, cost $61 billion. The Iraq War cost at least an order of magnitude more, but (a) much of that was due to cost overruns and (b) that war was partially motivated by issues other than securing part of the world's petroleum supply.
In terms of improving long term human living conditions and enabling future economic growth, developing fusion power would provide tremendous benefits.
Well, if the shape is the output of an optimization program, as long as the program itself and its inputs can be simply specified, the shape actually has low Kolmogorov complexity.
(For comparison, the human genome is under a GiB.)
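A toy illustration of the point, with an arbitrary objective standing in for the real stellarator optimisation (none of this is actual plasma physics): the output looks irregular, but the program that generates it is tiny, so the description length is small.

```python
import math

# Toy stand-in for "shape = output of an optimization program".
# The function below is arbitrary, chosen only to produce a lumpy,
# asymmetric-looking boundary from a very short specification.
def objective(theta, k):
    return math.sin(3 * theta) + 0.5 * math.cos(7 * theta + k)

def optimized_shape(n_points=360, k=1.0):
    # "Shape" = radius at each angle; 360 numbers fully determined
    # by this handful of lines plus two parameters.
    return [1.0 + 0.1 * objective(2 * math.pi * i / n_points, k)
            for i in range(n_points)]

shape = optimized_shape()
# The shape has 360 coordinates, but its Kolmogorov complexity is bounded
# by the length of this program plus its inputs.
print(len(shape), min(shape), max(shape))
```

The same logic applies to the W7-X coils: the visually baroque geometry is compressible down to "this physics objective, this optimizer, these inputs".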
- The randomness and lack of symmetry suggest that we are lacking insight into the problem, like an inelegant physics explanation or an over-fitted model. It's not just that it's complex, but it doesn't even seem to have "parts" that work together. Conceptually fusion seems pretty simple: why in principle does harnessing it require super-human complexity?
- If such extreme contortions and micro-optimizations are required to get this far, how much potential can there still be on this path? In software, if you are replacing divisions with bit shifts, maybe you really need a different algorithm. I understand this is just a research project, but if everything is so finely tuned already, how do you make improvements?
I don't know anything about fusion power, but those are the intuitions I feel when I see the plasma vessel photo and read how it was designed.
No, it doesn't.
> Conceptually fusion seems pretty simple: why in principle does harnessing it require super-human complexity?
Because having 2e30 kg ball of hydrogen within a powerplant is extremely unwieldy here on Earth.
> In software, if you are replacing divisions with bit shifts, maybe you really need a different algorithm.
Huh? If your algorithm says "divide by 4", why would you need a different algorithm?
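For what it's worth, the shift/divide equivalence is exact for non-negative integers, which is the usual caveat in this micro-optimisation debate; a quick check:

```python
# For non-negative integers, a right shift by 2 is exactly floor division by 4:
for x in (0, 1, 4, 100, 12345):
    assert x >> 2 == x // 4

# Python's // floors and >> is an arithmetic (flooring) shift, so they even
# agree for negatives here. In C the story differs: integer / truncates
# toward zero, so -7 / 4 == -1 while an arithmetic shift gives -2.
print(-7 >> 2, -7 // 4)  # -2 -2
```

So whether the substitution is safe depends on the language's division semantics and the sign of the operand, not on the algorithm itself.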
Nothing is simple, nothing is elegant. Everything is hideously complex; we are just standing on the shoulders of giants.
E.g. try making common things that we take for granted, like building a plane or glider, or even writing a text editor from scratch (including the font rasterization code and display driver).
All your useful materials melt at 2000 degrees.
The fusion takes place at 20000000 degrees.
(Yes I know these numbers are grossly oversimplified.)
Let's assume that plasma destabilisation does not damage the device, and is a mundane event.
Build 10 or even 20 fusion devices (economies of scale!) feeding a common heat buffer, e.g. a large reservoir of molten salt or metal. Feed conventional turbines off the heat tank. The tank evens out jumps in input power.
Now we can restart the fusion in every fusion device every so often, provided that restarting it is made a mundane operation, too. It, of course, takes a lot of electricity to pump into the magnets. Conveniently, we have a mighty power plant right here. Dumping the magnetic/electric energy from the magnets requires a huge sink. Luckily, we already have such a sink co-located.
Building the plant takes a massive investment. Luckily, the architecture allows building it piecemeal, feeding the next added unit with the power of the already-built units.
BTW, the waste heat could be directly reused in some kind of chemical processing, like smelting, or maybe even synthesis of hydrocarbon fuels from ambient carbon dioxide and water.
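The buffering idea can be sketched numerically. This is a deliberately toy model with invented numbers: ten unreliable units firing into one thermal reservoir, against a constant draw by the turbines sized below the average production.

```python
import random

random.seed(0)

N_UNITS = 10          # hypothetical fusion devices
UNIT_POWER = 100.0    # MW thermal per unit while running (invented)
UPTIME = 0.8          # each unit produces 80% of the time (invented)
DRAW = 700.0          # MW constant turbine draw, below avg input of 800 MW
STEPS = 1000          # one step = one second

buffer_mj = 50_000.0  # initial stored heat, MJ
min_level = buffer_mj

for _ in range(STEPS):
    # Each unit independently happens to be online or offline this second.
    produced = sum(UNIT_POWER for _ in range(N_UNITS)
                   if random.random() < UPTIME)
    buffer_mj += produced - DRAW   # MW over 1 s = MJ
    min_level = min(min_level, buffer_mj)

print(f"final buffer: {buffer_mj:.0f} MJ, minimum level: {min_level:.0f} MJ")
# With average input above the draw, the buffer rides out the outages.
```

The point is only that independent failures average out across units, so the common reservoir sees a much smoother input than any single device provides.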
Second, fusion devices scale very well so one device at 10x the scale is vastly better than 10 different devices at x size. Third, storing and pumping heat involves losses, where there is a huge range of great options for storing electricity. Fourth, it takes massive turbines to turn heat into energy, so you need to scale several things on both sides of your merged heat system in your modular design.
Finally, X independent fusion devices don't have single points of failures shared between them. Your combined design would.
Producers of wind and solar power would love to hear about them!
Re SPOF: since the fusion devices are much less reliable, as of now, than reservoirs of hot liquids, I suppose the reservoir is much less of a concern.
I understand that fusion efficiency grows with size; this is why we are surrounded by colossal self-initiated fusion reactions, and have one nearby! But before we can scale, maybe we could still turn net-positive with smaller, less reliable devices. Remember how unreliable the first cars were.
Early computers were huge and broke down all the time. They were built that way because it was the easiest option at the time. Nobody wants to build multi-billion-dollar devices if they can get away with spending a tenth as much, or even test several designs at the same time.
PS: The grid absorbs solar and wind's intermittent nature just fine without much in the way of storage. That might suggest something about large-scale energy storage. Building something that can store GWs' worth of heat for minutes at a time is going to be huge and expensive.
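For a sense of scale, here is a back-of-envelope sizing of such a tank, assuming solar-salt-like properties (heat capacity ~1.5 kJ/(kg·K), a usable temperature swing of ~200 K); both figures are assumptions for the sketch, not data from any particular plant:

```python
# How much molten salt does it take to hold 1 GW of heat for 10 minutes?
# Assumed salt properties: cp = 1.5 kJ/(kg*K), usable temperature swing = 200 K.
POWER_W = 1e9           # 1 GW
SECONDS = 10 * 60       # 10 minutes
CP = 1500.0             # J/(kg*K)
DT = 200.0              # K

energy_j = POWER_W * SECONDS        # total heat to store
mass_kg = energy_j / (CP * DT)      # mass of salt needed
print(f"stored energy: {energy_j / 1e9:.0f} GJ")
print(f"salt mass: {mass_kg / 1000:.0f} tonnes")
```

Under these assumptions the answer comes out around 600 GJ and a couple of thousand tonnes of salt, which supports the "huge and expensive" point, though it is not obviously infeasible either.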
I was under the impression that this was because we have a bunch of natural-gas power plants that are turned on when necessary to meet peak load.
Large scale hydro for example can act as storage ramping up and down in minutes to cover demand spikes.
PS: Sure in the US it’s a lot of cheap natural gas right now. But many places don’t and still need to deal with the same issues.
Not so fine, judging by this report (page 57): "Ontario has committed to install about 2,500 MW of solar capacity in both its 2010 and 2013 Long Term Energy Plan.70 However, Ontario also has committed to install 7,500 MW of wind generation. These two commitments combined create a serious energy management problem for Ontario’s power system engineers and operators."
And they have hydroelectric, nuclear, and natural-gas power plants, and sell electricity to neighboring states, to compensate for the intermittent supply.
If this is economically infeasible, then again, building piecemeal may be better, in the same way as taking a credit and paying interest on it might be better if you can't secure the upfront sum anyway.
Money and politics. You might want to take a look at the original logo of the ITER project: https://www.iter.org/img/resize-900-90/www/content/com/Lists...
The Soviet Union collapsed, and the US pulled out of the project in 2000, rejoining only in 2006. This necessitated a down-sizing of the original design.
Fusion scales very well with two parameters: size and magnetic field strength.
Size works great, but it's just too expensive. Even if you could justify the costs in theory, ITER is demonstration that the political will to invest that much capital with such a long payoff time just isn't there.
Magnetic field strength is the new hope of the fusion industry: 'Cheap' high temperature superconductors have been commercialized and could make smaller fusion designs practical and economical.
Rockets are an easy sell because you can design a rocket that launches from Holland and lands on London, and you can find someone to fund that. Later, based on that track record, you can design a rocket that launches from Florida and reaches low earth orbit, and you can find someone to fund that. After you pull that off, you can make a rocket that launches from Florida and reaches the moon, and you can find someone to fund that based on your track record.
Conversely, if you wanted to build a Lofstrom loop, you would have to tell someone, "hey, there's no track record of something like this working, but give me tens of billions of dollars and you'll make it back by not blowing as much money on rockets". Nobody's gonna pay for that. And if you start building it piecemeal, you're still not going to get anything to space, and then people will point and laugh and talk about how Lofstrom loops are a stupid technology that will never work out because they're always another 20 years away, so you never get the necessary funding to actually finish the thing, which means people will just continue to point and laugh, and in allegorical form, this is the history of fusion power.
They are. ITER was designed to be as large as the engineering allows. But that is Expensive.
The groups at MIT are trying to take advantage of the fact that newer superconductors allow you to get to ITER levels with smaller designs. However, those are Not Cheap either.
That's the plan. But first you have to figure out a lot of details in smaller machines.
A design that produces e.g. 2x the energy it consumes for the startup could be viable, even if pulsed (running minutes, not months).
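Whether "2x" is enough depends on where the energy is measured. A quick balance with assumed numbers (the 40% heat-to-electricity efficiency is a typical steam-cycle ballpark, not a figure from any design) shows why the distinction matters:

```python
# Per-pulse energy balance, illustrative numbers only.
E_startup = 1.0          # electrical energy to start one pulse (arbitrary units)
gain = 2.0               # thermal energy out per unit of startup energy ("2x")
eta_thermal = 0.4        # assumed heat-to-electricity conversion efficiency

E_thermal = gain * E_startup           # heat produced by the pulse
E_electric = eta_thermal * E_thermal   # electricity recovered from that heat
net = E_electric - E_startup           # net electrical output per pulse
print(f"net electricity per pulse: {net:+.2f}")
```

With these assumptions the net is negative: 2x *thermal* gain isn't enough once conversion losses are counted, so "2x" has to mean electricity out versus electricity in for the design to be viable.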
Would be great to know more about the economics behind a multi reactor power plant.
Yes, for district heating.
In practice, going down to ~40 C (~100 F) is not uncommon. And then the waste heat is basically unusable, even for district heating.
OTOH, if you need lots of heat (maybe you have a chemical plant or something nearby), you can design your power plant for cogeneration of heat and power, yielding higher total system efficiency. But you don't do that "just" for district heating.
But no, nothing at infrastructure scale.
Every piece of battery-tech, pro-nuclear, or fusion-related news has to have somebody chiming in with "molten salt".
Just an observation.
SCNR. Seriously though, molten salts are great heat transfer media, a bit like water, just at higher temperature and still low pressure. They are also completely immune to radiation damage, which is a big deal if you have 14MeV neutrons flying around and something has to stop them. Chemistry in molten salts is also interesting. Things can be extracted (say, into a molten metal phase) or precipitated, and some reactions just happen at 600 degrees while they need platinum and palladium catalysts at room temperature. What's not to love?
In a molten salt reactor, there is very little volatile material inside the containment building. The salt itself does not have high vapor pressure, even in accident conditions. As a result, the size (and cost) of the containment building can be radically reduced. Moltex's design, for example, reduces the cost (per unit of power output) of the containment building by a factor of 5 vs. LWRs.
The whole goal of fusion is to get the energy amplification, Q, above 1, which requires the triple product n·T·t to be above some threshold. n is density, T is temperature, t is confinement duration. This is the Lawson criterion; it's worth looking up.
Your proposal doesn't help n, T, or t, hence doesn't help Q.
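As a sketch of the criterion: the commonly quoted D-T ignition threshold is a triple product of roughly 3×10²¹ keV·s/m³. The plasma parameters below are illustrative ITER-like ballparks, not measured data:

```python
# Sketch of the Lawson triple-product check for D-T fusion.
# ~3e21 keV*s/m^3 is the commonly quoted ignition figure; the example
# plasma parameters are rough ballparks, not real machine data.
IGNITION_TRIPLE_PRODUCT = 3e21  # keV * s / m^3

def triple_product(n_m3: float, T_keV: float, tau_s: float) -> float:
    """n = density, T = temperature, tau = energy confinement time."""
    return n_m3 * T_keV * tau_s

# ITER-like ballpark: n ~ 1e20 m^-3, T ~ 10 keV, tau ~ 4 s
tp = triple_product(1e20, 10.0, 4.0)
print(f"triple product: {tp:.1e}, above threshold: {tp > IGNITION_TRIPLE_PRODUCT}")
```

The parent's point falls out directly: a shared heat buffer changes nothing inside this function, so it changes nothing about Q.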
Also, there are economic and physical gains from concentrating your efforts into making your fusion reactor larger, rather than trying to make it 1/10 the size and have 10x of them.
The economies of scale will happen once fusion ignition (Q>1) is achieved and we start mass producing dozens of fusion plants around the world. But at this point we're likely 10+ years away from achieving ignition.
This is not just quibbling, since bad power density is a huge problem with fusion.
Look at ITER. The volumetric power density (gross fusion power/volume of reactor proper, not including the building or auxiliary equipment) is 0.05 MW/m^3. Compare that to a PWR primary reactor vessel, which have a power density around 20 MW/m^3. The smaller designs like ARC or Lockheed's concept have power density around 0.5 MW/m^3.
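Putting the quoted figures side by side makes the gap vivid (these are just the numbers from the comment above, restated as ratios):

```python
# Ratios of the power densities quoted above (all in MW/m^3).
iter_density = 0.05      # ITER: gross fusion power / reactor volume
compact_density = 0.5    # ARC / Lockheed-class concepts
pwr_density = 20.0       # PWR primary reactor vessel

print(f"PWR vs ITER:    {pwr_density / iter_density:.0f}x")      # ~400x
print(f"PWR vs compact: {pwr_density / compact_density:.0f}x")   # ~40x
```

So even the compact concepts sit a factor of ~40 below a fission vessel, and ITER a factor of ~400.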
Pretty insanely complex machine, and really awesome results. Congrats!
They have this absurd shape, because the algorithm says it works, and they say it contains plasma. It does.
They say they will test this to hold 10 - 12 seconds of plasma. It does.
They say they will work on this and expand this to hold 30 seconds of plasma. It does.
This just feels like really solid engineering or practical science. Those 30 minutes will happen without much delay.
That seems quite hard considering the temperatures and the strong magnetic field.
Our camera looks through a pinhole in the vessel wall, but it sits a few meters away from the machine and gets that view through a bundle of optical fibers. There wouldn't be enough space to place the camera right at the pinhole because of the magnets and their cooling systems, and the magnetic fields would be pretty high. The camera needs to be shielded from the fields for its electronics to work properly, and the shielding box perturbs the magnets' field, so moving the camera far away is a good idea. We don't worry about neutrons, because W7-X plasmas are fueled with stable helium and hydrogen (no deuterium or tritium so far, mainly due to onerous nuclear regulations in Germany), and these fuels don't produce many neutrons at all.
As a side note, you gotta love HN... where you ask a question about some obscure thing and often get an answer straight from the source.
Wait, so does this mean that you have a camera obscura with an array of optical fibres at its back, and then an ordinary CCD camera imaging the other end of the fibre array?!
Most imaging in fusion is done like this because of space constraints, magnetic fields, and neutron fluxes.
I think you have to go to quite low energies (per neutron, not flux) for neutron optics to be a thing. Neutron cameras use collimated neutron sources to get around this, but that option isn’t available here: https://en.m.wikipedia.org/wiki/Neutron_imaging#Neutron_came...
No idea if a neutron flux picture would provide these details or not.
I'm quite keen on aneutronic fusion: a proton plus boron-11, which fuses into three alpha particles. Those can give up their energy directly to electromagnetic fields, generating electricity with no need for any Victorian steam engines.
The huge downside is that it works at much, much higher temperatures. But if you can't solve the neutron capture issue...
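The attraction is that essentially all the reaction energy ends up in charged particles. The energy release follows from the mass defect; the atomic masses below are standard tabulated values, and atomic masses work here because the electron counts balance on both sides:

```python
# Mass defect of p + B-11 -> 3 alpha, using standard atomic masses (u).
U_TO_MEV = 931.494           # energy equivalent of one atomic mass unit
m_H1 = 1.007825               # hydrogen-1
m_B11 = 11.009305             # boron-11
m_He4 = 4.002602              # helium-4

q_mev = (m_H1 + m_B11 - 3 * m_He4) * U_TO_MEV
print(f"Q = {q_mev:.2f} MeV")  # ~8.7 MeV, shared among the three alphas
```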
> Since the fusion fire only ignites at temperatures of over 100 million degrees, the fuel [...]
Would it be accurate to say that this gets us 20% of the way there, or is that overly simplistic?
Note that Wendelstein 7-X will never operate to produce actual fusion power with D-T fuel, because that is (a) outside its mission goal, which is to research plasma behaviour at conditions close to what is needed in a power plant; (b) blocked by the neutron bombardment, which creates all sorts of complexities in terms of the blanket. That isn't really solved here either, and Wendelstein 7-X is a nightmare to disassemble and upgrade, whereas for a power plant, maintenance and serviceability need to be built into the design. And (c) they don't have government permission to deal with nuclear material and nuclear waste.
So no, we are not 20% there. But we have proven that (a) the Stellarator actually has wings, and (b) plasma physics so far behaves mostly as predicted. Especially the latter is something to be celebrated, because plasma is nasty in terms of physical properties, really complex to model, etc.
I love it!
This was at the very end.
While competition is certainly a positive, this doesn't sound like they're interested. Therefore, given the importance of fusion (read: there's a massive immediate need on the order of saving the planet), shouldn't the time and effort being put into stellarators be devoted to something that's important in the immediate?
There's a real physical ceiling to how much power can be consumed on Earth's surface, regardless of its source.
Moreover, perhaps some of these designs (including the tokamak and stellarator) are nice for a lab demo, a model plant that doesn't break even, or a tiny-scale plant, but they may not scale to a full power plant that produces energy at a useful level. (That's why this is called "research": you don't know if it will work.)
So they are following the usual path: allocating money to a few competing groups with different ideas and hoping that one of them gets something useful. The problem is how to distribute the money between the alternative approaches, and the method is a mix of analysis of the proposals, intermediate results, grant-bait, and politics.
Zach Hartwig of MIT has an excellent video on how to evaluate announcements of nuclear fusion advances. The big problem is breaking even. There are other problems, too: plasma containment, etc.
"given the importance of fusion, shouldn't the time and effort being put into stellarators be devoted to something that's important in the immediate?"
That's exactly what the whole world did: it dropped stellarators and started working on tokamaks. 60 years later, tokamaks still haven't been turned into power plants. Things were harder than imagined, and much more expensive, too. It doesn't look as if ITER will ever yield something practical. Maybe the MIT guys can make the Arc Reactor a reality; but all they have right now is... exactly, paper.
(The whole world? No! A tiny village outside Munich built the Wendelstein series of stellarators...)
> something that's important in the immediate
That's not easy to find in fusion research.
The plasma current seems to create some instabilities, so we may be able to achieve better confinement with a device that doesn't have one.
I would also agree with the other comments that we can't afford to put all our eggs into one basket - we spend a pitifully small amount on fusion research.
The first and primary goal of ITER is to show that the disruption problem can be and has been solved. If they don't do this, ITER will never be allowed to operate with a DT plasma. And they are still scrambling to solve the problem.