The first plasma: the Wendelstein 7-X fusion device is now in operation (mpg.de)
383 points by aurhum on Dec 10, 2015 | 94 comments



Here is a time-lapse video of the nine-year construction, quite fascinating to watch: http://www.ipp.mpg.de/115632/zeitraffer_w7x

YouTube mirror: https://www.youtube.com/watch?v=u-fbBRAxJNk

A more technical video explaining how it works: https://www.youtube.com/watch?v=lyqt6u5_sHA


Video title: "Fusion reactor designed in hell makes its debut"


That's related to a quote from this article about how hard the thing was to build: http://news.sciencemag.org/physics/2015/10/feature-bizarre-r...


Pretty cool. One of the more interesting things about fusion research is how the growing ability to simulate complex fluid dynamics has helped in designing plasma flows. Such designs were computationally infeasible before. The interdependence of computation and physical understanding in delivering viable fusion solutions has been, in my opinion, the unseen anchor of fusion.


I think people know this, they just don't want to admit it. As someone who works with people who used to work on ICF, I've been told the reason NIF flunked is that it was designed with 1D simulation codes, because back then that's all they could manage. One of the issues they inevitably ran into was plasma instabilities similar to Rayleigh-Taylor instabilities[0], which one only sees in two- (or more) dimensional systems, not in 1D. These instabilities reduce the efficacy of the heating and compression needed to achieve ignition, and were an unpleasant surprise when they actually performed the experiments. Even to this day, 2D simulations are fairly common and littered throughout the literature, and their justification (arguments of approximate symmetry) is suspect, at least to me. 3D sims are coming onto the scene, but they often have to sacrifice things like spatial resolution, which hurts the simulation's physics in other ways.
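
(An illustration, not from any ICF code: the classical linear Rayleigh-Taylor growth rate in Python. The transverse wavenumber k that drives the growth simply has no representation in a 1D model.)

    import numpy as np

    def rt_growth_rate(rho_heavy, rho_light, g, k):
        """Classical linear RT growth rate, gamma = sqrt(A * g * k),
        where A is the Atwood number. The wavenumber k describes a
        ripple transverse to the acceleration, so a purely 1D code
        cannot represent the perturbation at all."""
        atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)
        return np.sqrt(atwood * g * k)

    # Shorter wavelengths grow faster, which is also why under-resolved
    # 2D/3D grids understate the mixing.
    for wavelength in (1.0, 0.1, 0.01):  # arbitrary length units
        k = 2 * np.pi / wavelength
        print(wavelength, rt_growth_rate(2.0, 1.0, 9.8, k))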

Computational feasibility is one of the things physicists in that field need to admit is holding them back, and maybe a little humility in seeking out advice from computer scientists is warranted, in my opinion[1].

[0] https://en.wikipedia.org/wiki/Rayleigh%E2%80%93Taylor_instab...

[1] This statement won't draw much criticism here, but it wouldn't go down well in a room of my peers.


Honest question: do you really think talking much with computer scientists would help? My impression is that most of the improvements (besides Moore's law) in the power of simulation of physical systems have come from either new physical insights, or computational insights developed by physicists who mostly taught themselves numerical techniques. It seems very plausible to me that there are honest-to-goodness CS insights that remain unexploited, such as better parallelization of physical simulations, but the track record hasn't been good. Why aren't CS people coming in and showing everyone how it should be done?

(For what it's worth, I'm a physicist, but not one who does anything numerical, and I'm happy to admit arrogance by physicists could be a problem here.)


I'm doing a PhD in physics using large-scale numerical codes. From what I can see, more computer science would mainly improve the maintainability rather than the speed of the codes used. They are often already heavily optimized and, for the numerical heavy lifting, rely on standard libraries such as Intel's MKL. Scalability is also usually quite good, since the codes are routinely developed to run on a large number of CPUs in parallel. GPU support is coming slowly, since it's also new to the people who have been developing Fortran codes for the last 20 years. For example, VASP, a popular atomic-structure simulation code, is only now getting GPU support through a public-private partnership with NVIDIA. The speed-up for a 16-core Sandy Bridge machine with two GPU cards, compared to without, is up to about a factor of 8.
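
(To make the CPU/GPU comparison concrete, here's a toy sketch; this is generic dense linear algebra, nothing to do with VASP itself, and it assumes a machine with CUDA and the CuPy library for the GPU path.)

    import time
    import numpy as np  # CPU path, typically backed by MKL/OpenBLAS

    n = 4096
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)

    t0 = time.perf_counter()
    c_cpu = a @ b  # dense matmul on the CPU via the BLAS backend
    print("CPU:", time.perf_counter() - t0, "s")

    try:
        import cupy as cp  # GPU path; only if CUDA + CuPy are available
        a_gpu, b_gpu = cp.asarray(a), cp.asarray(b)
        t0 = time.perf_counter()
        c_gpu = a_gpu @ b_gpu
        cp.cuda.Stream.null.synchronize()  # wait for the kernel to finish
        print("GPU:", time.perf_counter() - t0, "s")
    except ImportError:
        print("no GPU path available")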


If you ask me, they mostly need software engineers, not “CS people” especially. Whenever I talk to a scientist about their software, I see great opportunities and interesting problems that are solved in suboptimal ways. To be blunt, scientists can't code (and neither can most “CS people”, for that matter). Every scientist (be it in physics or CS) could use professional software people, who have experience with developing and maintaining complex code bases. The problem is, there is no funding. Say a researcher gets a budget for some hardware and 10 PhD students; then there is nothing left to pay the engineer with. We're too expensive.


Software quality is just not something academia optimizes for. All that counts is whether you can get your paper out. The chances that somebody else will actually use your software are slim, so investing time into better engineering, time that could otherwise be spent writing more papers, makes little sense.


What's really funny is that I started in software (I have code committed to some open-source projects... BIND, Valgrind) and then moved to physics. The code is garbage because the constraints are totally different. Experiments and code can be brittle and still work. There's no time to consider the user experience of a fellow grad student when there's so much else to do... though I do know of SOME students who make time to write good code. Hats off to them.


I'm a mechanical engineer working in physics, and I've had no formal training in writing code or in computer science; yet I am now doing atomic-physics simulations in Python. I am designing a software library package that will serve as a sort of diagnostic/design tool for future experiments with a machine being built.

I've run into the problem of figuring out how to organize the whole dang thing, and I sometimes have to go back and rewrite a lot of code. Can you recommend any good books that introduce computing concepts for applied physics and engineering?


My favorite general guidebook for writing good, readable, maintainable, defensive code is "Code Complete": http://www.amazon.com/Code-Complete-Practical-Handbook-Const...

It isn't domain-specific, but there is so much good advice in there about variable naming, code structure, and general principles of bug-resistant coding that you will get a tremendous amount out of it.
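
(Not from the book, but in its spirit, a minimal sketch of one way to structure such a package; all names here are made up: parameters in one declared place, pure physics functions, and a thin driver that owns the wiring and I/O.)

    from dataclasses import dataclass
    import numpy as np

    @dataclass(frozen=True)
    class TrapConfig:
        """All physical/numerical knobs in one immutable, named place."""
        n_atoms: int = 1000
        dt: float = 1e-6        # integration step (s)
        trap_freq: float = 2e3  # trap frequency (Hz)

    def step(positions, velocities, cfg: TrapConfig):
        """Pure physics: advance one harmonic-trap timestep.
        No printing, no plotting, no globals; easy to unit test."""
        omega = 2 * np.pi * cfg.trap_freq
        acc = -(omega ** 2) * positions
        velocities = velocities + acc * cfg.dt
        positions = positions + velocities * cfg.dt
        return positions, velocities

    def run(cfg: TrapConfig, n_steps: int):
        """Thin driver: wiring and any I/O live here, not in the physics."""
        rng = np.random.default_rng(0)
        x = rng.normal(size=cfg.n_atoms)
        v = np.zeros(cfg.n_atoms)
        for _ in range(n_steps):
            x, v = step(x, v, cfg)
        return x, v

The payoff is that when you inevitably rewrite the numerics, the config and driver layers don't have to change.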


Not sure that would help, though. You want someone with the theoretical know-how to understand the problem at hand, so the problem can be reformulated to be simpler to program. In other words, a CS major with programming skills. They are in very high demand.


Essentially what you're talking about is a lab assistant who can code? Sounds like a good idea, and might be something for people who work at companies that do pro-bono charity work.


Optimizations, GPUs, and even parallelization are not always much use in simulations, because the computational complexity is usually very high, like O(n^3) or so. That's not even counting the cubing of the problem size in going from 1D to 3D. If you can get 10x the performance out of some optimization, that can let you solve a slightly bigger problem, but it isn't likely to be dramatic enough to go from 2D to 3D.
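
(A back-of-the-envelope illustration, assuming the O(n^3) cost applies to the number of unknowns N:)

    # For a solver that costs O(N^3) in the number of unknowns N,
    # an n-per-side grid in d dimensions has N = n**d unknowns.
    n = 100
    cost_2d = (n ** 2) ** 3   # 1e12 "operations"
    cost_3d = (n ** 3) ** 3   # 1e18: a million times more

    print(cost_3d / cost_2d)  # 1e6, so a 10x code speedup barely dents it

    # Conversely, a 10x speedup only grows the solvable problem by
    # 10 ** (1/3) ~ 2.15x in N, at fixed wall-clock time.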


There are bigger optimizations than 10X with some of the adaptive grid / wavelet compression stuff. I think these are still 2D, but pretty amazing: https://youtu.be/vYA0f6R5KAI


I was going to link to this too. It's almost as if the internet is calling for us to write an app called "here, the internet wants to help you researchers... have a video..."


CS people aren't coming in and showing how it should be done, probably for the same reason that physicists don't show up at a bridge construction project and show the engineers how it should be done.

Theoretical computer science is quite distinct from coding and large scale numerical computation practice. Just as theoretical physics, or even experimental physics, is quite different from structural engineering.


Except that physicists do come in and show biologists how it's done for all sorts of mathematical problems in biology! People call it "biophysics" but very often it's things where there is really no physics per se, like statistical analysis of genomics.


That might be because the physicists had help from mathematicians before and are just passing it on. I don't think that's quite true though; I like to believe that polymaths and universality aren't quite dead in academia.


I think you might need to hire engineers. What I mean is, someone in CS wouldn't have any papers to produce from applying well-known parallelism techniques to your problem. In academia, papers and innovation in one's field are the motivation. What physicists appear to need is well established and understood. Hire a parallelism expert.


> Why aren't CS people coming in and showing everyone how it should be done?

Because CS students don't learn differential equations or real and functional analysis.


I have a B.S. in CS, and I learned both differential equations and real analysis.


?

What school did you go to? What school would issue a BSc in computing science without analysis and differential equations?


I did undergrad at UMass Amherst, where computer-science students take math up to multivariable calculus 1 and linear algebra. I then did my MSc at the Technion, where some amount of differential equations and vector calculus is normal for undergrads, but even there, requiring multiple semesters of analysis to grant a degree in computing, where most of the maths are discrete, is considered overkill.


It's not all about algorithms. Unless current simulations are running on ASICs or even GPUs, there are probably 10-1,000+x speedups possible.


Why do you assume that the computational fluid dynamics folks are dumb and haven't done this?

CFD is a very vibrant field with applications across the entire spectrum of engineering.

In fact, modern supercomputers are MOSTLY running CFD codes (weather prediction is effectively a gigantic CFD simulation). Nothing else really demands that level of power.


I know some CFD is still run on CPUs and some is run on GPUs. But I don't know what the fusion tradeoffs are.


I'm quite sure a good number of these are running partly on GPUs. People are quite aware of them; their practical use in research applications is still a bit limited because of how often the code needs to be modified. But in industry, where the same basic model is run over and over, they (as well as FPGAs) are pretty common. Not every computational application is the same :)


Well, it's not the job of a physicist to understand the many, many different sorting algorithms, how to optimise loops so they stay cached in the CPU, and plenty more.

I guarantee that most CS professionals (CS, not necessarily developers) are able to write much more efficient code than most professional physicists, because they have spent the same amount of time understanding software and hardware that these guys spent building a fusion reactor.
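
(A generic illustration of the loop/cache point in NumPy terms, nothing fusion-specific: traversing a large array along its contiguous axis is much faster than striding across it, purely because of memory layout.)

    import time
    import numpy as np

    a = np.ones((5_000, 5_000))  # C order: each row is contiguous in memory

    t0 = time.perf_counter()
    total = 0.0
    for i in range(a.shape[0]):
        total += a[i, :].sum()   # touch memory in contiguous runs
    t1 = time.perf_counter()
    for j in range(a.shape[1]):
        total += a[:, j].sum()   # stride across rows: cache-hostile
    t2 = time.perf_counter()

    print("row-wise:", t1 - t0, "s; column-wise:", t2 - t1, "s")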

That said, considering the billions that have been spent on fusion research, I doubt they did it in a bubble. It would seem strange if they didn't enlist the help of some CS people.


Not sure about this project, but a lot of physics simulations are written by computational physicists, who don't really do actual lab work and instead spend a lot of their time coding and optimizing. These folks are often aware of things like caches, branch-prediction penalties, or GPUs (mostly I see MPI codes running on largish clusters, though). This doesn't mean there is no room for optimization; most likely there's some, because I have yet to see code that doesn't leave any, but it's probably not something an average CS expert would easily find.


CFD is also used in computer graphics (games and film), a field filled with people who focus on low-level optimisation.


> Even to this day, 2D simulations are fairly common and littered throughout the literature, and their justification (arguments of approximate symmetry) is suspect, at least to me. 3D sims are coming onto the scene, but they often have to sacrifice things like spatial resolution, which hurts the simulation's physics in other ways.

Sounds like the joke about the guy who dropped his keys in the bushes but is looking for them under the streetlamp, because there's light there so they should be easier to find.


It seems like they've done all of this at a surprisingly low cost.

1 million assembly hours at a €370 million project cost = €370/hour (assembly + management overhead costs).


I think one important price factor is that this is "only" a plasma-containment experiment and will run only helium and hydrogen, not the deuterium-tritium mixture needed for energy-generating fusion.

This means it generates far less neutron radiation, which in turn means less need for shielding, easier working conditions on and close to the machinery, less material degradation, and so on.

Also, if I understand the German sources correctly, the 370 million doesn't include the wages of the people working there (maybe external construction contractors, but not the effort by MPI personnel). (EDIT: found a quote of ~1 billion EUR as the cost of the entire project since 1995.)


Good call on the actual price. Still, that's 1/13th the current cost of the not-yet-operational ITER... Makes me wonder if the stellarator concept will be more feasible in the long run.


I don't think you can draw any conclusions from comparing the prices of Wendelstein 7-X and ITER. ITER is much larger than Wendelstein; it looks like the magnets in ITER are roughly 10 times as heavy, so ITER needs much more material than Wendelstein.


This is an earlier-stage demonstration than ITER. A fairer comparison would be with the cost of JET.


Wow. At my university group, we're continuously trying really hard to get funding for our research, and we ask for less than 1 percent of that budget. It's really, really hard to get because the landscape is quite competitive. But such a project, if we ever got one, could fund a co-worker and me for three years, plus a bunch of partners at other universities.


3.7m euros would last 3 years? That seems... quite expensive.


It does; that's why we typically ask for even less than that in our grant proposals.


The assembly was done mainly by Polish engineers. I bet they were PhDs paid by a university, but even if not, a top world-class engineer from Poland will not cost more than €30/hour. Add about as much again for management overhead. This is not a lot.


I bet a lot of the work was performed by academic indentured servants.


Academic indentured servants that absolutely knew this would be the case but were no doubt tripping over each other to get a spot on the project.


That doesn't make the institutionalized exploitation of their labor any more ethical or palatable.


This is absolutely awesome. So many people coming together to design and construct such a complicated experiment. Excellent work and let's hope it teaches us how to do fusion.


Site was down for me, so here are mirrors:

Fullpage screenshot:

http://i.imgur.com/QIj2NWk.jpg

Google Cache:

http://webcache.googleusercontent.com/search?q=cache:5oATeaa...


What's striking about this is that the investment cost was €370m. That seems like a tiny expense from the perspective of the EU and US budgets. There doesn't seem to be any sense of urgency in funding fusion research.


This is just one relatively small experiment. For comparison, the construction costs of ITER (which is still unfinished) are $14 billion so far.

The US Fusion Energy Sciences program is over $400m a year (source: http://science.energy.gov/~/media/budget/pdf/sc-budget-reque...).


I know, but even $400m / year in a $13T economy is a pittance. Without even discussing the looming costs to clean up global warming, etc.


There's been a lot of funding for fusion research over the past half-century, but politicians in charge of funding have grown weary of "Real soon now" promises.


Out of curiosity, how did you manage the full page screenshot?



You can find the addons in your browser's extension/addon store.


I feel that real life rarely looks more sci-fi than science fiction, but the design of this device is really out there.


That's because the design was procedurally generated by a computer!


Even the name..


It's named after the Wendelstein mountain in Bavaria, a reference to Princeton's "Project Matterhorn", where early work on fusion was done.


Wikipedia link for context: https://en.wikipedia.org/wiki/Wendelstein_7-X


I understand that this is not meant to ever be a power-generating device. It is meant merely to demonstrate that it's possible to sustain a hydrogen plasma for 30+ minutes.

Is there any idea what scale of power generation we'd eventually be able to make with a system of similar size in the future?


On a per-mass or per-nucleon basis, fusion wins hands down: one gram of deuterium yields 10^12 J of energy, or 275 million kcal. Fission gives a comparatively small 20 million kcal per gram of 235U. So fusion is over ten times as potent. Keep in mind that chemical energy, like that in fossil fuels, is capped at around 10 kcal/g. (Source: http://physics.ucsd.edu/do-the-math/2012/01/nuclear-fusion/#...)
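
(A quick sanity check of those orders of magnitude, from the standard D-T reaction energy of ~17.6 MeV; my own arithmetic, not the article's:)

    # Back-of-the-envelope check: D-T fusion releases ~17.6 MeV per
    # reaction, and each reaction consumes one deuteron of ~2 u.
    MEV_TO_J = 1.602e-13    # joules per MeV
    U_TO_KG = 1.661e-27     # kilograms per atomic mass unit
    J_PER_KCAL = 4184.0

    e_per_reaction = 17.6 * MEV_TO_J                     # ~2.8e-12 J
    grams_d_per_reaction = 2.014 * U_TO_KG * 1000        # grams of D used
    e_per_gram = e_per_reaction / grams_d_per_reaction   # J per gram of D

    print("%.2e J/g" % e_per_gram)                       # ~8.4e11 J/g
    print("%.0f Mkcal/g" % (e_per_gram / J_PER_KCAL / 1e6))  # ~200 Mkcal/g

Same ballpark as the quoted figures; the exact kcal number depends on which fuel mass you count.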


The video says that the magnetic cage was optimized by a supercomputer.

Does anyone know why it looks the way it does? Why is it circular? Why are the path and the surrounding magnets twisted like this? Please ELI5.


I'm no expert, but the stellarator design makes confinement of the plasma much easier and more energy-efficient. AIUI, the twists ensure that the particles making up the plasma are evenly affected by the magnetic forces from the coils.

Hopefully someone who knows what they're talking about will be along soon ...


A bit late to the party but I found this video illuminating: https://www.youtube.com/watch?v=vqmoFzbZYEM

Basically (if I understood correctly), it has to do with confining the plasma within the torus without the need for a current induced in the plasma (which is what the tokamak uses).


Can anyone explain, tl;dr style, how this will produce clean energy? I understand the idea, but how is the fuel produced? It sounds like they use small quantities of helium, which is easily accessible.


They're using helium to scale up and test that the various systems work. The actual fusion will likely be done with deuterium and tritium, which are hydrogen isotopes with 1 and 2 neutrons respectively. Hydrogen is readily available via electrolysis, but you have to get those isotopes of it to serve as fuel.

Fusion reactions of this sort are quite clean. The actual reactor will be bombarded by neutrons, so parts of it will become radioactive, but the effect should be much smaller than in a conventional fission reactor.

Edit: as for how the actual energy production works: you get deuterium and tritium up to very high temperatures and get the atoms to collide. The kinetic energy in these collisions is so high that electrostatic repulsion cannot prevent them (normally, like repels like, but this force can only stop so much energy). Once the two nuclei collide, the strong nuclear force takes over, forming helium and releasing one of the neutrons plus a whole bunch of energy.
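
(A hedged sanity check of that energy release, using standard atomic masses rather than anything from this thread: the mass lost in D + T -> He-4 + n, times c^2, gives the famous ~17.6 MeV per reaction.)

    # E = (delta m) * c^2, expressed via the MeV equivalent of 1 u.
    U_TO_MEV = 931.494  # energy equivalent of 1 atomic mass unit, in MeV

    m_d, m_t = 2.014102, 3.016049   # deuterium, tritium (u)
    m_he, m_n = 4.002602, 1.008665  # helium-4, neutron (u)

    delta_m = (m_d + m_t) - (m_he + m_n)  # ~0.0189 u "lost" per reaction
    q_value = delta_m * U_TO_MEV          # ~17.6 MeV, mostly on the neutron

    print("mass defect: %.6f u -> Q = %.1f MeV" % (delta_m, q_value))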


How do we harness that 'whole bunch of energy' from a fission reactor? Is the plan to run a heat exchanger to produce supercritical steam?


Sorry, I meant to type 'from a fusion reactor', since that's what we're talking about here.


Great, so to produce the fuel you can use energy from the reactor, so it's self-sustaining, right? I cannot find any simplified information on how dirty this is, what it takes to produce the energy, and what the waste is. It just says it's clean and revolutionary.


Deuterium is readily available: you can get it from sea water if you've got the right equipment and enough patience. Tritium is rarer, but you can actually use those released neutrons from the reactor to make some, if I understand correctly. Either way, you only need a tiny amount of fuel; fusion reactors leverage E=mc^2 pretty well.

Unlike with fission, there are no leftover radioactive byproducts. There's nothing to bury, and there are no carbon emissions (unlike, e.g., coal). The reactor itself does become radioactive, but this should be a much easier problem to deal with.

Fusion is the process that powers the sun. Harnessing nuclear fusion is one of the ultimate acts of power over nature that humanity can achieve.


Great, an explanation like that should accompany the majority of their articles. Sounds like a great thing, so good luck to them!

Thanks for the explanation!


This is likely because they have a science bias, not a product bias. This particular design will never produce more energy than it consumes. It's specifically made to validate some ideas about reactor design. Fusion reactors become more energy-efficient as you scale them up, but also more expensive. As a result, all such designs have to be vetted through a series of cheaper but "nonfunctional" designs.

It would be hard for them to extol all the virtues of this without knowing whether this design can work. One possible outcome of this test is that the reactor design, the stellarator, is not presently feasible for power-plant use. That would be disappointing, but at least they would have paid relatively little to figure that out.


Yes. With enough electricity, electrolysis is easy; it would likely run on seawater.

There are no "fission fragments", nothing heavily radioactive. There is significant neutron radiation while the reactor is running, which over time will mean the reactor components themselves become low-level radioactive - but we're talking the kind of thing that's dangerous for decades, not centuries. The tritium fuel is somewhat radioactive but would float away quickly if released as it's so light.

Fusion can only occur under active containment; if the reactor loses power or the containment system fails, then all you have is a pile of mildly radioactive plasma (which is not exactly great but, as above, if released into the atmosphere it would fly off into space; there's no risk of it finding its way into rainwater or anything like that). There's no "meltdown" scenario, even theoretically; the physics just doesn't create that kind of possibility.


One tiny nit: the plasma is hydrogen, which burns to create water, so there's every chance it will come back as rainwater; it's just not terribly concerning.


Does that mean fusion reactors could generate "excess" water which could be used to treat drought-affected areas? Sounds like an eventual win/win!


The amount of hydrogen produced is minuscule. The energy released is given by E=mc^2 where m is the difference in mass between the Deuterium and Tritium atoms and the resulting Helium atom and neutron. Even a hydrogen bomb doesn't create a significant amount of water (consider that it weighs a few tons, and only a tiny proportion of its mass is converted).
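
(A rough scale check under assumptions of my own, not from the thread: a hypothetical 1 GW(thermal) D-T plant running for a year.)

    # Each D-T reaction releases ~17.6 MeV and consumes ~5 u of fuel.
    SECONDS_PER_YEAR = 3.15e7
    MEV_TO_J = 1.602e-13
    U_TO_KG = 1.661e-27

    energy = 1e9 * SECONDS_PER_YEAR             # ~3.2e16 J per year
    reactions = energy / (17.6 * MEV_TO_J)      # ~1.1e28 reactions
    fuel_kg = reactions * 5.03 * U_TO_KG        # ~100 kg of D+T per year

    print("fuel burned: %.0f kg/year" % fuel_kg)

Even if a comparable mass of unburned hydrogen leaked out and oxidized to water, that's on the order of a tonne of water a year: nothing that would matter for drought relief.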


There are any number of wonderful things you can do with a supply of cheap electricity :). I believe desalination is largely uneconomic at current electricity prices, but it's used a little in Israel?


Could this also be used to burn waste? There are already experiments with plasma gasification, but those temperatures are much lower (13,000°C).

Maybe some day we will all have a Mr. Fusion Home Energy Reactor in our cars[1]

[1] http://backtothefuture.wikia.com/wiki/Mr._Fusion


When I was growing up, this is the sort of stuff I dreamed we, humanity, would be doing in this day and age. This is the stuff Star Trek is made of (and probably inspired by). Despite all the madness in the world today, I am glad some others share my childhood dream.


AFAIK, Star Trek uses a matter-antimatter reaction in the warp core [0]. We are, however, really far from that technology. Fusion, on the other hand, is within reach.

[0] http://memory-alpha.wikia.com/wiki/Warp_core


Wow. So complex and tightly packed. I wonder if they have any information on how serviceable the unit is, or what its operating life cycle might be. In any case, a fascinating achievement; this is certainly a historic day.


Does anyone here know the theoretical (usable) energy output they can expect to get from this device?


Zero. The Wendelstein stellarator is not intended to be energy-generating; I don't think they even intend any fusion ever to happen in it.


They certainly intend fusion to occur. Otherwise it's a scientific device, not a power plant.


It is a scientific device. See http://www.ipp.mpg.de/16931/einfuehrung which, although it's ambiguous as to whether any fusion at all is intended to happen, makes it clear that actually producing energy is not a goal. http://www.ipp.mpg.de/17064/strahlenschutz does call it a "fusion device" and suggests that after a few years of operation they'll be putting deuterium in and doing something that produces neutrons, so I guess there must be a plan to do some actual fusion. But, still, intended energy output: zero.


They do not intend fusion of the type you'd run in a power plant to occur. There will be some fusion happening, but as far as I understand the sources, that's an unavoidable side effect of running a deuterium plasma. It is a plasma-containment experiment, not a reactor test.


Great achievement. Let's hope we see first fusion reactions and then sustained fusion soon.


Cheaper power will make bitcoin mining profitable again.


Site got hugged to death. Needs a fusion powered server.


Maybe a re-write in ColdFusion? ;)


The readership base over-reacted.


coldfusion!?


or they could host it on steam

(now that's a bad joke)


Very exciting.



