"100M degrees" (Kelvin) corresponds to 10 KeV (kilo electron volts), which is an important figure to exceed for D-T fusion. D-T fusion which is the kind of fusion the ITER Tokamak (a forthcoming fusion reactor and international megaproject) intends to demonstrate.
An older fusion experiment, JET (the Joint European Torus), reached these levels, so this doesn't break new ground, but it matters if this Chinese tokamak is going to provide data useful for ITER.
I will note that it's rather unusual to refer to plasma temperature in Kelvin rather than in keV. I've edited this comment with a few more details to try to make it easier for laypeople to understand.
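If anyone wants to sanity-check the conversion, it's just E = k_B·T. A quick sketch (my own, not from the article; constants are the CODATA values):

```python
# Convert a plasma temperature between keV and Kelvin via E = k_B * T.
K_B = 1.380649e-23     # Boltzmann constant, J/K
EV  = 1.602176634e-19  # 1 eV in joules

def kev_to_kelvin(e_kev):
    """Temperature (K) whose characteristic thermal energy k_B*T equals e_kev."""
    return e_kev * 1e3 * EV / K_B

print(f"10 keV = {kev_to_kelvin(10):.3g} K")  # ~1.16e8 K, i.e. ~100 million degrees
```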
Off topic, but this distinction made me laugh. Like the difference between Kelvin and Celsius would throw everything off.
Not that I'd be that stupid.
My Aussie friend, a scientist, spent about 15 years in the US and said he still used Celsius for temperature in the lab but Fahrenheit for weather. It took him a while to realize he had flipped, and that the two scales had basically no relationship in his head.
I did not know that. I've always thought it was originally defined as units Kelvin, without degree.
But then again, I just noticed that I was off by one in my parent comment, so I shouldn't throw any stones, I suppose.
From ITER's wikipedia page:
>The goal of ITER is to demonstrate the scientific and technological feasibility of fusion energy for peaceful use. It is the latest and largest of more than 100 fusion reactors built since the 1950s. ITER's planned successor, DEMO, is expected to be the first fusion reactor to produce electricity in an experimental environment. DEMO's anticipated success is expected to lead to full-scale electricity-producing fusion power stations and future commercial reactors.
And from DEMO's wikipedia page:
>As a prototype commercial fusion reactor, DEMO could make fusion energy available by 2033.
Also, I can imagine it's a joint project only partially because we can share the cost, I imagine another reason to work together is so that no one country gets this technology first.
http://www.iter.org/faq (several are relevant).
If by "we" you mean US's share, that's 9% of total costs. China, India, Japan, Russia, South Korea, and the US are paying 9% each and EU is paying 46%.
$60bn is actually surprisingly little for research that could change the future of energy production and possibly society as we know it. To put it into context, the Apollo program cost $200bn in today's money and a high speed train between LA and SF is projected to cost $100bn.
That's not true, unfortunately. Yes, it's planned to be attached to the grid, but it won't be a production-ready power station. That will be PROTO.
Basically, the whole schedule slipped further, the US pulled out of ITER so they had to scale it down, and then it was delayed, so as things stand now DEMO will still be a testbed. Some recent DEMO design notes can be found here . A bit dated, but an optimistic outlook, if anything, can be found here  at page 8. Note that DEMO is thought to "resolve" some issues still.
https://en.wikipedia.org/wiki/ITER: "Initial plasma experiments are scheduled to begin in 2025, with full deuterium–tritium fusion experiments starting in 2035."
So, according to Wikipedia, DEMO will build on ITER’s results, but will produce energy before ITER’s first real fusion experiment starts.
More money means you can buy better gear, hire more workers, complete projects faster and run multiple parallel sites to complete various goals at the same time.
Of course there is some point where adding more funding won't speed things up as much anymore, but I doubt we're even close to that point at the moment.
Fission on the other hand could report a lot of results and success and, at the time, seemed to be infallibly safe.
- Oil/coal/natural gas lobbyists and interests which hinder tax payer funded research.
- The fact that there is no guarantee we will ever figure it out and no idea whatsoever as to how much it will cost to figure it out. Investors like returns, in their lifetime, leaving largely tax payer funded research as the greatest source of funds... see above.
And then with government-funded research... if a government figures out fusion, what do you do with it? Do you license it to private industry? Do you make state-owned power plants?
If you give it to private industry, it's going to get to other nations. If it gets to other nations, you lose non-electrical power and create potential strategic issues, which means you are motivated NOT to share the technology.
I wish we could all just get along, fund stuff like this and space exploration, and get over petty politics before our species goes extinct.
Utility-scale PV now costs only $43/MWh. Investing in developing fusion reactors makes very little economic sense compared with capturing the output of the fusion reactor we already have.
The research should still be done, of course. It can have benefits to a future interstellar civilization - but until we're interstellar, PV is far, far more compelling.
A high temperature plasma represents a continuous supply of fusing atoms. The current research at ITER, this place, etc. are attempts to create a persistent environment for fusion. If they can do that, then it creates an environment where research can focus on 1) reduce the energy required to hold it at that temperature (which includes limiting how much plasma leaks out, since leaking plasma drops the temperature), and 2) work out ways to extract the energy created by fusion.
As I understand (and I could be wrong, it's been years since I last read about it), ITER plans to generate a net-negative energy situation (i.e. it'll never produce energy, just consume it) but hopes to create a sustainable plasma field at temperatures that cause fusion.
The more important work on ITER is work around enabling actual power stations using Q=5 and developing the tech to maintain operating fusion reactors (remote robots, etc.)
DEMO, to my knowledge, will then include an actual electric generator to be hooked up to the fusion core.
The temperature of a gas is essentially a measure of the constituent particles' kinetic energy. Higher kinetic energy = higher temperature. 10 keV represents enough kinetic energy for the D-T nuclei to collide fast enough that they overcome the Coulomb repulsion and fuse together.
In addition to that, whether two nuclei fuse also depends on how squarely they collide. A glancing blow intuitively lets both nuclei push each other away a lot more easily than a head-on collision does.
A graph of the fusion rate such as  shows that even at much lower average temperatures, fusion will occasionally happen when two higher-than-average nuclei collide head-on. As the temperature gets higher, this rate increases as more particles have enough energy and fewer rely on those head-on collisions. The peak rate for D-T according to this graph is about 70 keV.
I can't speak to why the grandparent's link referenced 100 keV, as it's been a decade since I last studied this and I'm very rusty.
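For the curious, the "occasional fusion at lower temperatures" effect falls out of the high-energy tail of the Maxwell-Boltzmann distribution. A rough sketch of the fraction of particles above a threshold energy (my own illustration; the closed form is the standard 3D Maxwell-Boltzmann result, and the 30 keV threshold is just an arbitrary example):

```python
import math

def fraction_above(e_threshold_kev, t_kev):
    """Fraction of particles in a 3D Maxwell-Boltzmann distribution with
    kinetic energy above e_threshold_kev, at plasma temperature t_kev.
    Closed form: erfc(sqrt(x)) + 2*sqrt(x/pi)*exp(-x), where x = E/kT."""
    x = e_threshold_kev / t_kev
    return math.erfc(math.sqrt(x)) + 2.0 * math.sqrt(x / math.pi) * math.exp(-x)

# Even at 10 keV, a noticeable fraction of nuclei exceed 3x the temperature:
for t in (1.0, 10.0, 70.0):
    print(f"T = {t:5.1f} keV: fraction above 30 keV = {fraction_above(30.0, t):.2e}")
```

The point being: raising the temperature shifts the whole distribution, so the tail above the effective fusion threshold grows enormously.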
What's the difference between "hard" and "fast"? What does pushing "hard" mean?
Conceptually, there are two main methods of fusion: inertial confinement and magnetic confinement.
In inertial confinement, lasers are shot at the plasma to squeeze it together so the pressure (and thus temperature) increases until fusion occurs. This is also what happens in stars, except they use gravity, not lasers. Conceptually, this is what I alluded to when I said pushing it 'harder'.
In magnetic confinement, the pressure/temperature is increased by shooting electrical currents through the plasma, among other techniques that I'm not as familiar with. The volume hasn't really decreased, but the kinetic energy and pressure of the plasma have increased, so it's the same principle; this is conceptually what I alluded to when I said pushing it 'faster'.
In either case, the idea is to somehow increase the plasma temperature, which raises the kinetic energy of the particles enough that they overcome the electrostatic barrier. It was just really badly worded by me, and I apologise for that!
Fast -> high kinetic energy; since the particles won't acquire mass, this is basically speed
Hard -> more attempts to make it happen (a higher particle interaction rate)
But I might be wrong since plasma is weird stuff.
SciFi likes to talk about antigravitation. But supergravitation would be cool as well :)
I don't know much about fusion, but my guess is the 10keV is impressive because it is a self-sustaining fusion reaction rather than being impressive because of the absolute energy of the reaction?
edit: someone down the page mentioned that containment is the issue. In a CRT you are just accelerating an electron across a few thousand volt potential and slamming it into the screen.
Right, and that was my understanding. It's easy to accelerate a particle or a stream of them to high energy. It's another thing entirely to contain a gas/plasma at those energies. The distinction becomes more obvious when they flip back and forth between impressive-sounding temperatures and simple keV measures.
In order to keep the plasma at temperatures where fusion can occur, rather extreme measures have to be taken. In the tokamak approach, the plasma is placed in a toroidal vacuum chamber and "suspended" in the center of the torus by electromagnets that line the chamber's walls. At such high temperatures the plasma is so energetic that it is very hard to contain such fast-moving particles. If the plasma "escapes" the confinement and contacts anything (i.e. the walls of the tokamak), it rapidly cools down to temperatures below where fusion can happen.
The immense engineering challenge here is to heat plasma to ridiculous temperatures, and keep it confined in a very small volume at great temperature and pressure to mimic conditions that give rise to nuclear fusion in the center of stars.
I see Wendelstein 7-X is attempting 30 minute burns soon https://www.ipp.mpg.de/4413312/04_18?c=4313165
This is not exactly true. Inertial confinement fusion has conditions that are similar to stars. The engineering challenge for magnetically confined fusion is to keep the low-density plasma confined for long durations so fusion can occur.
For anyone interested in further reading, look up the Lawson Criterion.
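In code form, the criterion is usually stated as a minimum "triple product" n·T·τ_E; for D-T the commonly quoted ignition threshold is around 3×10^21 keV·s/m³. A sketch with made-up plasma parameters (nothing here is a real machine's spec):

```python
# Lawson triple-product check for D-T fusion (sketch; illustrative numbers only).
DT_TRIPLE_PRODUCT_MIN = 3e21  # keV * s / m^3, commonly quoted D-T ignition threshold

def meets_lawson(density_m3, temp_kev, confinement_s):
    """True if n * T * tau_E clears the D-T ignition threshold."""
    return density_m3 * temp_kev * confinement_s >= DT_TRIPLE_PRODUCT_MIN

# Hypothetical tokamak-like parameters: density, temperature, energy confinement time.
n, T, tau = 1e20, 15.0, 3.0   # m^-3, keV, seconds
print(meets_lawson(n, T, tau))  # 1e20 * 15 * 3 = 4.5e21 -> True
```

It also makes the trade-off obvious: magnetic confinement runs low density for long times, inertial confinement runs enormous density for nanoseconds, and both can satisfy the same product.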
That's a relief. I used to think that «if the plasma "escapes" the confinement and contacts anything (ie. the walls of the Tokamak) it rapidly…» disintegrates everything around or, when the power is huge enough, causes an apocalypse…
Nuclear fission reactions can continue on their own for quite a while. This is one of the reasons they can be so dangerous.
Unstable in the sense that it is hard to maintain fusion conditions, not in the Hollywood sense that it blows up if you look at it sideways.
- When two hydrogen nuclei combine, they produce an enormous amount of energy. That process is known as nuclear fusion.
- Light nuclei have to be heated to extremely high temperatures, so it is challenging to create a controlled, safe fusion reactor that produces more energy than it consumes. Once we have one, we'd have a near-limitless source of clean energy.
- Nuclear fusion does produce radioactive waste. However, in contrast to fission waste, it is short-lived and decays to background levels in a very short time.
- Tokamaks try to do just that.
- While the products of the fusion reaction are short-lived, operating a fusion reactor will activate materials in the reactor and create some longer-lived radioisotopes.
- Unlike a fission reactor, which is loaded with months to years worth of fuel, a fusion reactor would have fuel constantly injected. So operator action to stop injecting fuel would stop the nuclear reaction.
> It was shown that wait times are required in the order 50–100 years for the remote handling recycling option and hundreds (Li4SiO4) to thousands (Eurofer) years for hands-on handling.
In some dystopian future, an AI figures out this is a most efficient use of matter and the entire Earth gets used as fuel for fusion reactions
(somebody posted that video on another recent HN fusion thread)
>The ARC design aims to achieve an engineering gain of three (to produce three times the electricity required to operate the machine) while being about half the diameter of the ITER reactor and cheaper to construct. (Wikipedia)
It hasn't been built because of the $5bn or so cost, though given global warming / saving-the-planet type issues I'd be happy enough as a taxpayer to have governments fund one, and maybe knock 1% off the defence budget to cover it. It'd probably do more for world peace than churning out a few more F-35s.
Yes, we have been building dozens of these. But that's also because it's fundamentally a very difficult project. Do you somehow expect airplanes to go from the Wright Flyer to a jumbo jet in fewer than a dozen steps?
There's also been plenty of research into alternatives. Germany is building a stellarator.
There's also a whole cohort of startups looking at more speculative ideas.
I'm not saying there can't or shouldn't be more research into more alternatives. I am saying it's unreasonable to look at tokamak research as some sort of dead horse we're flogging.
On this note, do we have any reason to be particularly confident that magnetic confinement will ever break even and produce surplus energy? In nature fusion seems to occur through gravitational compression, so what makes us sure that we can simulate this by other means that will ever amount to more than just demonstrations?
"The ITER thermonuclear fusion reactor has been designed to produce a fusion plasma equivalent to 500 megawatts (MW) of thermal output power for around twenty minutes while 50 megawatts of thermal power are injected into the tokamak, resulting in a ten-fold gain of plasma heating power."
Edit: I think this is what I meant: https://en.wikipedia.org/wiki/Lawson_criterion
I sometimes hear contradictory hearsay along the lines of "unlimited energy" versus "the reactor would have to be fed constantly".
If the cost is low, a brave new world with fusion replacing fossil generation as quickly as they can be built. Energy-inefficient processes like desalination and cracking water for hydrogen become attractive.
If the cost is very high, it may be that renewables have stolen fusion's market slot. In that case we'll see some national prestige projects, but against a broader renewable energy market.
Fusion might end up looking a lot like fission, but hopefully with a lower perceived safety risk and thus more public acceptance.
Granted, modern fission designs aren't actually unsafe, but that doesn't matter for PR purposes.
— random guy on Quora
Did the reactor produce more energy than was put into it? I just don't understand enough about the field to figure that out by reading the article.
The technical term for a "doughnut-shaped area" is "torus"; "tokamak" is an abbreviation for the Russian for, "toroidal chamber with magnetic coils".
I.e. going from sensor readings to inference of the plasma state.
I thought the stellarator had already achieved 100M Kelvin?
Now, I don't expect politics to allow sharing of fusion energy to help other countries.
You are very wrong. ITER is a $20B international collaboration to construct an energy-positive fusion reactor. The participants, including China, the underwriter of the experiment detailed in this thread's article, are very much sharing.
But yes, the first ones to 'crack' the problem will have a head start in the commercial fusion power plant market, though I don't think it will last very long. As you say, most of the research is being published, and even if somebody manages to initially keep that final 'dot on the i' secret, it wouldn't take other researchers long to figure it out.
While a 20-30% reduction in energy cost would be great, the really important change is that fusion is much more scalable and the fuel is far easier to get. You don't need to dig up coal or drill for oil, both of which can be limited. With a working fusion reactor, a country like Singapore could be energy independent in a way it never could with any other energy source.
This independence would mean no more need to mess around in regions with these resources (think Middle East) or have your country's energy rely on a third party you'd rather not rely on (think Germany and others and their reliance on Russian gas).
A Piece of the Sun: The Quest for Fusion Energy - thoroughly enjoyed this book.
For instance, it is chemically possible to combine water (either atmospheric or from a normal water source) with atmospheric CO2 to chemically synthesize hydrocarbons. It just takes energy. So you could have a single, absolutely massive fusion plant next to an atmospheric fuel refinery and use the hydrocarbons for fuel storage and distribution. You could do likewise with hydrogen and oxygen if you wanted to make fuel cells or rockets. And this would probably be cheaper than refining fossil fuels. You could actually run OPEC out of business this way, make the electric car obsolete, make airlines carbon-neutral, etc., etc.
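As a rough lower bound, synthesizing a fuel costs at least its heat of combustion, and real processes are far less efficient. A back-of-envelope sketch where the efficiency and electricity price are pure assumptions:

```python
# Back-of-envelope: minimum electricity needed to synthesize one liter of
# gasoline-equivalent fuel from CO2 + H2O. All figures are rough assumptions.
LHV_GASOLINE_MJ_PER_L = 32.0    # lower heating value of gasoline, approx.
PROCESS_EFFICIENCY = 0.5        # assumed round-trip synthesis efficiency
ELECTRICITY_USD_PER_KWH = 0.02  # hypothetical cheap-fusion electricity price

mj_needed = LHV_GASOLINE_MJ_PER_L / PROCESS_EFFICIENCY
kwh_needed = mj_needed / 3.6    # 1 kWh = 3.6 MJ
cost = kwh_needed * ELECTRICITY_USD_PER_KWH
print(f"{kwh_needed:.1f} kWh -> ${cost:.2f} per liter-equivalent")
```

So the whole scheme hinges on the electricity price: at today's grid prices synthetic fuel is wildly uneconomic, but a few cents per kWh changes the picture.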
Climate change? Just extract the atmospheric CO2 using cheap fusion energy and turn it into an easily sequestrable form. Nitrogen fertilizers? You can make those from the air too. Drought? Use fusion power to run desalination plants. Arable land becomes a non-issue with fusion because vertical farming becomes easy. Every city could just grow whatever they needed in exactly the right climate conditions in a set of enclosed vertical farms, though they might not need to because energy will be so cheap that there's no problem shipping the stuff from Honduras anyway.
Computation scales with power, too. More power, more computation, except computation produces heat, which you need even more power to chill.
I do disagree with you on hydrogen fuel cells replacing electric cars, though. I think battery-powered cars will be better than hydrogen-powered ones. The power storage is more reliable, and the power transfer is simpler (lines as opposed to truck delivery or pipelines). And if we assume it would be done on-site at stations, they would already have power lines, so just use those!
Not what I meant to imply. You’d just use old-fashioned hydrocarbons, except reconstructed from atmospheric CO2 and water.
It's probably not that easy; the reactors are extremely complicated and expensive to build, and I'm sure operating them isn't cheap either.
And the one thing I haven't heard much about yet is the yield - how much energy can it generate vs how much will it cost to run.
I don't think it'll be economically viable. I'll be happy to be proven wrong though.
I don't think current designs like the one discussed here will be. But I'm hoping it will lay the groundwork for much more useful reactors in the future.
With all the talk about the LHC possibly producing mini blackholes or magnetic monopoles that could potentially cause protons to decay spontaneously, I don't have enough nuclear physics background to know whether we are inherently safe, or if there is a real risk here.
Basically it's really, really safe as far as byproducts go. And yeah, it's a teeny sun that instantly goes out if you stop feeding it juice, which sounds bad, but it's all tiny stable particles, not the big slowly-decaying scary ones.
If a fusion reactor is breached, the plasma will likely dump its thermal energy into the air (likely causing a minor detonation, whose strength depends on the energy in the reactor and the amount of fuel). Additionally, it'll leak some short-lived isotopes and maybe create a few long-lived ones.
All in all, a detonated fusion reactor is likely safe to walk through within the same year it exploded, if not significantly earlier.
As for some unexpected pathway, energies like this are reached in stars all the time, and they don't spontaneously explode on a regular basis. If you're talking about a basic science experiment being the Great Filter -- well, maybe. I'm skeptical.
Well, they kind of do…
But really, the only reason why stars manage to stay in hydrostatic equilibrium is because of gravity, which we really can’t do on Earth so we settle for fancy magnetic confinement vessels.
On a global scale it negligibly increased the ever-present background radiation. It's not a good thing, but far from poisoning.
The bomb tests did increase background radiation for us; irresponsibly so in my opinion. But so does burning coal. We have been lifting the level of background radiation over the natural level for centuries by now.
I don't see how sunk steel would be any more valuable than steel from freshly mined ore. But I like to be surprised about these things :-)
Steel production uses air from the atmosphere, so the steel picks up the increased background radiation while it is refined. It may be possible to scrub the radioactive components from the air to avoid contaminating the steel, but I expect the cost would be prohibitive (at the very least, more expensive than getting it from old battleships).
Try pointing a Geiger counter at a brick or, even better, a plaster wall, and be surprised at the amount of radiation you register over the course of a day.
Seawater is about 0.04% potassium, so this is about 5.35 * 10^14 tons of potassium.
Potassium contains a naturally occurring radioactive isotope (40K); the radioactivity of potassium is 31 Bq/g. Hence the natural radioactivity of all the potassium in seawater is 1.66 * 10^22 Bq.
For comparison, again according to Wikipedia:
> In May 2012, TEPCO reported that at least 900 PBq had been released "into the atmosphere in March last year  alone"
...which is 9 * 10^17 Bq, or about 1/18,000 of naturally occurring radioactivity in the ocean due to potassium alone.
(We didn't even start on the stuff that is commonly considered "radioactive", like all the uranium and thorium lying beneath where you are.)
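The arithmetic above is easy to reproduce; a quick sketch using the same figures (the ocean-mass constant is the commonly quoted ~1.34×10^21 kg):

```python
# Reproducing the seawater-potassium arithmetic from the figures stated above.
OCEAN_MASS_KG = 1.34e21        # total mass of Earth's oceans, approx.
K_MASS_FRACTION = 0.0004       # ~0.04% potassium by mass
K_ACTIVITY_BQ_PER_G = 31       # natural radioactivity of potassium (from 40K)

k_mass_g = OCEAN_MASS_KG * K_MASS_FRACTION * 1e3   # kg -> g
total_bq = k_mass_g * K_ACTIVITY_BQ_PER_G
fukushima_bq = 900e15          # 900 PBq, TEPCO's May 2012 figure quoted above

print(f"Ocean potassium activity: {total_bq:.2e} Bq")               # ~1.66e22 Bq
print(f"Ratio to the 900 PBq release: {total_bq / fukushima_bq:,.0f}x")
```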
* Meanwhile we're busy burning fossil fuels, increasing the amount of CO2 in the atmosphere by ~30%.
As a power source fusion is pretty good, which is why it’s such a target for research.
D-T fusion, which is the easiest to achieve and perhaps sustain, directly produces He nuclei (that is, alpha particles) and rather energetic neutrons at 14 MeV. Those neutrons, aside from being a form of ionizing radiation themselves, are bound to transmute some of the surrounding material into radioactive isotopes. So, I don't think that "no toxic or radioactive byproducts" is correct. However, the results are easier to handle than those of fission reactors.
You can't dismiss the technology based on that incident. Just like we don't ban cars because a lot of people don't operate them properly.
As far as I remember from the news at the time, the tsunami was terrible, but not unprecedented. If this obvious risk was ignored, what other risks are being ignored elsewhere?
I don't know the facts of the tsunami whether it was unprecedented or not, but considering the magnitude of the quake I'm guessing it was one of the largest tsunamis to ever hit Japan. This is speculation on my part.
They didn't ignore the risk of tsunamis, they had precautions against them but they weren't up to par. It was a series of malfunctioning safeties that caused the accident. The backup power generators conked out, and the backup to the backup was washed away by the floods. And the floods only managed to get that far because the protective walls weren't enough.
"Build better walls" seems like a trivial problem to solve, don't you think?
International regulatory bodies could also be more proactive in finding these flaws prior to accidents.
It's not a hard problem to solve in the long run. It'll be easier and quicker than finding a viable non-nuclear energy option anyway.
He was referring to the core meltdown, not the very specific fault mechanism. I should have made it clear. Otherwise, it would be an uninteresting technicality.
The entire Fukushima incident makes me suspect it's a result of defining an exact fault model and then optimizing to the model. This way, if the fault slightly exceeds the model, the result is not graceful degradation, but catastrophic failure.
Can we put the generators below sea level? Sure, the sea wall is high enough, no need to worry about that at all.
> "Build better walls" seems like a trivial problem to solve, don't you think?
Yes, but is it a robust solution?
The earthquake itself killed over 15000 people.
Imo it's a massively overblown disaster. Yes, it's bad, especially the environmental effects, but it's absolutely nothing compared to the earthquake and tsunami itself.
It is also irrelevant to compare the earthquake and tsunami to the nuclear disaster. Earthquakes and tsunamis are unavoidable natural disasters, but the Fukushima Daiichi disaster could have been easily avoided.
You're also ignoring the massive costs of the disaster and the monumental scale of the cleanup. The official costs put it at $188B and counting.
The Japanese built Fukushima to standards they felt were acceptable, and they were wrong; now the ocean is being polluted for thousands of years to come, with the continuous risk of things getting worse at Fukushima.
In the same manner, they may be building this fusion reactor or LHC with what they feel is acceptable risk, but could they be wrong, with extremely catastrophic results? This is something I don't know but would love to know the answer to.
Then don't spread FUD?
The fact is, fusion generates neutron radiation that destroys the reaction vessel, making it an unviable technology. Nobody takes it seriously as a source of energy, aside from uninformed people. As cool as the idea of controlled fusion is, it is and will remain science fiction.
And really, you just sound like every crank ever who thought X technology was totally unfeasible and always would be -- until it wasn't. So current attempts haven't found a solution to the reaction-vessel destruction problem. That does not mean someone in the future couldn't figure it out.
What this does is ensure that even operating right at the limits of neutron damage to the wall materials, the volumetric power density of a DT fusion reactor will suck. And that will destroy the economics.
This is a regular tokamak design, with a high chance of success since we understand tokamaks very well at this point. Various startups have more speculative designs that deal with the issue in other ways.
The power density of an ARC reactor will be around 0.5 MW/m^3. In comparison, the power density of a PWR reactor vessel is 20 MW/m^3.
Replacing the entire inner vessel once a year would be an operational nightmare. For one thing, it ensures that the building housing the reactor will have to be very large, with very large secondary bays where the intensely radioactive material of a spent reactor vessel can be moved and disassembled (generating radioactive fragments and dust).
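To make the power-density gap concrete: for the same thermal output, vessel volume scales inversely with power density. A quick sketch using the figures quoted above:

```python
# Vessel volume needed for 1 GW thermal at the quoted volumetric power densities.
TARGET_MW_THERMAL = 1000
densities = {"ARC-style fusion": 0.5, "PWR fission": 20.0}  # MW/m^3, from above

for name, d in densities.items():
    print(f"{name}: {TARGET_MW_THERMAL / d:,.0f} m^3")
# fusion needs ~2,000 m^3 vs ~50 m^3 for the fission vessel: a 40x larger core
```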