It is an important milestone. But to have a commercially viable fusion reactor, you'll need a factor of 50-100 more energy out than in to make up for inefficiencies in electricity generation using this kind of scheme.
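For anyone wondering where a factor of 50-100 comes from, here's a back-of-envelope sketch. Every number below is an illustrative assumption (a ~10% wall-plug diode-pumped laser, ~40% thermal-to-electric conversion, a small blanket energy multiplier, and a 25% recirculating-power budget), not a NIF spec:

```python
# Rough power-plant bookkeeping for inertial fusion energy (IFE).
# All numbers are illustrative assumptions, not NIF specifications.

def required_target_gain(eta_laser, eta_thermal, blanket_mult, recirc_frac):
    """Target gain G needed so the laser consumes only `recirc_frac`
    of the plant's gross electrical output.

    Balance: E_laser / eta_laser == recirc_frac * G * E_laser * blanket_mult * eta_thermal
    =>       G == 1 / (recirc_frac * eta_laser * blanket_mult * eta_thermal)
    """
    return 1.0 / (recirc_frac * eta_laser * blanket_mult * eta_thermal)

# Assumed: 10% wall-plug laser, 40% thermal-to-electric, 10% extra energy
# from the neutron-absorbing blanket, 25% of gross output fed back to the laser.
G = required_target_gain(eta_laser=0.10, eta_thermal=0.40,
                         blanket_mult=1.1, recirc_frac=0.25)
print(f"required target gain ~ {G:.0f}")  # lands around 91, in the 50-100 range
```

Tweak the assumed efficiencies and the answer moves, but it's hard to get the required gain below a few tens.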
The real story here is that this facility allows the US to do nuclear weapons research without violating the nuclear test ban treaty. If the goal was to develop a commercially viable fusion reactor, the $3,500,000,000 spent so far could have been put into projects geared towards small scale fusion experiments investigating novel confinement schemes.
Finally someone that's actually in the fusion field. Could you say whether or not the NIF research represents a practical path towards a fusion based power source? My understanding is that it would at least be capable of capturing some much needed high pressure empirical data necessary for validating theory.
On a side note concerning the NIF funding, the US gov will spend huge sums of money studying Nuclear weapons (including their disposal) regardless of the NIF project. If it is at all possible to combine that research with other fields then really it's a perk. I'm not justifying the expenditure, simply attempting to frame the situation in a different light.
Finally, as a fellow scientist with funding woes, I feel your pain.
I've worked in ICF for nearly a decade. I think NIF research is a necessary step toward an ICF-based power source. If NIF demonstrates ignition, it can be used to validate our physics models of ICF (turbulence, fusion product transport, thermal conduction, etc.). These physics questions are shared by all ICF approaches. However, a fusion power plant will use entirely different laser technology, and most likely a different target design. After showing that the physics works, there is still a lot of engineering work needed before building a reactor.
 An ICF power plant will be pulsed at 1-10 Hz. NIF is a flash-lamp-pumped glass laser, which takes ~12 hours to cool between shots. A power plant would likely use a diode-pumped solid-state laser, since these can meet the required repetition rate.
 The indirect drive target that LLNL is pursuing on NIF is not very efficient. You spend a lot of energy heating the hohlraum. Directly driven targets (blast the capsule directly rather than heating a gold can to make x-rays) should be much more efficient. There are also several ideas for ways to ignite a target more efficiently (shock ignition, fast ignition), but these need additional laser hardware.
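The repetition-rate point translates directly into plant-level power. A sketch with assumed round numbers (2 MJ laser, target gain 100, 10 Hz, and the same assumed efficiencies as above; none of this is a real design):

```python
# Back-of-envelope average power for a pulsed IFE plant.
# Assumed (not NIF) numbers throughout.
laser_energy_MJ = 2.0    # laser energy delivered per shot
target_gain     = 100.0  # fusion energy out / laser energy in, per shot
rep_rate_Hz     = 10.0   # shots per second
eta_thermal     = 0.40   # thermal-to-electric conversion
eta_laser       = 0.10   # laser wall-plug efficiency

fusion_thermal_MW = laser_energy_MJ * target_gain * rep_rate_Hz  # MJ/s == MW
gross_electric_MW = fusion_thermal_MW * eta_thermal
laser_draw_MW     = laser_energy_MJ * rep_rate_Hz / eta_laser    # recirculated
net_electric_MW   = gross_electric_MW - laser_draw_MW

print(fusion_thermal_MW, gross_electric_MW, laser_draw_MW, net_electric_MW)
# roughly: 2000 MW thermal, 800 MW gross, 200 MW back to the laser, ~600 MW net
```

That's a respectable power-plant scale, which is why the rep-rate and laser-efficiency requirements matter so much.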
@sam Though I work in ICF, I was sad to see the innovative confinement concepts (magnetic confinement) cut a couple of years ago. I think it is short-sighted. Fusion needs to work, and to get smaller, and we should keep our options open. Hope you managed to get a thesis out before the walls fell.
Good to see a commenter with an ICF background (most of what I know of ICF was only from water cooler chit-chat).
I didn't manage to get a thesis out. I left the PhD program before the funding was cut, but the writing was on the wall.
Yes, it is short-sighted to cut funding for small-scale plasma confinement concepts. They are high-risk / high-reward projects which are not expensive (~$1M). And even if they don't pan out as viable confinement schemes, they make great training platforms for graduate students.
I thought the goal was to have a self sustaining reaction that would only require laser pulses to hold it in place. Is a continuously pulsed laser necessary to provide thermal energy to drive the reaction?
Also I've read (sorry no citation) that the indirect drive was necessary to ensure even dispersal of the heat generated by the initiating laser blast.
Any clarification would be much appreciated, this isn't my field.
ICF is intrinsically pulsed (magnetic confinement like ITER is completely different). You use a laser to spherically compress a 1 mm diameter spherical capsule. The capsule implodes, stagnates, and blows apart (releasing energy in the explosion). Then you do it again 0.1 seconds later.
NIF uses an "indirect drive" design. Instead of directly illuminating the spherical target with a bunch of lasers, you blast the inner surface of a gold cylinder with the shell at the center of the cylinder. The cylinder gets hot, emits x-rays, which are absorbed by the capsule. The x-ray drive tends to be "smoother" than direct illumination.
The big problem with ICF is hydrodynamic stability. It is like trying to squeeze a water balloon with your fingers. If you don't squeeze it perfectly symmetrically, it will squirt through your fingers and pop rather than getting compressed by a factor of 20.
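To put a number on how unforgiving that squeeze is: the classical Rayleigh-Taylor growth rate is gamma = sqrt(A*k*g), and a surface ripple grows like exp(gamma*t). The numbers below are made up but ICF-flavored, and real designs lean heavily on ablative stabilization, which this classical formula ignores:

```python
import math

# Classical Rayleigh-Taylor growth: gamma = sqrt(A * k * g),
# perturbation amplitude grows as exp(gamma * t).
# Illustrative (assumed) numbers, not a NIF simulation:
A   = 0.5     # Atwood number at the unstable interface
lam = 50e-6   # perturbation wavelength: 50 microns
g   = 1e14    # shell acceleration in m/s^2
t   = 2e-9    # duration of the acceleration phase: 2 ns

k = 2 * math.pi / lam           # perturbation wavenumber
gamma = math.sqrt(A * k * g)    # growth rate, 1/s
growth = math.exp(gamma * t)    # amplification of the initial ripple
print(f"growth factor over {t*1e9:.0f} ns ~ {growth:.0f}x")
```

A ripple amplified by a factor of a hundred-plus during the implosion is why capsule surface finish and drive symmetry requirements are so brutal.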
>The big problem with ICF is hydrodynamic stability. It is like trying to squeeze a water balloon with your fingers. If you don't squeeze it perfectly symmetrically, it will squirt through your fingers and pop rather than getting compressed by a factor of 20.
My layman's understanding of the subject is that you want (and get) anything but a self-sustaining reaction with fusion, and that's actually one of its safety features compared to fission (where a chain reaction is allowed to happen in a controlled way -- if something goes wrong the reaction keeps going, out of control).
You do want to use the energy from one reaction to power the next, but not directly. No one's actually trying to create a star on the surface of the earth.
As with you, though, not my field so someone who knows better can feel free to correct me.
Not exactly. Off the top of my head I can't think of a reactor that's unstable in operation. Hands-off (like that TV series Life After People, where the people all disappear), almost all stabilize and eventually shut down on their own. Maybe those crazy graphite reactors are not inherently stable.
There's two linked issues.
Fission reactors get about 10% of their heat from decay products (the "waste") decaying away. That means there is no instant off switch. The "off" position still puts out ~10% for hours/days/weeks (well, it decays away eventually...). As the Japanese found out the hard way, 10% of a huge amount of power is enough residual heat to cause an awful disaster.
The other is the surface-area-to-volume ratio. A couple of gigawatts thermal, like the Japanese plants, and there's not enough surface area to cool it without giant pumps when shut down. A couple of MW, like a nuclear sub, and the surface-area ratio is better; theoretically you could probably walk away from one without anything awful happening (the people who know aren't talking). With fusion you're talking about something a millimeter across, not multiple meters. Walk away, or have a computer crash, or whatever, and it seems impossible for something that small to cause much damage.
There are some practical physics reasons why making a fusion reactor the size of the sun is really easy and the smaller you go the harder it gets, but there are practical engineering reasons why making one bigger than a millimeter would be a huge PITA. On the other hand if you could make a fission reactor the size of a millimeter that would certainly solve a lot of painful thermal engineering problems, but the physics just doesn't work (long story...)
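The residual-heat point above can be sketched with the standard Way-Wigner decay-heat approximation, a textbook estimate good to maybe tens of percent (it actually gives closer to 6-7% at the instant of shutdown, same ballpark as the ~10% quoted above):

```python
# Way-Wigner approximation for fission decay heat after shutdown:
#   P/P0 ~ 0.066 * (t^-0.2 - (t + T)^-0.2)
# t = seconds since shutdown, T = seconds of prior full-power operation.
# A rough textbook estimate, not a plant-grade calculation.

def decay_heat_fraction(t_s, T_s):
    return 0.066 * (t_s**-0.2 - (t_s + T_s)**-0.2)

T = 3600 * 24 * 365  # assume one year of operation before shutdown
for t, label in [(1, "1 s"), (3600, "1 hr"), (86400, "1 day"), (30 * 86400, "1 month")]:
    print(f"{label:>8}: {decay_heat_fraction(t, T) * 100:.2f}% of full power")
```

The point being: the curve falls fast at first, but a day later a 3 GW-thermal plant is still shedding on the order of ten megawatts, which is exactly the "no instant off switch" problem.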
This is definitely true, and the BBC totally omitted these facts. The official NIF announcement tried a little bit to cover their tracks. They posted some flashy numbers about the total energy output from neutrons, but nestled in the bottom of the page is the total energy dumped into the entire system:
In comparison, the fusion power produced was a paltry 8 kilojoules, which they compare to the "energy required to form the plasma".
For this technology to be actually effective, there needs to be a multiplier of several thousand. As other commenters have mentioned, this is mostly just nuclear weapons development.
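To make the shortfall concrete, take the 8 kJ figure together with two widely cited NIF numbers (~1.8 MJ of laser light on target, and on the order of 400 MJ drawn from the capacitor banks per shot; treat both as assumptions here rather than figures from the article):

```python
# Gap between fusion output and energy in, for the shot discussed above.
fusion_out_J      = 8e3    # fusion yield quoted in the announcement
laser_on_target_J = 1.8e6  # ~1.8 MJ of UV laser light (widely cited NIF figure)
wall_plug_J       = 4e8    # ~400 MJ per shot from the capacitor banks (ditto)

print(f"vs laser light on target: {laser_on_target_J / fusion_out_J:,.0f}x short")
print(f"vs wall-plug energy:      {wall_plug_J / fusion_out_J:,.0f}x short")
```

So it's a factor of a couple hundred against the laser light alone, and tens of thousands against what the building actually draws, though to be fair nobody proposes powering a plant with NIF's flash-lamp laser.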
[I'm positive that NIF's predecessors also achieved ~~ignition~~ fusion on smaller scales.]
The real trick is that (1) you need to get out more than you put in (exceed break-even), (2) you need to be able to harness the energy produced by the ignition, and (3) you need to be able to do it over and over again, producing reliable power.
You're right. In my head "ignition"=fusion, so I was auto-translating as I was skimming. They're looking for something slightly different, and probably using energy output (among other things) as indications of self-sustained fusion, which we've certainly obtained in bombs (fission explosion sets off fusion reaction ☞ big bada boom).
Are you sure you're not thinking of JET? ITER's not scheduled to be operational until 2020 AFAICR.
Incidentally, when I went to see JET (it's up the road from me, not that far from Oxford), the scale of the thing was enormous. They mentioned that they'd been able to sustain plasma for 600 seconds. I've got some pictures up on Flickr.
One of my favorite stories is the disruption. This happens when things don't go just right, and all the plasma with its energy squirts onto one spot. This typically puts the equipment in jeopardy. People try to prevent it, but as they push the boundaries of what's been done, it's going to happen.
Keep in mind that there's only about as much mass in the entire chamber as in a few cubic centimeters of air; it's basically a vacuum. But what little there is is so hot, and there's so much energy wound up in the fields. When JET had a disruption, the entire machine, the largest tokamak in the world, LIFTED ITSELF OFF THE GROUND AND JUMPED A FEW FEET OVER.
The LHC beam dumps at CERN are also at that holy-shit-physics scale. A more explanatory article also covers the superconducting magnet quenches, in which up to 10 GJ of stored energy suddenly decides it really ought to be following Ohm's law again.
"50-100 more energy out than in to make up for inefficiencies in electricity generation using this kind of scheme."
Completely true. Theoretically, the NIF target has enough fuel to produce 10-20x the amount of laser energy driving it. Once things work, it isn't a huge step to get to an energy gain (energy_out/energy_in) of ~50. That said, the real world is always more complicated and NIF has yet to ignite.
Actually, the site you linked lists 3 goals. Weapons research is only one of them. The other two are commercial and scientific research. Those are not mere byproducts, but explicit goals of the facility.
He's not saying the facility is a sham; he's simply saying its goal is not to generate electricity. It's a research tool to provide data to improve empirical models. Those models of course have all sorts of uses, and certainly a big one is nuclear weapon modelling.
Yes, but you have to pay for the power plant, and being just a bit over unity won't cover the expense of the next stage in the chain.
You have to be enough over unity to pay for another plant plus surplus to actually use. In which case you can just build a bunch of plants - you don't need to chain them. There are plenty of other power sources you can use to bootstart the power plants.
Cover the bases: you can't be sure someone else won't discover a new type of nuclear weapon or something you haven't thought of. The big one would be if someone figured out how to get kiloton->megaton yields, from a weapon which didn't produce radioactive fallout.
Sure, target a city and it's functionally indistinguishable, but what about against military formations or bunkers? It would be a big game changer if someone eliminated the fallout problem, since it changes how one could plan to use nukes in a defensive posture.
So you do the research to make sure you're always within a year or two of whatever the next state-of-the-art will be. Being the dominant hegemon has a lot of cost-of-living expenses.
It's the ongoing expense. Look over time: what does it cost to maintain the ability to use one?
Inflation adjusted cost of the entire B-52 program including all maintenance and crew personnel and their training? I donno, like a billion?
A missile field is cheaper to maintain (I think?) and not much more to develop, so its probably a bit under a billion.
A nuke-launching sub might cost a billion, but it holds a lot of them, so divided out it's probably only hundreds of millions.
Congrats, you've now got one so small and light that a plain old fighter-bomber costing only $10 million or so can drop one.
Finally small and cheap enough to do some million dollar class cruise missiles.
I'm sure we'll eventually have cheap drones as delivery vehicles.
The point is minimizing spent money during peacetime. A bunker full of cruise missiles is infinitely cheaper than multiple operating wings of B-52.
The other (obvious?) idea is burying history. Say Canada wanted to nuke us for invading and regime-changing them to steal their maple syrup. It could happen. If we used 1940s techniques, which only require a billion or so and a couple of months to a year, anything their spies steal is very dangerous. On the other hand: "Go ahead, steal these 1980s krytron switch designs, it'll take you 50 years and a trillion bucks to figure out how to use them, not to mention all the extra stuff that goes with those handy-dandy trigger mechanisms."

A good analogy is the time machine. Visit Bavaria in the 1300s and sneak in a 1500s-level wooden clock escapement, and it would probably revolutionize Bavaria one way or the other. On the other hand, sneak in a modern AMD64 CPU, just the bare chip, not even a heatsink or a PCB, just the chip, and they're going to be all "WTF". All they have to do is get to the 2000s or so, and this chip will leapfrog them into the 2010s, but it's going to take a heck of a long time to get them from the 1300s to the 2000s without any help.

So, yeah, go steal that modern US design: a mere couple trillion bucks and 70 years of experience and you'll be stamping them out like license plates just like us. I guess by the 2080s we should be worried, but right now, no problemo.
"Lawrence Livermore National Laboratory has a mission of strengthening the United States’ security through development and application of world-class science and technology to:
Enhance the nation’s defense;
Reduce the global threat from terrorism and weapons of mass destruction;
And respond with vision, quality, integrity and technical excellence to scientific issues of national importance."
If you want to understand the purported rationale for the NIF (stockpile stewardship), a good place to start is reading and understanding this light introduction to modern nuclear weapons.
(The Rhodes books: The Making of the Atomic Bomb is required background reading, as is Dark Sun, if you want to get into the backstory.)
The NIF experiment recapitulates many of the design aspects of a thermonuclear weapon, but does so in a highly controlled lab environment.
I'm a biophysicist. I know a fair amount of engineering, although I'm not a weapons physicist. Nonetheless, after years of reading about the NIF and various fusion projects, I've come to believe that there is little justification for the expenditure. In particular, we can do stockpile stewardship without this device, more cheaply; nor does NIF present an economically viable route to large-scale power production, even under the rosiest predictions.
I still think the experimental design is cool, but I can't see this as a rational expenditure (HUGE opex and capex) compared to other investments we could be making.
Most likely scenario I see in 20 years is that China will be mass-manufacturing small, safe fission reactors and making a mint selling them to the rest of the world. That's got a far smaller requirement for massive capex and opex. It's just that the western nations decided to go stupid about fission because OH GOD NUCLEAR MUTATIONS and stopped investing in building more reliable, safer, and smaller plants.
The NIF is probably more useful for coming up with new fusion bomb designs than for stockpile stewardship, as you say, but what makes you say hot fusion can never be economically viable? The sun is proof that nature can do this on a grand scale! It may be a long ways off, but "a long ways" becomes "never" without experimental work like the NIF. Yes, the U.S. is probably doing this primarily to create new weapons just as the space race was really about developing ICBM's. However, the fundamental science being done has huge long-term potential. It's a sad, but true fact that, sometimes, scientists have to use the megalomania of states to fuel research that has the potential to benefit all of humanity.
Fission power is a separate matter. The single most important thing that most people still do not understand is that power grids have almost zero capacity for storing energy. That means most alternative energy sources, such as wind or solar, can never exceed a certain percentage of the grid's production capacity or it will become unstable. On-demand energy generation is still needed, and there's absolutely no reason to fear nuclear power while we're still using coal power, which is far more deadly in every sense! It's not just the West that has this fear either. Look at Japan's recent nuclear shut-down (from 30% of their grid to 0%, replaced with fossil fuels), and tell me that isn't going to influence China!
Whether the sun is capable of fusion isn't super important. We can take advantage of that, but it doesn't mean we can build many safe plants based on an evolution of the NIF design.
If your alternative fuel requires hundreds of billions of dollars of investment, just to show the viability of the basic research concept, before it can actually be useful, you will be at a massive competitive disadvantage relative to alternatives.
What I am saying, is that relative to other potential investments, fusion does not seem to be a worthwhile investment, as the risk/payoff ratio is worse.
As for the rest of your comments, I mostly agree. I'm aware that alternatives like wind and solar have limits, but we can still use them to great effect (as augmenters, and as ways to reduce reliance on less sustainable sources).
> What I am saying, is that relative to other potential investments, fusion does not seem to be a worthwhile investment, as the risk/payoff ratio is worse.
I disagree. There exist MILLIONS of years' worth of fuel on earth, compared to a few decades of oil and gas, and a few hundred years' worth of coal or fissionable uranium. That's quite a huge payoff for relatively little risk (compare fusion research funding with other energy subsidies).
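For what it's worth, the "millions of years" claim survives a rough sanity check against deuterium in seawater. All inputs below are rounded textbook values I'm assuming for the estimate, not figures from this thread:

```python
# Order-of-magnitude check: how long would ocean deuterium last?
# All inputs are rounded textbook values (assumptions).
ocean_mass_kg  = 1.4e21        # mass of Earth's oceans
h_mass_frac    = 2.0 / 18.0    # hydrogen mass fraction of H2O
d_to_h_atoms   = 1.0 / 6400.0  # deuterium abundance, atom ratio vs hydrogen
e_per_kg_d_J   = 3.5e14        # energy per kg of deuterium, fully burned D-D fuel
world_use_J_yr = 6e20          # rough global primary energy use per year

d_mass_kg = ocean_mass_kg * h_mass_frac * d_to_h_atoms * 2  # D weighs ~2x H
years = d_mass_kg * e_per_kg_d_J / world_use_J_yr
print(f"~{years:.0e} years of current world energy use")
```

The estimate comes out in the tens of billions of years, so even with generous losses and only partial burn-up, "millions of years" is conservative.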
Yes. I think the California rail system is also an overpriced, underperforming system. It will never be finished (it will never make it all the way to SF), so it will never be completely useful, and the history of all large projects like this shows huge cost overruns. Do you really want to allocate capital like that for something that's unlikely to be finished, and unlikely to achieve its actual goal? When we could spend money to make cars more efficient (electric batteries with central generation) and reduce our total energy consumption fairly cheaply?
I agree with your other points. But the AIG bailout not only saved filthy rich bankers, but saved the entire world economy.
Private pensions, which own 95% of the stock market, would have been wiped out. The pensions were chasing gains and were quite happy while the returns were fat. The government cannot afford its own pension promises anyway and would have been wiped out too.
Big science projects are paid in installments over many years. If the financial system collapses, for sure the big projects will be mothballed immediately.
The point is that governments have this type of money. In spades. Pretending we're not properly prioritizing one type of research or another is a farce - if you can convince anyone to invest in nuclear fission reactors, whether they do that won't really be affected by how much fusion research is happening. That's not why we don't have nuclear fission right now, but shutting down nuclear fusion might just ensure we never get either.
You think governments have this type of money in spades?
The US does not. We're in a major debt crisis and have issues funding our fundamental operations; beyond the immediate government funding, we have severe problems with spending (i.e., we can't cover the interest on our debt with our tax revenues).
Are you really this pedantic? I used an 'and' and a semicolon. If you parse my sentence you will see I made no implication that the current issue with the House of Immature Children is the cause of, or the effect of, our debt crisis. Anyway, do you disagree with Wikipedia? "Debt crisis is the general term for a proliferation of massive public debt relative to tax revenues, especially in reference to Latin American countries during the 1980s, and the United States and the European Union since the mid-2000s."
"We're in a major debt crises and have issues funding our fundamental operations; beyond the immediate government funding, we have severe problems with spending "
Like many Wikipedia articles in areas where there have been intense political propaganda efforts, the one on debt crisis (which you quote but do not cite) is badly confused and influenced by propaganda. The four sources for the sentence you quote are an opinion piece, a blog post, an infographic, and a broken link, none of which (except, perhaps, the content that once was at the broken link) directly support the thesis of the sentence. By Wikipedia's own standards, that article is bad.
And while you say that the current issue with the House isn't the cause or the effect of the debt crisis, the only US examples of so-called "debt crisis" linked from the Wikipedia article you lean on are the current and 2011 debt-ceiling debates. So, it's ironic that you ask, "Anyway, do you disagree with Wikipedia?"
Anyhow, the defining characteristic of a debt crisis is default risk, not debt vs. tax revenue (debt service cost vs. GDP -- not total debt vs. GDP -- is probably the best "easy" numerical measure for fundamental default risk, but the real source of default risk in the US is political games like the ones you discount from the "House of Immature Children".)
> So, you're saying we're at a default risk not a debt crisis.
No, I'm saying that debt crisis is equivalent to default risk, and that the only significant way in which we are in either (the two being the same) has nothing to do with economic or fiscal fundamentals -- the resources to service the debt are readily available -- and solely to do with the present political shenanigans.
Actually, big projects often have to be maintained because otherwise huge numbers of highly paid scientists are let go. It's easier to shut down short-term projects with few employees (like annually funded grants).
Well, I think you're underestimating oil and gas, but that's another argument.
Nobody's denying fusion is a great energy source.
But your claim that there is a payoff for low risk is unsupported by reality: we have only ever invested money in fusion (risk) and have never received any useful power for it (payoff). At this time, the risk/payoff ratio is infinite.
> we have only ever invested money in fusion (risk) and have never received any useful power for it (payoff)
Now you're just being silly. You can say this about ANY research project. Better scrap the Manhattan Project before it finishes, we haven't seen any results. You're saying it's impossible to build a fusion power plant because we don't have one now?
Nevermind the fact that magnetic fusion research has been grossly underfunded for over 30 years.
By the time the Manhattan Project started, we had already made significant progress, such as Fermi's pile. We knew how to separate uranium. The math was all there; it was mostly a matter of decision-making and execution.
I never said anything was impossible. But fusion is still solidly an unknown unknown: we have no credible path at this time to even plan a workable test reactor. I'm saying, for the amount invested, the results achieved, and the potential payoff, our money is best spent elsewhere are more boring and conventional things.
That's just not true, and skirts the line of being a deliberate lie. Fusion has occurred in tokamaks, at an energy gain of 65%. ITER, as currently designed and being built, is expected to make a 10-fold return on energy input.
We are precisely at the position of "the math is all there" with regard to magnetic confinement. And so are the engineering and the funding (barely). It's going to happen, hopefully before it's too late.
I should mention that in the past, I worked on the software backing research tokamaks.
ITER is a research project. It is not designed to produce electricity. It was originally projected to cost over 5 billion pounds to build; that has now tripled, and it is still years from completion. Even if it were successful beyond its wildest dreams, the best result would be that we'd have to spend another $50B to build a plant, and since that one would be the first and only, we wouldn't get much power from it.
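Two different "gains" get conflated in threads like this, so it's worth keeping them straight. Physics gain Q compares fusion power to external plasma heating power only; it ignores the wall-plug cost of the heating systems and every conversion loss downstream. The JET and ITER figures below are widely quoted round numbers:

```python
# Physics gain: Q = P_fusion / P_external_heating.
# Plasma-level bookkeeping only; says nothing about net electricity.
def physics_gain(p_fusion_MW, p_heating_MW):
    return p_fusion_MW / p_heating_MW

# JET's 1997 record shot: ~16 MW of fusion from ~24 MW of heating
# (the "65%" figure mentioned above).
print(f"JET:  Q ~ {physics_gain(16, 24):.2f}")

# ITER design point: 500 MW of fusion from 50 MW of heating
# (the "10-fold return", but ITER converts none of it to electricity).
print(f"ITER: Q = {physics_gain(500, 50):.0f}")
```

A power plant additionally has to win against heating-system wall-plug efficiency and thermal conversion, which is why Q = 10 in the plasma is still a long way from net electricity at the meter.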
A hundred billion dollars is a lot of money. The annual budget of NIH is ~$30B, and it funds a high fraction of the most advanced biological research worldwide. The DOE also has a budget of ~$30B and funds (along with this project) many other projects, including alternative energies, advanced computing, and the Human Genome Project. NSF is (only!) another $7B, leaving around $33B in additional funds for a diverse research portfolio.
This kind of expenditure is required to keep the US at or near the highest level of technology research in the world (other countries can compete with us in many ways, such as lower labor costs). There are things that can only be physically done in a single location in the US and nowhere else in the world, and it will be that way until one of the postdocs goes back to Europe or Asia and replicates the result in their lab.
So yes, $100B is a lot of money- and that's the kind of investment that often ends up benefiting the US (and other) economy. The fusion research-- to the extent that what NIF is doing is translatable to other domains-- doesn't really meet this sniff test.
Yes, and it's not certain that we will ever be able to have a stable, viable fusion plant. There are even somewhat recent works showing that certain kinds of confinement which were being tried in the early days are not theoretically viable.
We're still a few orders of magnitude away from net energy gain, and we're probably quite a few reactor-design innovations and material breakthroughs away, I'd say, as far as my layman knowledge goes.
That may well be true. But it also may well be true that absolutely none of the methods currently conceived are at all viable.
If we are spending money fast because we think we're in a sprint to achieving viable fusion power, but it turns out we're actually in a marathon, then it's quite possible we're wasting vast amounts of money unnecessarily. That's not an argument to stop funding fusion research, but it is an argument to moderate it and possibly even broaden the types of research being done to more projects but at overall lower funding rates.
When has coal power ever created the sort of crisis that occurred in Fukushima?
Because of fission reactors, the safety of entire regions of the world is dependent on a consistent power supply and a lack of human error, something we clearly cannot rely on. In the event of a cosmic EMP or major meteor strike, these power plants are at risk.
This is not really calculable, nor is the number of people killed by coal calculable in any reasonable manner. However the impact of nuclear catastrophe is far far more insidious and lasting, particularly when caused by a global calamity that causes meltdown in many plants.
I suppose fear of a 9+ earthquake followed by a massive tsunami off the coast of Fukushima was considered unreasonable.
We are on a blob of rock hurtling through an unknown cosmos. Calamities are very possible. It is deeply irresponsible to create projects that will make large parts of the earth uninhabitable should the power grid fail for a long period or human stewardship go on hiatus. Meteor strikes, EMPs, plagues, terrorist attacks, economic collapse, and other disasters are well within the realm of possibility.
Because of Greenpeace's history (the death of an anti-nuclear-testing activist and the destruction of one of their ships at the hands of French special forces in 1985), and their current anti-nuclear stance, I don't think they would back that position.
They would back the position that wind turbines are safer than coal, so you're not wrong that they might be biased to imply coal is more dangerous than it is.
> It's just that the western nations decided to go stupid about fission
I have many highly intelligent friends who subscribe to your view. Unfortunately they forget that fission is only as safe as the weakest link, and the consequences of failure are severe, especially when near a water source.
That appears to be an accident handling the output of an isotope manufacturing reactor, not anything involving a power reactor.
> I would argue that there needs to be a significant improvement before fission is publicly acceptable.
I would argue that fission is already held to a standard well above that of almost any human endeavor, even those with roughly similar risk profiles (compare almost any form of mining, or the shipping of hazardous cargo by rail)
If your technology has reached the point where human error and negligence are the primary sources of risk, you've succeeded. That is the goal of producing all reliable technologies. Of course, at that point, your system design primarily exists to prevent human errors from causing harm. (For example, yesterday there was a news report that a Fukushima cleanup worker managed to engage a pump that would dump a bunch of radioactive water in the wrong place. He was saved by a backup system.)
> If your technology has reached the point where human error and negligence is the primary source of risk, you've succeeded.
If all it takes is one earthquake or distracted technician to potentially vaporize a suburb, then your tech has not yet succeeded.
Safety regimes to minimize human error are great, but time and time again they are bypassed due to greed, negligence and idiocy. The Fukushima backup system you mention is exactly the kind of thing I'm talking about - but they need to be comprehensive, redundant, and responsibly audited (the part that scares me the most).
The problem with fission is that its disasters are very hard to clean up. Mine collapses or fossil fuel fires are brief events. Most chemical spills either break down, are neutralised, or diluted to harmlessness (although far from all, see the various Superfund sites).
Fission disasters tend to emit nasty metals that are bioaccumulative, don't break down in a short timeframe and may be airborne or waterborne.
(Having said that, it looks like Bhopal was worse than Chernobyl for harm, and the chemical industry is still in business)
It wasn't "the Japanese" that didn't do it safely at Fukushima. It was GE.
Was GE responsible for the siting of the backup diesel generators and switchgear behind a seawall that was overcome by the tsunami? That was the root cause of all the problems; it wasn't anything to do with the reactor design itself, other than the fact that it needed active cooling after shutdown (which PWR designs from the same time period also did).
(That said, I agree the BWR design is outdated and modern designs with passive safety features are much, much better.)
But the backup generators could be moved (and were in similar US installations). Raising the backup generators to protect them against inundation was recommended, but the delay there is pure sclerotic TEPCO.
I still don't understand why Chinooks weren't in the air toward Fukushima within 30 minutes of the tsunami with some generators sling-loaded under them (and refueling, etc.). Either from the JSDF or US Military.
As I understand it, the issue was that not just the generators but the switchgear connecting them to the plant's power grid was behind the seawall that was overcome by the tsunami. So bringing in other generators wouldn't have helped, because there was no way to plug them into the grid.
You can usually get portable switchgear, and it was mainly pumps, which should be fairly tolerant even with an emergency cabling job. I'd really consider this a contingency plants should be ready for; if I operated at tepco scale I would probably have actual helicopters under contract (or priority with the military) and strategic equipment in hardened shelters at my plants, just to be able to move it to other plants.
From what I've read there was a leadership vacuum for the initial crisis at tepco. I'm actually going to try to visit as close as I can get to the plant tomorrow or Saturday (in Sendai right now)
Thought experiment... begin with the assumption that the "system" can produce and sustain 2000 brains capable of safely operating a nuke plant. You can build 1000 slightly less safe but bigger plants, or 10000 slightly safer small plants, but you've still only got 2000 usable nuke-level brains to staff them. Either half the big brains are going to be unemployed, or 80% of the only very slightly safer plants are going to be run by morons. Don't amuse me with deus ex machina about automation; I'll counter by changing my thought experiment to only 2000 nuke-capable sysadmins being available, or whatever.
I think you're unfairly dismissing magnetic confinement, the limited supply of fissionable material, and the unprecedented issues that accompany the management of dangerous material (or anything, for that matter) for 10,000 years.
Fusion research is worthwhile. Of course it's not the only path to sustainable energy, but it's one of the most promising.
This is awesome news - I visited the NIF a few months ago and they seemed a little downtrodden when asked about results.
I bet they're all cocky now!
They also told us that the lasers they use, if built with modern tech today, would actually only be the size of a 40' cargo container (as opposed to like 100K sq-ft building), and cost like 1000x less. Pretty epic...
If we invested in fusion power like we did water power less than a century ago, I can only imagine the possibilities...
This is so cool. I almost can't believe that such incremental tech development would be applicable to fusion research. But there you have it. Now we just have to throw a few more billions at the problem. If the costs really have come down this dramatically, it is very promising indeed.
I giggled at this, but a serious question for those in the know: is computer simulation a significant factor in the development of fusion experiments, and can it be distributed (a la SETI@Home, Folding@Home, et al)?
Too bad shovel-ready money wasn't spent on longer-lasting-impact projects like this. Where has all that money gone from the drawdown of the wars? Where are the investments going in alternative energy?
Well, that last one is easy: groups who contribute politically, hence solar and wind. They receive far too much support, whereas hard-science projects like this, with incredible payoffs, are pushed off because they don't pay back in political contributions.
There are myriad ways to finance such an ambitious project through clean energy taxes, but unfortunately they suffer the changing whims and needs of politicians. Yeah, certain industries would go by the wayside with this type of power available, but many of them have big money invested in energy creation and distribution; many would simply shift some columns in their spreadsheets and still balance out.
Considering the deficits the US runs, just a day or two of deficit spending is more than what this facility budgets for this research.
It seems to me that the more plausible explanation is something like "Solar and wind power are proven to work, whereas fusion research is still trying to find something that even looks like it might work." Even after this "milestone," fusion still requires more energy from us than it gives back.
Basically, the funding is going toward practical technologies, in the sense that you can actually put them into practice today and reap rewards.
Exactly. If one were to give a hundred billion to fusion research, how much faster would we get results? Would we be funding needs that are critical to help the science into its next phase, or just be funding the N next-best ideas that previously were set aside?
For a community that has such high praise for bootstrapping businesses, there's a surprising love of throwing money at unproven scientific research avenues. We're in this for the long haul; if we get fusion in twice the time for half the money, we'll still have fusion.
If we have fusion in 30 years instead of 60 years, we have colonization of the solar system happening in our lifetimes. We'd have the economic rationale for removing all military engagement in the middle east. We have 30*380 billion on petro cost savings here in the U.S: http://www.fuelfreedom.org/the-real-foreign-oil-problem/oil-... God knows how much CO2 we remove from the atmosphere. 100 billion sounds cheap to me.
It comes down to energy. Fusion would dramatically decrease energy costs, which would in turn drive costs of almost everything else down and allow us to do things that had been only a dream before such as fusion propulsion. So run your cost/benefit analysis again with that in mind.
This is even before we get into ecology - most modern energy sources are highly pollutant (oil/coal). How much better off will planetary ecology will be if fusion comes online 10 years earlier?
Excellent points. I certainly know that with much more funding, there could be much more efficient research, though I wonder where the limit is. How many more world-class physicists are there to put on this problem? Are the other theories really "next-best" ideas, or are they simply unexplored avenues, or perhaps more appealing with recent technology advances, but we've already committed to the current paths.
There's a second factor, which is that certain specific fusion designs are heavily over-funded, in a way that doesn't necessarily reflect their practicality. A case in point: NIF isn't even supposed to be the basis for a reactor design. ITER is, but it's still a fair way off and even assuming it lights it'll be 2 more generations before possible industrial application.
The problem is that these approaches can't work without enormous budgets, and while they produce lots and lots of very interesting science, they hoover up talent and funding resources that might be better spent exploring other avenues.
In situations of depressed aggregate demand, it really doesn't matter what you spend the money on -- you could even bury it in the ground like gold and "pay" people to dig it up. Whatever you fund is essentially a free lunch.
You don't want to wait months though, which is why "shovel-ready" matters and this sort of research loses out to other sorts of projects. That said, less guns and more fusion would be nice.
Well, I get the impression billions spent on fusion research doesn't really employ that many more people, it just spends more on expensive resources. Maybe that has a knock-on effect, but using the money on more labour-intensive work (skilled or otherwise) is still probably a better use. Such as, building and installing a heck of a lot of cheap solar panels.
Exactly. Say you go to a fusion laboratory that employs the top ten fusion scientists in the world, with a current budget of a few hundred million dollars, and you say to them "I've got 2 billion here in additional funding for you. You need to spend it in the next few months because I'm trying to stimulate an economy right here."
Does that hire you forty more world class fusion scientists? Are there forty more world class fusion scientists?
Are those scientists going to go "Fantastic! We'll just order up three more of these experimental fusion reactor rigs, here are the plans, we'll start hiring engineers and buying up electronic components from the local silicon fab down the road driving all that money into economic productivity for you!" - unlikely. They already have one experimental fusion reactor. Three more the same aren't going to get them anywhere any faster. They won't know what the next one should look like until they've finished getting results out of this one.
Maybe they just spend it all on repainting the lab, buying new office furniture, getting in catered lunches - you might see a marginal improvement in fusion research productivity, and certainly there's a local economic stimulus which was the idea there, but you're not getting the valuable long term capital value of, say, using the money to rebuild a few bridges.
Even if it turns out your ten fusion scientists have a ready proposal to build the next generation experimental fusion reactor, plans are drawn up and costed, and all they need is you to hand over that check for two billion, if they then go ahead and order the parts from China, subcontract a bunch of German engineering firms to build the cooling systems, and get the heavy steelwork welded together in Korea, your stimulus effect just disappeared overseas.
Of course, if that experimental fusion reactor turns out to be THE ONE that creates viable economic fusion power, maybe that's worth the 2 billion. But that's not stimulus spending, that's gambling on an investment.
I can answer one of your questions: there are forty more world-class fusion scientists (especially young ones with new ideas). They usually get pushed out of research for lack of money and sent into the job market, into finance and other fields.
So this could help everyone, if these highly qualified people could stay in research and not take a spot in a job that could be filled by a less skilled person.
Also, high-level research projects push the state of the art in the private sector. When CERN orders up a new type of magnet, that contract goes to a company that does magnet winding, which in turn usually revamps its entire process as part of the collaboration to achieve the desired field strength, homogeneity and size.
Suddenly, there's now a company which can do that - and is eager to sell the service on since they now have the capability. Not only do subsequent magnets get cheaper, but things which weren't feasible due to their smaller scale but precise requirements suddenly become possible.
> Soon after, the $3.5bn facility shifted focus, cutting the amount of time spent on fusion versus nuclear weapons research - which was part of the lab's original mission.
That was so disappointing to read. Don't we have enough nuclear weapons to blow up the planet 10x over already? Why would that still be the focus instead of inventing new sources of virtually unlimited energy??
NIF's nuclear weapons research was in support of maintaining the current arsenal of nuclear weapons. Most notably, it helps solve the question of how to ensure current stockpiles are reliable without actually detonating a nuke (since that's a huge no-no). It also kept a generation of nuclear physicists and engineers in employment, and provided a method by which to test future weapons designs.
So really, the answer is, it doesn't matter how many nukes we have right now, NIF was created (in part) to ensure that the capability to blow up the world could be maintained.
As depressing as it is, it's pretty much how it goes with nearly all defense technology spin-offs. Why the hell did we spend all that time working on rockets that purposefully crashed back onto the Earth (on top of people!) instead of doing something like getting communications satellites up, or getting to the moon?
They would never have gotten funding for it. No way in hell.
Plus, war made sure that they put getting working reactors first, not interesting science. A critique of fusion research is that nice, flexible, very accurate/complex apparatus is given preference over quick iteration and fast experimentation. War made sure the researchers went straight for the goal (and there is the fact that the Germans and the Americans both knew there was probably a way to tickle uranium reactors into a Hiroshima. This made the Germans careful, although even their experiments would today be considered absurdly dangerous; the early American experiments were bat-shit insane). They thought hard about every step, and constantly fucked up. Someone left 2 kg of uranium in a bath of water + cadmium by mistake, then walked out for the night: the first meltdown. They didn't figure out what had happened until years later. Someone inserted a steel rod into a barely sub-critical reactor submerged in water, and no one left the building alive (due to the water flashing to steam violently enough to bring the roof down, not due to a nuclear explosion). None of this would have happened in peacetime.
Producing net-energy is a different story than collecting net-energy. NIF just blows up a canister with a pellet and takes forever to reload a shot. On the other hand, an at-least-as-promising technology like Dense Plasma Focus (http://en.wikipedia.org/wiki/Dense_plasma_focus) is still receiving minuscule funding. I'm hoping to be able to meaningfully support DPF and other fusion technologies soon. DPF has my attention so far because of its scaling characteristics.
What does producing energy mean if not collecting it? Conservation of energy tells us that we are not actually producing any energy; rather, we are converting energy from the mass of the reactants (hopefully into electricity). If you consider the energy converted from mass to be "produced", then it is impossible to do nuclear fusion without producing net energy (as long as you stay on this side of iron).
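To make the mass-to-energy bookkeeping concrete, here is the standard arithmetic for a single D-T reaction. These are textbook figures (17.6 MeV released per reaction, 931.5 MeV per atomic mass unit), not numbers from the article:

```python
# Mass-to-energy bookkeeping for one D + T -> He-4 + n reaction.
# The "produced" energy is just the mass defect times c^2.
MEV_PER_U = 931.494   # energy equivalent of 1 atomic mass unit, in MeV
energy_mev = 17.6     # energy released per D-T reaction, in MeV

mass_defect_u = energy_mev / MEV_PER_U
print(f"mass converted per reaction: {mass_defect_u:.4f} u "
      f"(~0.4% of the ~5.03 u of reactant mass)")
```

Only a fraction of a percent of the fuel's mass actually becomes energy; the rest comes out as reaction products.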
Yes but in the real world there are inefficiencies. We can't even transfer energy efficiently enough into the reaction yet to get a net energy gain from the fusion reaction. And once you reach that milestone, you still have to make transfer of the yield energy into useful work efficient enough to make the whole thing productive. Efficient energy transfer is hard!
8 kJ out from 1.7 MJ (1700 kJ) in. At the end of the month they were able to get 14 kJ. I believe they are referring to the energy released within the hohlraum.
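For back-of-the-envelope purposes, the gain implied by those two numbers (both taken from this comment) is trivial to check:

```python
# Sanity check on the figures quoted above: ~1.7 MJ of laser energy in,
# 14 kJ of fusion yield on the better shot. Pure arithmetic.
laser_in_kj = 1700.0
fusion_out_kj = 14.0

gain = fusion_out_kj / laser_in_kj
print(f"target gain ~ {gain:.4f}, i.e. {gain * 100:.2f}% of the laser energy")
```

So even the record shot returned under 1% of the laser energy, which is why the "energy absorbed by the fuel" framing matters so much.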
Also, if you are interested there are privately funded companies doing this, General Fusion (http://www.generalfusion.com/) and TriAlpha Energy (secretive and funded by the Russian govt., but in California). The VC fund I work for has invested in GF and obviously we think there is promise :)
I remember this being announced a while back, but I didn't understand why it was significant if the energy in was less than the energy out. This article helps to clear it up.
The missing piece was that I didn't understand how this reactor works. I thought they just blasted a lot of lasers directly at the hydrogen isotopes. Instead, it seems like that use the lasers to shoot something else, which then creates a lot of x-rays which actually start the reaction.
The significant thing here is that the energy produced is greater than the amount of energy coming in from the X-rays, but not the lasers which power those X-rays.
Is that correct? (Not surprisingly, I'm not a physicist!)
Lasers get fired at the inner surface of a cylinder (called a hohlraum). The lasers heat the surface of the hohlraum, which then emits x-rays. The x-rays then deposit energy on the surface of a spherical capsule (this is the "energy absorbed" number).
You start with ~2,000 kJ of laser energy, but some of that gets "backscattered" and never makes it into the hohlraum. Other energy is spent heating the walls of the cylinder. Additionally, some of the x-rays leak out of the hohlraum and do not drive the capsule implosion. After all these losses are taken into account, only about 15 kJ is absorbed in the capsule.
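The loss chain described here can be sketched numerically. The endpoints (~2,000 kJ of laser energy in, ~15 kJ absorbed by the capsule) come from this comment; the individual loss fractions below are illustrative placeholders chosen to match those endpoints, not published NIF figures:

```python
# Illustrative energy budget for an indirect-drive shot, following the
# loss chain described above. The loss fractions are assumptions.
laser_kj = 2000.0

backscattered = 0.15 * laser_kj       # light scattered back out (assumed)
into_hohlraum = laser_kj - backscattered

wall_heating = 0.85 * into_hohlraum   # spent heating the hohlraum walls (assumed)
xray_leakage = 0.141 * into_hohlraum  # x-rays escaping the hohlraum (assumed)

absorbed_by_capsule = into_hohlraum - wall_heating - xray_leakage
print(f"absorbed by capsule: {absorbed_by_capsule:.0f} kJ "
      f"of {laser_kj:.0f} kJ delivered")
```

Whatever the exact split, the point is that well over 99% of the laser energy never reaches the fuel, which is what makes the "fuel gain" milestone much smaller than it sounds.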
The showers continue. Budget pressures have NIF down to shooting three days a week. To make matters worse, DoE has decided to cut funding for high energy density physics, i.e. the field covering the physics needed to make NIF ignite and turn laser fusion into a viable electricity source.
This bit of fusion research is mostly aimed at running small-scale experiments on weaponised nuclear fusion without having to set off nuclear bombs every other day. As a long-term energy source, it is assumed to be rather unusable.
tl;dr - Another article about the US nuclear weapons research facility at Lawrence Livermore, AKA NIF, and its sideshow 57th priority.
On a related note. It's been really sad to see the US slowly lose its edge in plasma based fusion tech, specifically tokamaks, which seem to be the only credible long-term method of sustaining a fusion reaction for power-plant purposes.
Just a couple quibbles from a former NIF scientist.
NIF is much higher on the priority list than that. LLNL is already in the doghouse due to NIF failing to ignite on schedule. As goes the 5 giga-buck NIF, so goes LLNL (and the management knows it).
Technologically, I'd put inertial and magnetic fusion about the same place. Even if the physics works, neither has a chamber first wall material that can stand up to the huge neutron loads that a power plant will create. Economically, both are hosed. Fusion wants to be big. Most reactor designs are for >1,000 MW. Electric companies are mostly interested in plants in the 50-400 MW range.
Don't get me wrong, I think LLNL and the NIF are doing great work. I just think the work is more weapons-based and less energy-focused than we are all led to believe in the press. And I meant that the fusion reactor business is a low priority compared to testing the nuclear stockpile, etc., not NIF in general.
The chamber first-wall material keeps me up at night. I think money spent in that area would be really well spent and have many, many uses, both where you need neutron shielding and, to a lesser extent, protection from heat. When I think of really big, fun, 1960s-style energy projects, the only other 'credible' one seems to be laser-propelled power satellites. Now there is where we might take a lot of the laser tech from the NIF!
I haven't seen enough details to have a good opinion, but I'm skeptical (though I'd happily eat my hat if it works). We abandoned mirror fusion in the 80s because confinement is really hard if you have open field lines. I do think high-beta confinement concepts are very cool and could drive power plants small enough for power companies to be interested.
On the article itself: why did it take 6 paragraphs of text to finally mention what the milestone was? I hate this style of article writing, and it's usually a good bullshit-signal for any story. That's a shame, because it looks like this is the Real Thing (in a small way).
I can think of two off the top of my head. Low temperature plasma processing (used in semiconductor fab) got it start as the-mess-created-when-a-plasma-confinement-experiment-fails. Adaptive optics were developed for high intensity lasers and are now used in telescopes.
Great, but I don't see how these extremely expensive nuclear fusion projects will ever beat price/kW of coal without heavy subsidies. My bet is that dense plasma focus is the future of cheap, clean power. Only time will tell though.
Yeah. My comment has nothing to do with finite or infinite amount of any energy technology but rather that my original comment applies to fusion competing with the price of any existing energy technology such as coal, oil, wind, solar, etc.
Much of the cost of these projects likely results from the research nature of them. Once we get a viable model, we will likely be able to produce a cheaper plant because 1) we know what we are doing, and 2) now that we know it works, we can build it bigger.
"the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel" means it is at least 4 orders of magnitude away from being useful.
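A rough sketch of where "4 orders of magnitude" comes from, using figures quoted elsewhere in this thread. The 1% wall-plug efficiency is an assumed round number for flash-lamp-pumped glass lasers, not an official NIF spec:

```python
# Rough check of the "4 orders of magnitude" claim. Illustrative numbers:
# the wall-plug efficiency is an assumption.
fusion_out_kj = 14.0        # yield quoted in this thread
laser_kj = 1700.0           # laser energy on target
wallplug_efficiency = 0.01  # flash-lamp glass lasers, assumed ~1% efficient

electric_in_kj = laser_kj / wallplug_efficiency  # electricity drawn per shot
shortfall = electric_in_kj / fusion_out_kj
print(f"need ~{shortfall:,.0f}x more yield just to break even on electricity")
```

That's before conversion losses on the output side, so "at least 4 orders of magnitude" is, if anything, generous.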
Since we will still depend on nuclear power for decades to come, it is much more cost-effective to invest in safer and cleaner nuclear fission reactors: the kind of fast reactor that can burn down nuclear waste, so we don't need to build a nuclear waste storage system, which nobody knows how to build anyway. That would give us a power supply for several centuries (along with renewable energy). Too many countries have wasted too much money on fusion reactors for decades while we are still running nuclear reactors designed and built more than 30 years ago. Just the wrong priority order.
What would be the impact of a 'fusion economy', assuming realistic evolution of the technology for the purposes of commercialisation? (i.e. I'm assuming "Mr Fusion" powerplants on top of one's DeLorean aren't ever going to be practically feasible).
What would such a world look like? Does it promote world peace; through greater energy security for nations, would less reliance on fossil fuels for baseload electricity generation have a significant impact on price of air travel/sea cargo?
Fusion changes everything. Cheaper electricity makes transporting materials basically free. And projects like desalination become possible = no one remotely close to an ocean ever lacks clean drinking water again.
> the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel
So what? What if the fuel didn't absorb anything? What do they mean by "absorb", anyway? This article is lacking in details, peer-reviewed literature, or even the names of scientists willing to stake their reputations on this claim.
The development of fission technology seems like a fine case to consider. I doubt we are more paranoid and militaristic now than we were during the height of the cold war. Maybe we are, but I really don't see it.
The University of Sydney has continued research on polywell reactors. I'm not terribly across their progress, but I try to read papers whenever they're released, though I haven't read any of their 2013 papers. It seems the last few years (2010-2012 at least) have focused on minimising point- and line-cusp losses.
It's all classified. It may be useful as an economical neutron source, or as a way to keep a contractor in business, but there's been nothing published about its efficacy as a reactor for sustained fusion power.
If there's one thing we've learned over the past few years, it's that "classified" knowledge, with no critical eyes to keep it honest, is often wrong. Science works, not because it's done in secret.