On the plus side I offer the following report from the USGS from 1997: https://pubs.usgs.gov/fs/1997/fs163-97/FS-163-97.html which talks about the levels of uranium and thorium in fly ash (0-4 ppm).
I've got memories of an experiment we did in undergraduate physics labs, which involved a directional array of Geiger tubes. Pointing them at the view through the window gave a low count, but pointing them at the concrete structure of the building gave an elevated count. My understanding at the time was that it was due to naturally occurring uranium and thorium in the raw materials of the concrete.
Given that fly ash is a common component of concrete, I wonder how much of the elevated radiation from concrete can be put down to the concentrating effects of coal combustion that you mentioned in one of your comments?
Combustion concentrates radioactivity by a factor of 5x to 10x. Fly ash can replace 25-50% of cement. A typical concrete is about 16% cement (1 part cement, 2 parts sand and 3 parts gravel).
Hence a concrete might be 4%-8% fly ash. If the fraction of fly ash is 'A', the concentration factor caused by combustion is 'C' and the background radiation of the other concrete ingredients is uniformly taken to be 'B', then the radiation count of concrete without fly ash will be:

B

and the radiation count of concrete with fly ash will be:

A.C.B + (1-A).B
Plugging in values for A (0.04 to 0.08) and C (5 to 10), we find that fly ash increases the radiation from concrete by roughly 16% to 72%. That strikes me as a significant proportion of the radiation from concrete.
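In code, the back-of-envelope model above works out like this (A and C as defined earlier; B cancels out of the ratio):

```python
def flyash_factor(A, C):
    """Radiation of fly-ash concrete relative to plain concrete.

    (A*C*B + (1-A)*B) / B  ==  A*C + (1-A)
    """
    return A * C + (1 - A)

low = flyash_factor(0.04, 5)    # 1.16, i.e. a ~16% increase
high = flyash_factor(0.08, 10)  # 1.72, i.e. a ~72% increase
print(low, high)
```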
Typical banana dose: 0.1 μSv per banana.
Typical concrete building dose: 100 μSv per year.
i.e. 1 year in a concrete building = eating about 3 bananas a day.
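The banana arithmetic checks out:

```python
banana = 0.1      # μSv per banana
building = 100.0  # μSv per year in a concrete building

bananas_per_day = building / 365 / banana
print(round(bananas_per_day, 1))  # ≈ 2.7, i.e. about 3 bananas a day
```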
I'm not claiming that fly ash laden concrete is going to kill anyone, but I think it's interesting that (given my assumptions) the concentrating effects of coal combustion are causing a decent proportion of concrete's radioactivity (even if the total is still negligible).
That's the thing though: these low 'safe' levels of radiation do actually kill people. Every now and then, somewhere, someone will be hit by a radioactive emission from fly ash that happens to strike one of their chromosomes and trigger cancer.
If fly ash increases cancer cases in a year by x%, then we can be pretty sure that roughly x% of cancer patients were victims of fly ash; it's just that x is fairly small and we don't know which patients they are. But they still have cancer.
The average U content of the upper continental crust is between 2 and 3 ppm. Individual rocks will range lower and higher than that. If fly ash is only 4 ppm U, you have more to worry about from living on granitic rocks, or near a swamp than you do from fly ash.
More context: if you live on the Chattanooga shale it has an average U content of 30-120 ppm. The extensive Alum Shale (runs the length of Sweden) has U contents of 100-300 ppm. Granites will have low-mid 10s of ppm U and Th.
The report linked in a comment above makes this comment: "Fly ash is commonly used as an additive to concrete building products, but the radioactivity of typical fly ash is not significantly different from that of more conventional concrete additives or other building materials such as granite or red brick. One extreme calculation that assumed high proportions of fly-ash-rich concrete in a residence suggested a dose enhancement, compared to normal concrete, of 3 percent of the natural environmental radiation."
I see no decay chain leading to Radon. https://en.wikipedia.org/wiki/Isotopes_of_iridium
Airplanes and cars aren't designed with a certain death rate in mind, but it's good to compare the actual bad outcomes.
It's true that coal plants are "designed" to emit a characteristic amount of radioactive material, but they also don't suffer from catastrophic or significant accidental releases so it's appropriate to compare actual historical outcomes. The failure rate of nuclear power is an important part of the consideration. (and it still ends up on top)
We've had a countable number of catastrophes and are getting better at building safer reactors (Gen IV reactors are regulated by physics, i.e. passively safe).
Regardless, Chernobyl is estimated (by the WHO) to eventually claim around 4,000 lives in total, while coal power plants are claiming roughly 7,500 lives per year in America alone. Nuclear meltdowns are very scary events. Merely running a coal power plant is far scarier.
In 2008, 1.1 billion gallons of coal fly ash slurry was released from Kingston Fossil Plant in Tennessee.
Not entirely true. The NHTSA can promulgate safety regulations, but if the cost of implementing them exceeds the societal cost of not implementing them, then the regulations don't have to be adopted.
The deal with uranium and thorium is that normal geological processes aren't very efficient at concentrating either element. That's why commercially exploitable ores are rare and the elements remain widely distributed.
It's a bit more tricky, since part of the radioactivity from uranium and thorium is due to decay products, some of which probably escape out the stacks (see radon).
Why would you suggest we prevent using all appropriate options at our disposal? Why not push for using a different type of fuel instead?
I didn't grow up during the nuclear scare times. Fukushima wasn't great, but it wasn't so horrific either. If it gets us off coal and natural gas then I'm down.
Yeah coal mining sucks.
That's what they say after every accident too. "This time, with the new designs, it's different".
No, but lying, and having experts be, ahem, flexible about the expected safety, is very easy, and par for the course when selling multi-billion dollar projects...
It's also very easy to ignore "black swan" event cases, and the potential impact to millions of lives, just because you think you've covered everything there is to cover.
But that's not the case. We have real-world evidence that we can use to calibrate our expectations against reality. The fact is that, even if you count Chernobyl and Fukushima, our existing reactors are safer than fossil fuels. And I think that you would have an exceedingly difficult time arguing that newly-built reactors would be less safe than existing reactors.
Plus, previous installations had fewer potential non-design-related issues to contend with, such as terrorism, which (in today's nihilistic "maximize the damage, innocents and even myself included" form) wasn't as much of a thing in the 60s and 70s.
He already answered that: because not all options are appropriate.
No one I know is opposed to renewable energy, but advocates really do everybody a disservice when they try to argue that an intermittent power source without storage is a reasonable replacement for base load power. This comic illustrates the problem: http://www.smbc-comics.com/comic/capacity
>... How can you guarantee that a certain place is safe to store millions of tons of radioactive waste for thousands of years if we can't even guarantee what will happen tomorrow?
Millions of tons? Where are you getting that number from? Right now nuclear waste can and should be recycled which would reduce the amount of waste: https://en.wikipedia.org/wiki/Radioactive_waste
Soon it will be possible to use most of the waste as fuel:
"...Fast reactors can "burn" long-lasting transuranic waste (TRU) components (actinides: reactor-grade plutonium and minor actinides), turning liabilities into assets. Another major waste component, fission products (FP), would stabilize at a lower level of radioactivity than the original natural uranium ore it was attained from in two to four centuries, rather than tens of thousands of years"
The worry people have about nuclear waste is greatly overblown. The amounts generated are manageable and in a relatively short amount of time we can use most of this "waste" to generate electricity.
>...We need to stop this nuclear madness!
NASA has estimated that using nuclear power saved roughly 1.8 million lives that would otherwise have been lost had the power been generated by fossil fuels: https://climate.nasa.gov/news/903/coal-and-gas-are-far-more-...
As someone in a previous discussion pointed out, the historical record for deaths from nuclear power have been very low:
Energy Source      Mortality Rate (deaths/trillion kWh)
Coal – U.S.        10,000   (32% of U.S. electricity)
Natural Gas         4,000   (22% of global electricity)
Solar (rooftop)       440   (<1% of global electricity)
Wind                  150   (2% of global electricity)
Nuclear – U.S.        0.1   (19% of U.S. electricity)
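For perspective, the ratio between the top and bottom rows of that table (my arithmetic, not from the original source):

```python
coal_us = 10_000   # deaths per trillion kWh
nuclear_us = 0.1   # deaths per trillion kWh

# Coal has roughly 100,000x the mortality of nuclear per unit of energy.
print(coal_us / nuclear_us)
```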
> No one I know is opposed to renewable energy, but advocates really do everybody a disservice when they try to argue that an intermittent power source without storage is a reasonable replacement for base load power. This comic illustrates the problem: http://www.smbc-comics.com/comic/capacity
The comic is a bit disingenuous, as it implies that all renewables are intermittent.
But many renewable energy sources are base load as well, e.g. hydro or wind. Yes, they have variations, but so does the load -- from the perspective of power grid management there's nothing new.
In fact this is part of the problem: renewables and nuclear (or coal) are competing for base load. If we build a nuclear plant, we need to run it for 50 years for the investment to make sense. This means that it will economically and politically impede the installation of e.g. wind power for 50 years.
There are some locations with very reliable constant wind (though usually only for part of the year) but that's not the norm. Offshore wind does better, but is still far from base load.
The German government's scientific service, for one, disagrees with you.
Being a German government publication, it's unfortunately in German.
As far as I can tell from skimming it your impression is correct.
So I apparently misremembered the article, and my claim about base load a few posts above may be incorrect after all. Sorry, rsynnott.
However you're right that Germany was a net importer from (among others) France:
Which is surprising, as I remember reading that France has tremendous difficulty satisfying its electricity demand, especially in winter (lots of electric heating, apparently).
No, I don't think it does that.
>...But many renewable energy sources are base load as well, e.g. hydro or wind.
As user rsynnott said "Wind most certainly isn't base load." While hydro is base load power, only a few countries like China are considering building more hydro plants. We aren't going to be able to use hydro as a means to get off of burning fossil fuels.
Ultimately, however, the core problem may be that such new reactors don't eliminate the nuclear waste that has piled up so much as transmute it. Even with a fleet of such fast reactors, nations would nonetheless require an ultimate home for radioactive waste, one reason that a 2010 M.I.T. report on spent nuclear fuel dismissed such fast reactors. Or, as Cochran puts it: "If you want to get rid of milk, don't feed it to cows."
"Do the math! 1.1 additional GT out of 36 GT emitted is only a 3% difference. This 3% value is not a typographical error. Worldwide, all those nukes made only a 3% dent in yearly CO2 production. Put another way, each of the 438 individual nuclear plants contribute less than seven thousandths of one percent to CO2 reduction. That’s hardly enough to justify claims that keeping your old local nuke running is necessary to prevent the sea from rising."
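A quick sanity check of the arithmetic in that quote:

```python
avoided, total = 1.1, 36.0           # GT of CO2, figures from the quote
share = avoided / total              # ~0.031, i.e. the ~3% dent
per_plant_pct = share / 438 * 100    # contribution per plant, in percent

# per_plant_pct comes out just under 0.007%, matching the
# "less than seven thousandths of one percent" claim.
print(share, per_plant_pct)
```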
Hanford was started in the Manhattan project to produce plutonium and during the cold war produced plutonium for tens of thousands of nuclear warheads. All of these government weapons plants were quickly started with inadequate policies for handling the material.
>...Ultimately, however, the core problem may be that such new reactors don't eliminate the nuclear waste that has piled up so much as transmute it.
A 4th gen design like the IFR would allow you to end with a much smaller volume of waste that would only be dangerous for a few centuries.
In terms of natural gas, the numbers given are probably out of date - CO2 emissions from a natural gas plant are lower than a coal plant, but that doesn't account for the methane emissions that come with fracking and distributing the methane.
>...Back in August, a NOAA-led study measured a stunning 6% to 12% methane leakage over one of the country’s largest gas fields — which would gut the climate benefits of switching from coal to gas. We’ve known for a long time that methane is a far more potent greenhouse gas than carbon dioxide (CO2), which is released when any hydrocarbon, like natural gas, is burned. But the IPCC’s latest report, released Monday (big PDF here), reports that methane is 34 times stronger a heat-trapping gas than CO2 over a 100-year time scale, so its global-warming potential (GWP) is 34. That is a nearly 40% increase from the IPCC’s previous estimate of 25. ...The IPCC reports that, over a 20-year time frame, methane has a global warming potential of 86 compared to CO2, up from its previous estimate of 72. Given that we are approaching real, irreversible tipping points in the climate system, climate studies should, at the very least, include analyses that use this 20-year time horizon. Finally, it bears repeating that natural gas from even the best fracked wells is still a climate-destroying fossil fuel. If we are to avoid catastrophic warming, our natural gas consumption has to peak sometime in the next 10 to 15 years, according to studies by both the Center for American Progress and the Union of Concerned Scientists.
As we use more and more natural gas, we can expect more and more methane disasters like the leak from Aliso Canyon in CA which was the largest methane leak in US history. This released over 100,000 tons of methane into the atmosphere and required 11,000 residents to be evacuated.
It makes me wonder how much of each element you could extract by "mining" fly ash dump sites.
>Average total REE content (defined as the sum of the lanthanides, yttrium, and scandium) for ashes derived from Appalachian sources was 591 mg kg–1 and significantly greater than in ashes from Illinois and Powder River basin coals (403 and 337 mg kg–1, respectively). The fraction of critical REEs (Nd, Eu, Tb, Dy, Y, and Er) in the fly ashes was 34–38% of the total and considerably higher than in conventional ores (typically less than 15%).
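Using the figures quoted above, the "critical" REE content of Appalachian-sourced ash works out to roughly 200-225 mg/kg:

```python
total_ree = 591                       # mg/kg, Appalachian-sourced fly ash
critical_lo, critical_hi = 0.34, 0.38 # fraction of total REEs that are "critical"

# ≈ 201 to 225 mg/kg of critical REEs (Nd, Eu, Tb, Dy, Y, Er)
print(critical_lo * total_ree, critical_hi * total_ree)
```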
First look at tables 2 and 3. The raw radionuclide release from the reactors, measured in curies, are far greater than those from the coal plant. Both reactor types release thousands of curies of xenon isotopes and hundreds of curies of krypton isotopes. The PWR also releases thousands of curies in the form of tritium. The coal plant releases less than 2 curies of all radionuclides combined.
Next look at table 4. The maximum whole body dose commitment, i.e. the dose that somebody might be exposed to if they lived 500 meters from the sources, just beyond the plant boundary, is highest for the boiling water reactor (4.6 mrem/year), then coal (1.9), then the pressurized water reactor (1.8).
But if you look at table 5, average population dose within an 88.5 km radius, you see that the population dose commitment over the whole region is lower for both pressurized and boiling water reactors (given the assumption that 100% of the food people eat is grown in the same region).
Table 6, "Population dose commitments from the airborne releases of model 1000-MWe power plants as a function of food intake", is the most interesting table in the article. It shows what parameter is most key to which power source produces greater population exposures: percentage of food eaten that is grown within the same region as the population. At 50% or more local food consumption, which is assumed in table 5, coal always exposes the regional population to a higher radionuclide dose than nuclear reactors. If people were eating 30% or less locally grown food -- which admittedly does not seem likely -- then the region's population could be exposed to more radiation from a nuclear plant than from an equivalent coal plant.
Why do nuclear reactors initially emit so much more than coal plants, measured in curies of radionuclides, yet generally expose human populations in the region to less of a body burden? Why does the relative population exposure ordering of nuclear power and coal change depending on how much locally grown food those populations consume?
The answers lie in the chemical and biological behaviors of the different radionuclides that respectively dominate emissions from reactors and from coal plants. "Radium-226 and radium-224 are the major contributors to the whole-body and most organ doses from the coal-fired plant. Assuming that the deposited radionuclides could enter the food chain, ingestion is the main exposure pathway for the population dose commitments from this plant (93 to 96 percent for the whole body and most organ doses, 83 percent for the bone dose, and 62 percent for the lung dose). ... Carbon-14 is the main contributor to the whole-body and most of the organ doses from both nuclear plants. Ingestion is the major exposure pathway."
Radium is chemically similar to calcium, so it is taken up by plants via the same pathways that take up calcium. It gets stored in the bones of exposed humans. Carbon of course is a major part of human and plant dry mass and also gets stored in organisms. These key isotopes are chemically available to plants and processed as nutrients in the bodies of both plants and humans.
Those thousands of curies of noble gases initially released by the nuclear plant? They matter orders of magnitude less when it comes to human exposure. Those gases can't chemically react with anything and diffuse throughout the whole volume of the atmosphere. Plants don't concentrate them and the human body can't store them.
The much less significant exposure route, "immersion," basically means exposure to radionuclides in the air all around you. Under most assumptions the immersion route will deliver a lower exposure to a regional population than ingestion via food. But if people in the exposed region are eating 30% or less locally-grown food, then the immersion route can become dominant and can lead to higher population exposure from nuclear power than from coal power.
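To make the crossover concrete, here's a toy Python sketch. The dose numbers are made up, chosen only to reproduce the qualitative behavior described above (ingestion-dominated coal, immersion-heavier nuclear), NOT taken from the report:

```python
def population_dose(immersion, ingestion_full_local, f):
    """Total population dose when a fraction f of food is grown locally."""
    return immersion + f * ingestion_full_local

# Hypothetical relative units, for illustration only:
def coal(f):
    return population_dose(0.1, 2.0, f)   # ingestion-dominated (radium in food chain)

def reactor(f):
    return population_dose(0.5, 0.8, f)   # immersion-heavier (noble gases, C-14)

# Below ~33% local food the reactor's immersion term dominates;
# above that, coal's ingestion pathway takes over.
for f in (0.1, 0.3, 0.5, 1.0):
    print(f, coal(f) > reactor(f))
```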
The Scientific American article does not touch on any of these interesting points. It paraphrases the original in a way that actually introduces mistakes. It does not explain why some reasonable assumptions lead to lower population radionuclide exposure from nuclear power than from coal power. The broad summaries remain similar but the mistakes and simplifications lend the Scientific American article an unfortunate air of "here's a nice simple conclusion to bash your friends with the next time they fret about radiation from nuclear power."
Maybe I should have phrased my original objection more strongly: the headline of this HN piece, the original headline in Scientific American, and many commenters writing here are unambiguously wrong about certain points. Coal ash is not more radioactive than waste from nuclear reactors. Ordinary commercial reactors, operating normally, emit far more curies of radioactive material than is present in the fly ash emitted from coal plants. The effective exposure of populations to radioactivity is however lower for reactors than for coal plants due to the differing chemical/biological characteristics of the different radionuclides emitted. That's pretty interesting! But that key point which produces the counterintuitive lower-effective-exposure result is completely lost in the SA article. Over the past decade I have mostly seen this Scientific American article used as a club to bash people who "just don't understand" nuclear power. It's a sad triumph of tribal affinity over comprehension.
Coal is certainly far worse than nuclear power when you broaden the criteria beyond radionuclide release. Most of the world's coal plants are still operating without state-of-the-art pollution controls for mercury, acid gases, and particulates. Even with modern emissions controls for acute pollution hazards, coal emits a lot of CO2 for each MWh generated. But the overall superior environmental and human health profile of nuclear power should not tempt people to spread falsehoods in its defense.
Instead, so far they seem to push the externalities onto others, by either using coal and spewing it all into the air, or not using coal but buying balancing power from elsewhere, where they do use nuclear power or coal power :).
I think we can agree both of these alternatives are morally dishonest.
Also, stop importing energy from other states while essentially banning energy development in your own (cough... California).
Chernobyl and Fukushima are two exceptional events. There are hundreds of reactors out there both civilian and military that have never had catastrophic faults.
Plus, as bad as radioactive incidents can be, they don't affect the climate. Once the radiation dies down or is mitigated through clean-up efforts there will be no lasting impact on the world. In other words, with the right insurance policy you can recover from a disaster.
Hiroshima and Nagasaki were both deliberately targeted by nuclear weapons and are still inhabited. Cleaning up the mess can be expensive, but it's not impossible.
Cleaning up the mess made by a coal plant is basically impossible, the effects are too far reaching.
Seriously? "Expensive but not impossible"?
I'd take a contaminated room any day over a whole planet slowly cooking itself to death.
Yeah, because technology advances magically if you will it enough...
Thus it builds up in our ground water and atmosphere with no warnings and little community notification.
To be clear, my suggestion would be to google "category error".
Not to mention that the waste fuel contains stuff that never existed before it was produced in a nuclear power station
No. Coal spews out radioactivity constantly. Nuclear emits essentially nothing during normal operation. Even counting nuclear disasters, emitted radiation and deaths per kWh are lower for nuclear. It is the safest per kWh, period.
> Not to mention that the waste fuel contains stuff that never existed before it was produced in a nuclear power station
Which is much easier to contain than fly ash.
Now let's talk about costs. Not idealized costs, but actual, non-hidden costs of building, maintaining, and decommissioning nuclear power plants (oh and storing the waste in effective perpetuity).
What's that, you think decommissioning would be too expensive and want to keep running the plant extra long with upgrades? Oops, that's how you get Fukushima.
The only reason we aren't reprocessing the 'waste' is an order issued by Jimmy Carter prohibiting reprocessing, to limit the proliferation of plutonium; we could be using it as fuel in other reactors.
This is why the real lesson is to build defenses in depth, with enough variety to protect against common mode failure. Fukushima Daini - only 12km away from Fukushima Daiichi - suffered similarly from the earthquake and tsunami, but a single surviving pump (and the skill and effort of the staff) was enough to prevent Daini from suffering the same fate as its sister power plant.
Good engineering asks questions like, "Are we safe when (not "if"!) the entire bank of backup generators fail?", or "Can a single event cause catastrophic failure?"