Power plant executives to face Fukushima charges for first time (washingtonpost.com)
42 points by janisS on July 31, 2015 | hide | past | favorite | 33 comments


Wow, the concept of citizens' panels sounds awesome. I wish we had something similar in the US. It's the perfect solution for dealing with police shootings, where prosecutors are reluctant to indict an officer (since they need to work closely with police in their everyday work).


A pre-trial in the court of public opinion? I thought that was the purpose of grand juries. After all, a grand jury would indict a ham sandwich. (Though not all pork, evidently.)

On one hand, it's a massive waste of judicial bandwidth. On the other, it's an effective check on the coziness between AGs, DAs, and law enforcement organizations.


The difference seems to be that a US grand jury can indict on a prosecutor's motion (and even, IIRC, in certain cases independently), but no one is obligated to prosecute based on an indictment once it is issued (obviously, one sought by a prosecutor will usually be followed up on by that prosecutor).

But, from the description in the article, when a citizens' panel finds that charges should be issued after a prosecutor has decided not to prosecute, the case is reopened by prosecutors. Assuming this isn't just a reconsideration where the prosecutor can again decide not to actually prosecute the charges, this is a fairly substantial difference.


Isn't this the role fulfilled by Nancy Grace? When she decides somebody needs to be prosecuted, she talks about it on her show until charges are filed. (And then pronounces the defendant guilty.)


Or, as we know it in the UK, trial by tabloid :-(


On the other hand, Japan also has the concept of "guilty until proven innocent". I don't know if or how it applies in this case, but you might be making a monkey's paw wish there.


If history is any guide, they'd mostly be used to indict minorities for crimes when prosecutors wouldn't go forward with the case: https://en.wikipedia.org/wiki/Lynching#United_States


NOVA's documentary on Fukushima [1] is a very informative watch and shows the immense challenges faced by TEPCO.

I don't think they will be convicted.

[1] - http://www.pbs.org/wgbh/nova/tech/nuclear-disaster.html


The blame should not rest with the operators, but with the decision makers who signed off on a fatally flawed plant design. Really, you'd think it was common sense: if you absolutely need these generators in case of a disaster like a flood, don't put the generators and their gear/infrastructure below flood level! Much the same mistake was repeated many times in Houston and came to light after Hurricane Ike.

Other things that came to light that were shown in the video:

Completely passive backup cooling systems that were dependent on actively-actuated valves for proper functioning. (Should have defaulted open in case of a power failure.)

Dependence on sensors that would fail without power.

Dependence on sensors that would provide dangerously misleading readings in precisely the most dangerous situation. (Water already boiled away)

I think it's a good video to watch if you are doing operations of any kind that require worst-case thinking and planning.


Common sense isn't common.

But the issues you mention (and others) have been solved by the aviation industry. Nuke plants and the Deepwater Horizon oil rig could have benefited immensely from consultation with airframe engineers. A lot of those faults that doomed them could have been inexpensively corrected.

I always remember the backup generators in New Orleans that were put in the basements. Precisely the time you'd need the backup generators was when the basements were flooding. Oops.


In hindsight it is obvious, but at the time of construction, you would have to justify spending money to protect against both a 9.0 earthquake (pretty rare) and a tsunami larger than one ever recorded at the same time. Which can only happen if you have the 9.0 quake right off your coastline.

At the time the plants were built, there was no geologist on the planet who believed Japan could even have a 9.0 quake, or a 30 meter tsunami (which you needed the quake for anyway). Thus, at the time, the plant was overdesigned for all possible scenarios.


This is why defense in depth is important. Even without anticipating a 9.0 earthquake, they could have anticipated that their floodwalls would fail for some unspecified reason, and designed the plant to be resilient to flooding in the event of floodwall failure.


The faults could have been corrected at only minor cost.


The people who lived near the plants lost everything, perhaps 80,000 of them. I'm not sure what became of them, but I'm pretty sure no one came up and said, here is a check for the fair market value of your property. It was more like, you can live in this cardboard cubicle in this community center now. I'm sure they've moved on by now.

I'm sort of surprised they weren't treated better. All of the nuclear reactors in Japan are shut down now, and the financial interests would like to see them turned on. But how do you expect public support with 80,000 refugees from the meltdowns who are grumpy about it?


Global warming will cause far worse problems than Fukushima did.


Can someone kindly explain how the photo of Norio Kimura and its caption relates to the article?


I was wondering the same thing. It feels like the article is implicitly blaming the Fukushima meltdown for the tsunami somehow.


About time.


The article indicates that Fukushima was the "world's worst nuclear disaster since Chernobyl in 1986," but let's not beat around the bush: by now many of us have come to the conclusion that this is the worst nuclear disaster ever, as the disturbing ecosystem collapse [1] currently underway in the Pacific Ocean would seem to confirm.

When Chernobyl had its meltdown, they weren't pumping up to 400 tons of radioactive waste into an ocean every single day [2].

Either way, I'm surprised but delighted to see the Washington Post bring the issue to the forefront today.

[1] http://enenews.com/govt-official-chilling-report-pacific-oce...

[2] http://tass.ru/en/world/759657


There are about 1.87 x 10^20 gallons of water in the Pacific Ocean, meaning that "radioactive waste" is 2.1 x 10^-10% of the body of water itself.

The waste we're talking about is HTO --- tritiated water --- which is a low-energy beta emitter that has intrinsically low bioavailability, because it is literally just water and is eliminated quickly.

Before developing an opinion about how terrifying this radiation leak is, a good number to have handy (exercise for the reader) is this: over the 12-year half-life of tritium, assuming 400 gallons pumped into the ocean every day for 4,384 days, to what percentage of the Pacific Ocean's background radiation are we talking about elevating it?
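Setting that exercise up in code (a rough sketch only: the tritium concentration of the discharged water below is a purely hypothetical placeholder, the Pacific volume is approximate, and decay over the period is ignored, which overstates the result):

```python
# Back-of-the-envelope dilution estimate for the tritium discharge.
GALLON_L = 3.785
daily_discharge_l = 400 * GALLON_L       # 400 gallons/day, per the comment
days = 4384                              # ~12 years of discharge
assumed_tritium_bq_per_l = 1e6           # HYPOTHETICAL tank concentration

pacific_volume_l = 6.6e20                # ~6.6e17 m^3 of seawater

added_activity = daily_discharge_l * days * assumed_tritium_bq_per_l  # Bq
added_conc = added_activity / pacific_volume_l                        # Bq/L

background_bq_per_l = 12.0  # natural K-40 activity of seawater, ~12 Bq/L
fraction = added_conc / background_bq_per_l
print(f"added: {added_conc:.2e} Bq/L, {fraction:.2e} of background")
```

Even with a deliberately generous concentration assumption, the fraction of background comes out vanishingly small, which is the point of the exercise.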

Another number, which will not make you feel better about the world, is the elevation in background radiation produced by the coal plants it would take to offset all the power produced by nukes.

Finally: if you believe that HTO leaks from TEPCO are, or are going to be, responsible for mass die-offs of marine life, you're going to have to account for the fact that we basically carpet-bombed the oceans with HTO during the insane nuclear weapons testing of the 1960s; nothing TEPCO is doing will come close.


In an attempt to grasp how toxic the water is, wouldn't it be more effective to measure levels of cesium 137 instead of overall background radiation? I don't care what background radiation the entire Pacific has; I care whether the sushi I'm eating has ionizing, cancer-causing particles in it.

"Michio Aoyama’s initial findings were more startling than most. As a senior scientist at the Japanese government’s Meteorological Research Institute, he said levels of radioactive cesium 137 in the surface water of the Pacific Ocean could be 10,000 times as high as contamination after Chernobyl..."

http://www.nytimes.com/2014/03/17/world/asia/concerns-over-m...


Your analysis might confuse two different phenomena.

When you mention TEPCO pouring "400 million gallons" into the Pacific, what you're talking about is them dumping contaminated cooling water from tanks into the ocean. The scale of that dumping is caused by (a) the ongoing need to pump water into the compromised reactor to cool it and (b) the large amounts of water they've already stored. However: that water is also filtered, to remove the (actually dangerous) Sr-90. What's being dumped into the ocean is HTO, not Sr-90 or Cs-137.

On the other hand, the meltdown at Fukushima contaminated the entire area with Cs-137, most of which is in the soil, sediment, and sand. The Cs-137 contamination is much worse than the HTO contamination. However, it is also not ongoing; in fact, the cesium levels detected around the plant have fallen dramatically in the last two years.


Fair enough - though I'm not sure I will just take your word for it that the potential for new releases of cesium 137 is not an ongoing threat from Fukushima Daiichi. The only way we can know for sure, I suppose, is through the efforts of independent researchers brave enough to get close to the facility.

In the spirit of HN, it would be neat to see an open technology solution for monitoring purposes, i.e.:

  if(waterSample.cesium137 > 0.001) return ALARM(waterSample)


> it would be neat to see an open technology solution for the purposes of monitoring

It's trivial to do so. Decay of Cs-137 releases a 662 keV gamma ray that is easily measured and the count rate is proportional to the source activity (or ultimately the total amount of Cs-137 present). You can calibrate an inexpensive NaI detector such that it will tell you how much Cs-137 is in a given volume of water. If you place it next to a pipe that has a constant flow rate, you can infer the average amount of Cs-137 in the liquid flowing through the pipe. It's something that you can build in an afternoon if you know what you're doing and have the equipment.
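As a toy version of that calibration step, the inference is just the net photopeak count rate divided by the absolute detection efficiency and the 662 keV branching ratio. The efficiency figure in the usage comment is a made-up placeholder; in practice it comes from calibrating against a source of known activity:

```python
# Infer Cs-137 activity from a NaI photopeak count rate (sketch).
BRANCHING_662 = 0.851  # fraction of Cs-137 decays yielding a 662 keV gamma

def activity_bq(count_rate_cps, abs_efficiency):
    """count_rate_cps: net counts/s in the 662 keV photopeak.
    abs_efficiency: geometry-dependent absolute detection efficiency,
    obtained by measuring a calibration source of known activity."""
    return count_rate_cps / (abs_efficiency * BRANCHING_662)

# e.g. 100 cps at an assumed 2% absolute efficiency:
# activity_bq(100, 0.02) ≈ 5,875 Bq in the measured volume
```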

This is pretty much how they monitor liquids for contamination in a real plant, except they use more detailed spectral measurements to monitor multiple isotopes. If you ever have the rare opportunity to go into a reactor control room, there will be a display somewhere that reads out this exact measurement.


Good info!

If the technology is cheap, as you point out, then the next logical step might be a collaborative project to get a network of inexpensive, miniature, buoyant craft out to sea for the purpose of actively measuring levels of cesium 137, sharing the results for everyone to see, graph, and check at any given time of day.

One-time results from a fish are useful data, but having a whole swarm of devices actively monitoring levels in various locations would be ideal.

Thinking ahead, the next hurdle could be the logistics of internet connectivity - maybe they could connect to each other in a mesh network that daisy-chains back to an internet connection closer to shore. Oh, and power (solar panels maybe?). Navigation. Yeah - some challenges for sure, but it all seems within reason.


Somebody did this. Before the experiment, they expected a background level of cesium 137 of between 1 and 2 Bq/m^3 (caused by 1950s nuclear testing, which released a ton of it). They found actual levels between 1 and 2 Bq/m^3 across the Pacific, with some small variation above and below: http://ourradioactiveocean.org/results.html

Think about it this way. Worst-case estimates are that 2-4 kilos of cesium 137 were released. If it all ends up in the ocean: the total Pacific Ocean weighs on the order of 6.38 x 10^20 kilos, so the total new cesium 137 is about 6.27 x 10^-19% of the ocean. At a 30-year half-life, that's one decay event per 70 liters per hour.
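That last figure can be roughly sanity-checked from the half-life alone. A quick sketch using the parent's inputs (3 kg as the mid-range release, a Pacific mass of ~6.38 x 10^20 kg); depending on rounding and input choices the answer moves around, but it stays within an order of magnitude of the figure above:

```python
import math

# Decay rate per liter of seawater if all released Cs-137 mixed uniformly.
HALF_LIFE_S = 30.17 * 365.25 * 86400        # Cs-137 half-life, ~30 years
grams = 3000.0                               # mid-range of the 2-4 kg estimate
atoms = grams * 6.022e23 / 137.0             # Avogadro's number / molar mass
activity_bq = math.log(2) / HALF_LIFE_S * atoms   # total decays per second

pacific_liters = 6.38e20 / 1.025             # ~6.38e20 kg at ~1.025 kg/L
decays_per_liter_hour = activity_bq / pacific_liters * 3600
print(f"one decay per {1 / decays_per_liter_hour:.0f} liters per hour")
```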


Thanks for the heads-up - but there's nothing wrong with some competition, not to mention getting more coverage; for example, it doesn't look like they have any data near the coast of Fukushima prefecture, which is the most important spot. Also, the monitoring does not appear to be real-time, and if we're getting nitpicky, there are certainly a few UX issues with how the data is presented.

On a related note, I'm sure Ken Buesseler is a good guy and his site is nice and I have nothing bad to say about him; I don't know him - but you can find comments about him from people who claim he's a "nuclear shill" and so regardless, as a matter of principle, it would be prudent to get more independent researchers involved in gathering data.


> but you can find comments about him from people who claim he's a "nuclear shill" and so regardless as a matter of principle it would be prudent to get more independent researchers involved in gathering data.

You can also find a large number of people on the Internet who claim Obama is a lizard alien (http://www.ibtimes.co.uk/polling-theory-embarrassment-453096). This comparison is hyperbole, I know, but his data and methodologies are peer-reviewed and support his claims. He's far from the only person studying the issue; it's a hot topic (pun intended): https://scholar.google.com/scholar?hl=en&q=fukushima+radiati...


I think the exact position of the dumping matters though. Unless the ocean is well mixed, the concentration will not be uniformly distributed. For instance, the entire stream might stay integrated as it travels along deep ocean currents (not saying that's actually the case by any means).


Not going to happen, nor is there much reason to do so. For one, the NaI detectors are relatively inexpensive, but still you're talking about $10K each minimum for such a setup. There are other detectors that would work for this (EJ-309 for instance, you need a detector capable of energy spectroscopy), but they're even more expensive and often extremely toxic. What I was describing is a setup to measure contamination of the water while it's still at the plant.

These sorts of statistics are monitored and the data is available ( http://www.biogeosciences.net/10/6045/2013/bg-10-6045-2013.p... ), but the contamination levels are so low that you have to use a very different method. Basically they go out and collect a bunch of sea water (linked report uses 100 liters), then filter it through a microfilter and the filter, which traps all of the cesium, is measured back in a laboratory using a very sensitive HPGe detector (which cost $100K+ and require cryogenic cooling at all times).

The measured activity levels in that report were in the range of about 1-15 Bq/m^3 of sea water. That's 1-15 atoms disintegrating per second (there's ~10^29 atoms in a cubic meter of water). Those levels are far too low to reasonably measure at sea, end of discussion. Even the highest observed dispersal of about 140Bq/m^3 (which was actually Cs-134 and was in a small area immediately after the disaster) would be very tough to measure in real time like you suggest.
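For scale, the ~10^29 figure follows from the molar mass of water alone:

```python
# Atoms in a cubic meter of water (treating it as pure H2O).
AVOGADRO = 6.022e23
atoms_per_m3 = 1000.0 / 0.018 * 3 * AVOGADRO  # kg/m^3 ÷ kg/mol x atoms/molecule
# ≈ 1.0e29, so 1-15 Bq/m^3 means roughly one disintegration
# per second among ~10^29 atoms.
```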

Much more practical is to monitor the rate of dispersal at the outlet (i.e., the pipes pumping water into the ocean at Fukushima) and simulate the fluid dynamics to calculate the dispersal. These results you then validate by comparing to spot measurements taken in the manner described in the linked paper. In fact, this is again exactly what they do. This work is widely published and openly available, but not well known to the general public because it's rather technical. My own research work is on a related subject, but different application (finding the position of a concentrated radioactive source in an urban environment).
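As a cartoon of that simulate-then-validate approach, here is a minimal 1-D advection-diffusion model of a continuous coastal source. Every parameter is an illustrative placeholder; a real dispersal model would be 3-D and driven by measured current fields:

```python
import numpy as np

# Toy 1-D advection-diffusion of a continuous point source.
nx, dx = 200, 1000.0   # 200 km domain in 1 km cells
u, D = 0.05, 50.0      # current speed (m/s), eddy diffusivity (m^2/s)
dt = 5000.0            # s; CFL number u*dt/dx = 0.25, diffusion number = 0.25

c = np.zeros(nx)       # concentration, arbitrary units
for _ in range(int(30 * 86400 / dt)):   # simulate ~30 days
    c[0] += 1.0        # constant injection at the shoreline cell
    adv = -u * dt / dx * (c[1:-1] - c[:-2])                # upwind advection
    dif = D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])  # central diffusion
    c[1:-1] += adv + dif
# Concentration falls off with distance from the source; spot samples at a
# few downstream cells would play the role of the validation measurements.
```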


Ok, thanks for clarifying. I guess that ups the challenge quite significantly.

Your research sounds interesting and vitally important - the kind of work that could even help save lives, or at least warn people of nearby dangers. Do you have anything public on it?


Nope, still in development. It'll be freely available once it is. My work wouldn't make a particularly good read anyway as it's intended for a specialist audience and the mathematics get pretty involved.

For radiation detection in general though, probably the most approachable introduction is this book http://www.amazon.com/Techniques-Nuclear-Particle-Physics-Ex... . You can find a free PDF copy if you google around a bit.


Cool, thanks, and good luck with your project.



