I am appalled by this. Professionals shouldn't be doing this.
Unfortunately, the gut response "I am appalled; this must change" by folks who don't know anything about how the lab operates (i.e., lay citizens and politicians) exerts pressure toward the latter, not the former.
The evidence was pretty solid that safety standards in the workplace are considerably higher than at home. Even relative to other industries performance was good.
This is only opinion, but I thought:
1) This was a good state to be in
2) It was bought on by ignorance of lay people and politicians who didn't understand relative risk
3) The situation reflected a political system that functions well without being very logical
It drives me crazy that we hold nuclear to a higher standard again even though the cost/benefit ratio appears to be better than many other industrial and non-industrial activities (eg, driving cars, using coal). People will cheerfully say 'but nuclear ... causes X', but rarely compare that with the benefits that were bought for X.
I once tried giving hosts Arabic names. It actually worked pretty well since the naming system lets you express inheritance and works better when spoken than "proxy X on testnet Y" or "proxyX dot testnetY" (especially when not everyone speaks English as their first language and some people may be joining the call from a cell phone in a car).
I don't think it would scale to an IT larger than ~10 or so.
Naming is the most vexing aspect of programming, so you really must share more, even if just to deflate expectations.
Edit: Answered my own question:
> keeping bits of plutonium far apart is one of the bedrock rules that those working on the nuclear arsenal are supposed to follow to prevent workplace accidents. It’s Physics 101 for nuclear scientists, but has sometimes been ignored at Los Alamos
The basic idea is that these sorts of objects are routinely seeing spontaneous nuclear decays -- there might be billions per millisecond of these. What's important is that each nuclear decay is an explosion which releases debris that can cause other nearby nuclei to decay as a result. So it becomes important to know: what is the average number N of nuclei which fall apart as a direct result of getting hit by the shrapnel of one decaying nucleus?
Since nothing is terribly exact in physics there are two regimes to consider, N < 1 and N > 1. In that first regime we can roughly calculate that we need to multiply this baseline billions-per-millisecond rate of spontaneous decays by the number
1 + N + N² + N³ + ... = 1/(1 - N)
For N > 1 the same multiplier holds but it cannot be summed infinitely: instead we realize that each term takes a slightly longer time period and thus the growth goes something like e^(k t) for some k, it's an exponential growth towards a majority of the sample reacting.
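To make the two regimes concrete, here's a rough sketch (the decay rate, N values, and generation time are all made up for illustration; this is back-of-the-envelope, not reactor physics):

```python
import math

def subcritical_multiplier(N):
    """For N < 1 the geometric series 1 + N + N^2 + ... converges to 1/(1 - N)."""
    assert N < 1
    return 1 / (1 - N)

# Subcritical: each spontaneous decay is amplified by a finite factor.
print(subcritical_multiplier(0.5))   # 2.0
print(subcritical_multiplier(0.99))  # ~100: the multiplier blows up as N -> 1

# Supercritical: the series diverges. Instead, each "generation" of decays
# multiplies the rate by N, so it grows roughly like e^(k*t).
generation = 10e-9            # seconds per generation (illustrative number)
N = 2.0
k = math.log(N) / generation  # growth constant in e^(k*t)
t = 1e-6                      # after one microsecond = 100 generations
print(math.exp(k * t))        # ~2^100, an astronomically large multiplier
```

The point of the last line is that once N crosses 1, even a microsecond is enough for the reaction to run away, which is why "criticality" is a hard threshold rather than a gradual slope.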
For fissile materials like plutonium, an easy way to increase N is to just bring two plutonium rods closer together: free debris from explosions in the one now cause new explosions in the other. This is called forming a "critical mass" hence the language about "criticality". Another way is to bring in these "neutron reflectors" that reflect the debris back into the same sample, it's basically the same principle.
Plutonium, not so much.
I think it's mentioned in Richard Feynman's account of his time in Los Alamos in 'Surely You're Joking, Mr. Feynman!'.
> On May 21, 1946, physicist Louis Slotin and seven other Los Alamos personnel were in a Los Alamos laboratory conducting another experiment to verify the exact point at which a subcritical mass (core) of fissile material could be made critical by the positioning of neutron reflectors.
> It required the operator to place two half-spheres of beryllium (a neutron reflector) around the core to be tested and manually lower the top reflector over the core via a thumb hole on the top. As the reflectors were manually moved closer and farther away from each other, scintillation counters measured the relative activity from the core. Allowing them to close completely could result in the instantaneous formation of a critical mass and a lethal power excursion.
> Under Slotin's unapproved protocol, the only thing preventing this was the blade of a standard straight screwdriver, manipulated by the scientist's other hand. Slotin, who was given to bravado, became the local expert, performing the test on almost a dozen occasions, often in his trademark blue jeans and cowboy boots, in front of a roomful of observers. Enrico Fermi reportedly told Slotin and others they would be "dead within a year" if they continued performing it. Scientists referred to this flirting with the possibility of a nuclear chain reaction as "tickling the dragon's tail", based on a remark by physicist Richard Feynman, who compared the experiments to "tickling the tail of a sleeping dragon".
An aside: It's this kind of shit that scares the crap out of me with regards to nuclear energy. Some - literal - cowboy completely disregards the rules, endangers himself and his colleagues, and then someone gets injured or killed. All for the benefit of showing off.
We should do better than this, but incidents like this give me little faith.
Unsurprisingly, this experiment killed Slotin:
> On the day of the accident, Slotin's screwdriver slipped outward a fraction of an inch while he was lowering the top reflector, allowing the reflector to fall into place around the core.
> [Slotin] received a lethal dose of 1,000 rad (10 Gy) neutron and 114 rad (1.14 Gy) gamma radiation in under a second and died nine days later from acute radiation poisoning.
When I look at the steps I used to go through multiple times a day when working in Cat 2 and Cat 3 biological containment facilities with cells and pathogens (double-door airlocks under negative pressure, safety hoods, protective gear, stringent aseptic practice, etc.), LANL seems quite lax in its practices. We all watched ourselves and each other for bad technique and careless infractions to maintain that discipline without the need for dedicated separate safety inspectors (though periodic inspections did occur). That good practice was also tied into preservation of self and others when working with dangerous stuff; no one wants infection by some horrible pathogen. It seems quite bad to be working with even more lethally hazardous materials with basically zero protection, and the attitude of management towards safety, both in 2011 and at the present day, seems equally cavalier.
In all the academic and industrial environments I've worked in, we had rigorous inventory and tracking of all dangerous materials (biological and radioactive), so it seems odd that LANL fails so badly here. This was the first casting in four years, and they immediately failed: why wasn't there preparation and planning for moving the material before the casting even started? Is Pu randomly stored around the place? Is there no oversight at all? In the lab I currently work in, we have to track every last trace of radioactivity (mainly P and N for biological labelling, I think; I'm not involved directly) and account for it in statutory reporting every quarter. If you fail to do it properly there would be a massive investigation and you would be banned from working with it; the lab managers make a big deal of it, and rightly so.
It's a little ironic that safety was compromised in order to meet production targets, yet this resulted in a complete shutdown. Had they worked safely and sensibly and avoided the shutdowns, they would overall have been vastly more productive, even if progress was slower than management would have liked. I've seen this pattern several times now in multiple places, from factories to research laboratories to software development. It all comes down to unrealistic management goals from the top, which dictate working at a fast pace with attendant quality and safety problems, no matter that a better end result could be realised by working at a slower pace with a little more care and thought. We see the same problem every time software design or implementation is compromised by a tight deadline with no scope for doing it the right way for the longer term, purely to meet some unimportant (in the greater scheme) short-term deliverable.
Slotin got himself killed after the war was over, but it was less than a year after, and I imagine that sort of "get it done at any cost" culture does not change overnight.
I wonder if that attitude is in fact the root of the trouble today. Especially since I bet that the "save the nation and damn the risks" attitude came back for at least a decade or two once the Cold War ramped up.
I've seen different cultures in various labs I've worked in or had contact with, and it can vary wildly from being extremely disciplined to extremely sloppy and both practices can be picked up by new people from the individuals concerned.
Thank god for solar
It's pretty scary how mundane they look.
I worked in a startup where they had put the Systems Operations (as in, Linux datacenter guys) under the Chief Marketing Officer for about a year for (reasons).
Three years later, we are STILL cleaning up after that mess. Wrong management creates problems that aren't so much a result of ineffective management... wrong management seems to always move resources away from things that most engineers would call "normal and customary processes" (like patching software or updating libraries that software is dependent upon) and that's how you end up with Equifax. Or a smoking hole in the desert somewhere in New Mexico.
Unqualified people making bad decisions are always fatal.
That sounds almost suspicious. It's hard to actually articulate that suspicion, but nuclear material is involved here.
From earlier paragraphs:
> In 2013, [officials in Washington] worked with the lab director to shut down its plutonium handling operations so the workforce could be retrained to meet modern safety standards.
> Those efforts never fully succeeded, however, and so what was anticipated as a brief work stoppage has turned into a nearly four-year shutdown of portions of the huge laboratory building where the plutonium work is located, known as PF-4.
My first thought is "front for clandestine operation." How, or what, I don't know; I don't even know if something like that would be viable. If a TLA wants to play with nuclear stuff, couldn't they make their own base, or would they need an existing one? And if they did need to use these kinds of facilities, surely they'd be able to keep the media out.
Okay maybe I'm wrong. I'll leave this comment here as a suggestion that maybe this line of thinking isn't correct after all. (I've been figuring this out as I've typed.)
> "What's in section PF-3?"
> "That's classified, I can't discuss it."
> "What's in section PF-4?"
> "That's where we used to work on plutonium for DoE and IAEA but we had a pattern of problems there and it was shut down."
(Along with, uh, "do not attribute to malice that which can be explained by manglement")
Within 12 months of me starting there everyone more senior than me had scattered elsewhere. Soon I was gone too. This happened at every level and in every division. The amount of know-how lost just to forced retirement is incalculable...
So sad to hear things are not getting better there. I was very proud to be a part of it for even a little amount of time.
I left in less than a year for another job. Part of me wishes I would have been able to pull through, but part of me thinks I wouldn't have had a job in a few months. There's no way to really know. Going there for the first time was a genuinely magical experience. I'll never forget it.
At the end of the day, change is good. But more consideration needs to be put into the complexity you find at national labs.
I heard great things about LANL during my job. I hope things do improve over there.
In the old days the labs were run by stoic multinationals like DuPont; now we have fly-by-night bidders like DynCorp and Fluor running our crown jewels.
Race to the bottom if you like.
A rather strange process for something so important.
In budget planning, contractors disappear at the end of a project, which is very convenient for the planners. But in most cases it’s not like those contractors are suddenly without a job, except in rare cases like shuttle workers. Rather, the agency has a fixed, flat budget, so when one project winds down a new one starts up. And it is no surprise that in nearly all cases the skill-set allocation of new positions roughly equals those going away. And the same people fill those roles, generally. (Again, the shutdown of the shuttle and forced retirement of many workers is the exception to this.)
During my time there was a competitive rebid of the contract. We won, but during the process the competition approached many of us on the contract individually, letting us know what sort of compensation they would be willing to pay for us to jump ship if they won, to return to the same job wearing a different patch on the sleeve. My civil servant “boss” (air quotes because he wasn’t my manager on the org chart, but in day-to-day reality he effectively was) pulled me aside to let me know that no matter who won I’d still have a job. It was actually the same thing my employer did when they first won the contract from someone else. I wasn’t there at the time, but many of my coworkers were grandfathered in that way.
While what you are saying is correct, it might paint the wrong picture for someone not having our shared experience.
Watch  because reading the rest just makes everyone dizzy. Our launch protocol relies on very outdated technology. Not that there is anything wrong with old technology in general (as long as the protocol is well-established)...
We don't even want to bring up the recent unfortunate Navy crashes... we ought to really step up.
"Shoot yourself in the foot."
Plus, being so dated, it's far less likely to get hacked. Might've been a joke.
what matters is how you use the old tech.
An example of this might be something like "all plutonium rods are individually stored in locked lead boxes". (I'm no nuclear physicist, this might be a terrible idea, but you get the point).
It sounds to me like the leadership at this facility don't understand that. These same things keep happening. They don't change their processes enough to prevent it.
That's coming. Nuclear generation will only shrink from now on. In the solar/wind/natural gas context, the public refuses to pay more for nuclear. With decreasing need for fuel as existing power plants succumb to age and poor management, eventually these labs will be seen as the hideous dangerous boondoggles they have always been.
From reading the ScienceMag article that nerdy linked to, I learned that PF-4 has been unable to make new weapons for 4 years due to their safety staff quitting. I joined some anti-nuclear protests before, but those protests aren't targeting any individuals.
Meanwhile my best-case scenario (shutdown of weapons production) is happening, because the staff quit due to bad management.
Good things are happening for the wrong reasons!
We've seen just about the worst that can happen: TMI, Chernobyl and Fukushima. All three were due to human error: mismanagement of the system and/or mismanagement of the mitigation afterwards. These were terrible things nobody would want in their town, but they are manageable and bounded.
Compared to the global disaster in progress, humans burning things, a nuclear plant mishap seems preferable, no? Globally, we're facing a mass extinction, more intense weather, and loss of low-elevation islands and cities everywhere. What's worse is the effect is not bounded; it will continue to get worse for centuries as we burn more stuff, for the whole planet.
Is this math not compelling?
"terrible things nobody would want in their town".
Where the natural risk is higher (e.g. Fukushima), my answer is "just say no". Where the risk is manageable (deserts, a bunker under the Alps), I don't mind so much.
Nuclear is better than coal or oil, by a long way. Renewables would be great, but only hydro has a real chance to be competitive at this stage. So maybe nuclear is needed in some less-populated, geologically stable regions. I trust location more than any human safety protocol, because of the reasons in the article.
This is a world of trade-offs. It is easy to point to the bad in one approach, but you must also evaluate the bad of the alternatives.
The staff effectively said "if our work is given so little heed here, we'll go elsewhere where it is" -- a classic brain-drain mechanism (and a frequent response to a declining firm or corporate culture).
That's also effectively a mechanism of the generalised concept of Gresham's Law -- applied not only to money, but to any quality valued (or costed) differentially, whether in one or multiple markets. A parallel that dates to the earliest descriptions of the phenomenon -- Greek playwright Aristophanes in "The Frogs" describes the behaviour as common to both coin and politicians, an observation repeated by American journalist H.L. Mencken in the 20th century, see his "Bayard vs. Lionheart".
A high-quality (and high-cost) team saw low professional rewards at Los Alamos, and decamped for greener pastures. Brain drain occurs for various causes, not just compensation, but if the conditions of work, the rewards, or the level of oppression are discouraging in one location, the talent will generally go elsewhere.
In WWII, much scientific (and other) talent, including much of that which developed the U.S. nuclear programme, fled Nazi-occupied Europe. From the 1950s through the 1970s, and to an extent still, black artistic, musical, and business talent leaves the U.S. for Europe, for much the same reason: to escape oppression, and to seek greater opportunities.
(Talent flow between subnational regions, industries, academia and business, etc., follows similar patterns.)
The US forgot how to make nuclear weapons. It's been decades since the US built one from scratch. For about 20 years, the US lost the capability to make H-bombs. There's some special material required, and the 1950s factory to make it had worn out. An attempt was made to make it by a cheaper new process, and that didn't work. A plant using the old process wasn't funded for decades.
The US has way too many old nuclear weapons, and has been overhauling them during this period. There's no shortage. The fissionable parts don't wear out, but the tritium has to be replaced every decade or so.
This is a tragedy waiting to happen.
These men are heroes, risking their lives to shut down LANL's production of nuclear warheads.
There are cultural issues. I remember reading something about "mid-career" mech engineers having five years' experience. Which is great if you graduated at age 55 or so, not so good if you graduated at 22 with loans that will take more than 10 years to pay off. Meanwhile there's a handful of "lifer" boomers who clog up the promotion pipeline until death, at which point the talent pool is empty between the ages of perhaps 30 and 60.
It's a poor-conditions, underpaid, temp job, more or less. Insert surprise that people would rather work anywhere else.
If you've ever worked in an industrial or laboratory setting, you'll be familiar with risk assessments, hazard levels, and the attendant working practices that accompany them. Sometimes it's taken to bureaucratic extremes, but it's always important to follow them strictly, because as soon as you start ignoring them and taking shortcuts, you're no longer working safely and you're endangering yourself and your co-workers. That's complacency, and it's a bad place to be. I've seen a co-worker grow complacent about biohazards, and they ended up in hospital with a nasty tropical parasite infecting them, all because after several years of strict discipline, they grew complacent about the danger because it's invisible and you get sloppy in your well practiced routine (I assume; even they don't know exactly how it happened, but it was almost certainly due to sloppy working practice). The same applies here; this is very dangerous stuff but it looks innocuous and working with it leads to trivialising the danger and working unsafely. In a well managed environment, this should be being picked up on quickly by co-workers and inspections. Where I've worked, any violation would mean a formal report up to line managers and lab managers with appropriate disciplinary action. And I have done so when I encountered it, for the safety of all of us. Safety culture needs to be ingrained so that it's second nature.
One angle to look at this is that "material transfer" is actually the primary activity taking place in most industrial and laboratory settings. When I worked in a food/drink industry lab, the logistics of the whole multi-stage process from input raw materials, processing and production, to packaging materials, packaging, warehousing and distribution were all carefully planned and controlled (by an AS/400). When I worked in pharma, all the compounds were in a central robotic compound library, and everything checked in or out was controlled, and all operations in the laboratories were automated and controlled. What I'm trying to say is that the logistics of material handling, inventory and transfer have been solved in many industries for decades. There's no question of where inventory lies because it's all recorded from start to finish. You don't have a hold up because the warehouse is full, you run out of packaging materials, or there are no empty tanks to fill from the previous step in the pipeline, because you have a total view of the process and can plan all the logistics to manage the process optimally. Random materials are just not lying around to clog up the process. There's a managed process with careful oversight and record keeping, and a safety culture ingrained into all workers and management from the start. LANL seems to be very backward in these respects.
Natural uranium is pretty unremarkable. The greatest danger from it is heavy metal poisoning. It's also common enough that restricting its availability would be rather difficult.
Strictly speaking, if you got enough of it and were sufficiently determined you could create an improvised nuclear reactor with it, something along the lines of a carbon-moderated design. But the risk from that would be fairly minimal apart from some local radioactive contamination.
Perhaps they should use software to track the movement of materials which would forbid the move from taking place when the end result would be dangerous?
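A minimal sketch of what such a check could look like. Everything here is hypothetical: the location names, the idea that a single per-location mass limit captures the hazard (real criticality safety depends on geometry, moderation, container spacing, and much more), and the numbers themselves.

```python
# Hypothetical per-location mass limit, in grams. Purely illustrative;
# a real system would encode far more than a single mass threshold.
MASS_LIMIT_G = 5000

# Invented inventory: location -> grams of material currently held.
inventory = {"vault-A": 4800, "bench-3": 300, "vault-B": 1000}

def transfer(src, dst, grams):
    """Move material, refusing any move that would exceed the limit."""
    if inventory[src] < grams:
        raise ValueError(f"{src} does not hold {grams} g")
    if inventory[dst] + grams > MASS_LIMIT_G:
        raise ValueError(
            f"refused: {dst} would hold {inventory[dst] + grams} g, "
            f"over the {MASS_LIMIT_G} g limit"
        )
    inventory[src] -= grams
    inventory[dst] += grams

transfer("vault-A", "bench-3", 200)      # allowed: bench-3 ends at 500 g
try:
    transfer("vault-B", "vault-A", 500)  # would put vault-A at 5100 g
except ValueError as e:
    print(e)                             # the dangerous move is blocked
```

The interesting design point is that the check runs before any state changes, so the system can never record a transfer it should have forbidden. Of course, as other comments note, software only helps if people actually enter the data.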
This begs for "Nuclear Material Blockhain"! ;-)
OT, but I've seen this phrase several times, and I have no idea what it means. The two component words seem contradictory, like "preppie burnouts" or "considerate bullies". Is this irony? If so, what are we attempting to indicate about firms staffed by these people? Simply that they employ multiple groups who annoy us?
"[those who] reject the culturally-ignorant attitudes of mainstream consumers ..."
"[those who tend] to mimic stereotypical "jock", "bro", or "cool" culture in combination with the egotism, insensitivity, and terrible humor of "nerd" culture"
Combine the two and you get:
Narcissistic jock bros that are also egotistical insensitive nerds
New software is just another thing they'll turn off, not enter data into, or bypass for convenience.
You need to code them a new culture.
In the factory I worked at, the physical card was required to be physically handed over from department to department, and the information also put into the computer before the next processing stage could proceed. This was enforced; I once was part of a panic when some of the "tank cards" (which were the physical tokens representing the contents of storage tanks) went missing. They were found in a lab coat pocket after some frantic searching. But because the rules were enforced, this did block the physical process. And further delay and we'd have got an internal fine of £10000 which was another incentive for not losing track of them!
Famous (in New Zealand) anti-nuclear-weapons speech during a debate at the Oxford Union.
I feel like browser manufacturers need to make "This website would like to embed itself into your browser and send you push notifications indefinitely" pop-up a little bit more scary if websites are getting this scummy.
But pretty much every other prompt has been obnoxious.
We've got corporations with global ERP/inventory systems...yet these guys can't keep track of stuff within one facility?
These people aren't responsible enough to be working in the nuclear industry.
Some analysis from a former LANL chemist: https://nucleardiner.wordpress.com/2017/06/19/a-critical-pro...
Would you drive across the intersection through a red light, in violation of the rules? Maybe the risk varies during the day as the traffic level varies, but whatever the time, there's always the chance of a collision if you go through a red light, but almost guaranteed safety if you stop at the red and wait for the green.
The same considerations apply to dangerous situations at work, such as this case. The hazard is severe, and the risk is high. You avoid the problem by following the rules and working safely. It doesn't matter that they "weren't even close to critical", the situation should never have had the possibility of occurring in the first place. You don't want to rely on a probably OK, you want absolute certainty or as close to that as you can get, otherwise you're playing the odds and it is only a matter of time until there's a serious accident. That's not how things work in serious settings. Just as you would (I hope) not take the wholly unnecessary risk of running a red light, you wouldn't take unnecessary risks with potentially critical masses of Pu.