Demon Core: The Strange Death of Louis Slotin (2016) (newyorker.com)
46 points by BerislavLopac 56 days ago | 22 comments



I've always found this story fascinating, because it's such a clear example of the role of operator error in safety. Both Daghlian and Slotin were experts in the work they were doing, and both were well aware of the danger of getting it wrong. Still, they made a mistake. As people do.

On one hand, the mistake was the effect of time and schedule pressure. Some of that was real, but some was also illusory (as shown by the fact that Los Alamos could stop doing those experiments entirely and still deliver). They chose the approach they did because it was the easiest. But not only that - at least in Slotin's case, he chose the approach he did because he didn't believe he would make a mistake. He'd done this a bunch of times. The danger had become routine, an idea captured in Diane Vaughan's books as "normalization of deviance", and in economic and sports research as the familiarity heuristic ("I know this, so I'm safe").

On the other hand, the experiment itself set them up to fail. Minor tweaks with near-zero cost impact, like bringing the tamper up from the bottom rather than down from the top, would likely have saved both men. My understanding is (and it's hard to get clear evidence of this) that both men designed their own experiments. Both were in a position to do it in a safer way at no additional cost. With hindsight, both likely would have. In Slotin's case it seems like the commitment heuristic ("I want to be consistent with my past actions") played a role in him doing something he knew to be dangerous "one last time". In Daghlian's case, there seems to have been some role for the scarcity heuristic ("if I don't get this done tonight after the party, I might not get another chance").

This all goes to show how fallible we are. Not just these two, both extremely smart people: we take irrational risks all the time. The big takeaway here is that systems need to be safe even though people are going to make bad decisions for bad reasons. One extremely effective way to do that is to separate design from implementation, using our rational decision-making processes to make the hard decisions ahead of the moment. Then write the decisions down. Then follow them in the moment.


A more mundane and perhaps more relatable version of this is the relaxation of key driving skills once the operator has become comfortable behind the wheel. It is not unusual to see someone performing maneuvers that are only "safe" under assumptions that cannot be strictly true but are statistically likely to be true (e.g. there is nobody coming the other way around this blind corner, there is no cross traffic this late at night, whatever). I doubt people consciously think of it as a dice roll every time they do it, but it is. Over time you get comfortable doing these things simply because nothing bad has happened, but if the assumption fails, you will likely crash.


I can't seem to find the statistics, but I've always wondered if there's a secondary peak in accident rates around ~1-2 years of driving experience.

Hypothesis: at that point you feel comfortable, but you haven't accumulated the full spectrum of experience.

I'd expect the relevant statistic would be accidents per person, by age. But everything seems to normalize per miles-driven (where no such effect is apparent) or against total accident rate.
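
To illustrate (with made-up numbers, not real accident data): a per-person peak at 1-2 years can be completely invisible in per-mile data if newer drivers also drive fewer miles. A quick sketch:

    # Hypothetical cohorts: (accidents, drivers, total miles driven per year).
    # The numbers are invented purely to show how the two normalizations
    # can tell different stories.
    cohorts = {
        "0-1 yrs": (120, 10_000, 60_000_000),
        "1-2 yrs": (150, 10_000, 90_000_000),
        "2-5 yrs": (130, 10_000, 110_000_000),
    }

    for name, (accidents, drivers, miles) in cohorts.items():
        per_person = accidents / drivers             # accidents per driver per year
        per_million_miles = accidents / miles * 1e6  # accidents per million miles
        print(f"{name}: {per_person:.3f}/driver, {per_million_miles:.2f}/million miles")

    # Here the per-mile rate falls monotonically (2.00, 1.67, 1.18) while the
    # per-person rate peaks in the 1-2 year cohort (0.012, 0.015, 0.013).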


> a clear example of the role of operator error in safety

> The big takeaway here is that systems need to be safe even though people are going to make bad decisions for bad reasons.

This is why I don't like the phrase "operator error". All too often, it's used to excuse systemic failings and avoid investigating the deeper causes of safety incidents.


> All too often, it's used to excuse systemic failings and avoid investigating the deeper causes of safety incidents.

Completely agree. But I don't think avoiding the term "operator error" (or human error, pilot error, medical error, whatever) is the solution to that way of thinking.

Instead, I think we need to keep repeating that humans, no matter how experienced, careful, knowledgeable or well-meaning, are going to make mistakes. The job of system designers is to find ways to make systems robust in the face of those mistakes, and to make it easier for humans to do the right thing.

Checklists are one powerful tool there, for example. Another one is making systems that behave like people's mental models predict. I wrote about that idea here: http://brooker.co.za/blog/2019/08/12/kind-wicked.html


Someone recommended Human Error by James Reason on HN: https://www.amazon.com/Human-Error-James-Reason/dp/052131419...

I'm not sure how comprehensive / modern it is (not my subfield), but I enjoyed it. And it provided at least one framework to think about error.

Specifically, that most errors can and should be categorized by the states that had to exist for them to happen. The unique characteristics of each state (of which there are many) suggest very different approaches to resolving or eliminating them. To bring it back to the example in question here, remediating the procedure to eliminate a lack of knowledge of failure cases or risk would not have prevented either of these accidents (both men were well informed). However, technical solutions to physically prevent unacceptably risky "bypass" procedures would have.


As you say, the fascinating part about these accidents is that the risk was informed.

There's a tendency to simply say "If we had more procedures, and they were followed, then this wouldn't have happened." But that seems like a dodge. More often than not (in my experience), more procedure is simply seen as overly burdensome and is actively subverted by users (in the service of laziness, performance, compensation, or deadlines). As you say, willfully making bad decisions for bad reasons.

Sometimes you have the benefit of almost absolute control over people, e.g. SUBSAFE [1] (or Navy or Air Force maintenance QA programs). But most organizations don't have that kind of training budget.

Consequently, the most effective procedure is the safest one that people will actually follow. So strike that balance.

[1] If you can get a hold of high-level documentation, an excellent example of error mitigation in practice -- https://en.wikipedia.org/wiki/SUBSAFE


Slotin was a cowboy (in the derogatory sense of the term) who thought bravado would keep him safe from intense radiation. The demon core accident wasn't an isolated incident; one of his less famous exploits involved swimming in an active nuclear reactor, which obviously irradiated him badly.

> "In the winter of 1945–1946, Slotin shocked some of his colleagues with a bold action. He repaired an instrument six feet under water inside the Clinton Pile while it was operating, rather than wait an extra day for the reactor to be shut down. He did not wear his dosimetry badge, but his dose was estimated to be at least 100 roentgen.[12] A dose of 1 Gy (~100 roentgen) can cause nausea and vomiting in 10% of cases, but is generally survivable.[13]"


That seems unfair. As a trained nuclear physicist and engineer, he was no doubt quite clear on the risks related to radiation.

Just because I choose to jump a motorcycle through a flaming hoop doesn't indicate that I think my body is impervious to flame.

It means I weighed the risks and made a choice.

In other words, ignorance, chance, and bravery are different things. And accidents are some mix of all of them.


Extreme risk-taking is characteristic of young men. It's not so much a rational decision as it is a mind clouded by testosterone. Sometimes it's entertaining good fun; I enjoyed Jackass as much as anybody. The problem is when such men do it without consideration for the wellbeing of those around them. Slotin didn't show that consideration: he regularly performed his "tickling" experiment with other people in the room.


Stories like these will forever remind me of Hisashi Ouchi, who died horribly after a different kind of criticality accident in Japan in 1999 [0]. He had received 17 sieverts, and they tried to keep him alive for as long as possible in what amounted to a human experiment. I won't post links to articles about this, as you can't unsee pictures like that; putting his name in a search engine will get you there.

[0] https://www.japantimes.co.jp/news/1999/12/22/national/jco-wo...


Also discussed a couple of months ago: https://news.ycombinator.com/item?id=20205876


The incident that caused Slotin's death was discussed a bit in a recent documentary: The Half-Life of Genius Physicist Raemer Schreiber (2017) https://www.imdb.com/title/tt4870510/

It's available on Amazon Prime.


This sounds a lot like the movie Fat Man and Little Boy. The John Cusack character was killed by radiation exposure from a mishap at Los Alamos. The movie is really interesting.

https://www.imdb.com/title/tt0097336/?ref_=ttfc_fc_tt


There is a replica of the Demon Core in the Bradbury Science Museum in Los Alamos.

It's a sphere of metal, a bit larger than a basketball, just the right size and shape to be manipulated with hand tools.

I wonder if that contributed to normalization of deviance. It doesn't look weird enough to kill you.


> "Four of the fatalities were just bad luck, involving a group of janitors who shared muscatel wine that was laced with antifreeze"

I'd like to hear a little more about the "bad luck" that caused a bottle of muscatel to get laced with antifreeze...


Small amounts of antifreeze can be used to make bad wine taste better, and kill you faster.

https://en.wikipedia.org/wiki/1985_diethylene_glycol_wine_sc...


More recently, Fireball whiskey was recalled in a few countries for high levels of propylene glycol. https://www.huffpost.com/entry/fireball-whiskey-recall_n_606... Note that it had the same levels in the USA but was not recalled, because our standards are laxer. Still, it's unlikely to kill you.


Propylene glycol is a food additive. It is also used in "safe" antifreeze. It is not the same as ethylene glycol, which is the common automotive antifreeze and very toxic.


It's also one of the primary ingredients in most e-cigarette vape fluids, and it's used in fog machines for the same reason.


This video "A Brief History of: The Demon Core" on the Plainly Difficult channel showed up in my YouTube recommendations just the other day:

https://youtu.be/VE8FnsnWz48


Love Alex's archival research work. I'd never seen a couple of those pictures or heard that particular account of the airglow from Schreiber before. I'm glad to see that painting Slotin as some kind of 'Canadian hero', as was fashionable to do 15-20 years ago, is finally falling out of favor. He was a fool who regularly courted fate, doing things like diving to the bottom of the cooling pool of an operating reactor to repair an instrument rather than wait another day, giving himself a nice 100 R dose.



