Semmelweis is a favorite "management science" topic; there's even a pop-psych phenomenon called the "Semmelweis Reflex"; the Wikipedia article on it recapitulates much of what Aaron wrote here. The Gladwell formula of using Semmelweis' personal narrative to articulate a frailty of human reasoning was employed to great effect in Ayres' _Super Crunchers_; the Semmelweis section is, for instance, noted prominently in the NYT book review.
Aaron has oversimplified the Semmelweis story in some material ways:
* Semmelweis didn't institute "handwashing" in Vienna hospitals. Contrary to the conventional wisdom, which suggests that doctors in the 1840s were sticking horse-manure-covered hands into the exposed wounds of patients, handwashing was apparently already a norm. What Semmelweis did differently was to use lime to wash hands.
* Semmelweis' actual theory of the cause of childbed fever was wrong, and it was wrong in ways that made his recommendations hard to take seriously. Semmelweis' contention was that "cadaveric particles" were making their way into patients, and that those particles could only be removed by lime. But doctors observed cases in which no contact with either cadavers or infected or symptomatic patients led to the same cluster of illnesses. It was thus difficult for Semmelweis to make a "scientific" case for why the lime worked; it obviously didn't help that he was wrong about why it did (his work predates the germ theory of disease, which would have taught him that rather than lime being effective at removing specific particles, it was instead effective at killing bacteria).
* Aaron's story (and Ayres') has a heroic Semmelweis pleading for doctors to simply wash their hands in a specific way to save lives. But that's not necessarily what Semmelweis was arguing. Instead, the case he could have been making, loudly, was for an actual, specific, incorrect cause of childbed fever.
* Semmelweis himself was, apparently long before he lost his post, a notorious asshole. It did not help his cause that instead of carefully reasoning about the actual evidence, he seized on a single explanatory theory of childbed fever and then demanded (often by barging into hospital wards and berating the staff) that his peers adhere to it.
The point is not that Semmelweis didn't make an important discovery, or that we shouldn't be mindful of warped-sounding new knowledge that contradicts our existing theories. Of course we should be objective when considering facts that threaten our existing theories. But there's a reason John Snow and Joseph Lister [and Pasteur] are better known in the development of the germ theory of disease, and there's more to learn from the Semmelweis story than how the audience to a new theory should behave.
I actually first learned about Semmelweis as an ethical case study. IIRC, Semmelweis did a formal experiment with a control group: only some of the doctors in the First Clinic washed their hands with chlorinated lime; the others deliberately went directly from handling cadavers to delivering babies. The question was whether killing a few innocent women and children in the course of the experiment was justified by all those that would be saved if he proved that washing with lime reduced the death rate.
Maybe a better illustration would be Peter Pronovost's efforts to get hospitals to use checklists. His studies show that using a checklist for routine procedures like inserting a catheter dramatically reduces infection rates, saving lives and money. It's such a huge win that you'd think checklists would take the medical world by storm, but that hasn't happened. Convincing doctors to use a checklist requires overcoming their self-image of competence; it's too easy for a doctor to think, "I know I have to wash my hands before inserting a catheter, I don't need some nurse with a checklist to remind me."
Of course, Pronovost's story is less dramatic than Semmelweis's. Pronovost has been more careful about drawing conclusions from his observations, done a better job of presenting his ideas, and been more successful—hospitals are adopting his methods, if slowly and sometimes grudgingly. He's not likely to die alone in an asylum. To my mind, though, that makes a better illustration of Aaron's thesis.
>It was thus difficult for Semmelweis to make a "scientific" case for why the lime worked
Aren't you supposed to regard empiricism above all else in science? While holding to that rule would force Semmelweis to alter his theory as well, the observation of the highly reduced mortality rates should have forced the medical powers that be to reconsider things also - the burden is on them to make a scientific case for why lime had nothing to do with the decreased mortality rates.
I agree there were probably other reasons they fired him, but I think that only helps Aaron's points.
It is not evident that Semmelweis himself regarded empiricism above all else. Rather, it seems like Semmelweis made an empirical observation (death rates plummeted when attendants washed their hands in lime), but then jumped to a conclusion (lime was removing cadaveric particles) and fixated on that conclusion instead of the observation. Convert the narrative (lossily) to modern science, and imagine someone far more focused on their journal article than on saving lives.
Semmelweis went "on tilt" with his hypothesis. Even after doctors adopted a regime of disinfecting hand washes, hospitals still saw a significant rate of childbed fever. Semmelweis demanded of the scientific establishment that they recognize cadaverine tissue as the cause, going so far as to suggest that tissues in the mother were occasionally being crushed during childbirth, and later becoming gangrene, and thus mothers were infecting themselves.
Again: my point isn't that Semmelweis didn't make an important discovery, or that the scientific establishment of the time didn't miss a critically important opportunity; my point is that there is more to the story than the missed opportunity of Semmelweis' detractors.
I don't much care about the injustice of Semmelweis losing his post at a hospital in the 1840s. I am, on the other hand, fascinated by how poor framing and communication, close-mindedness, and overall bloody-mindedness prevented Semmelweis himself from becoming the godfather of the germ theory of medicine.
This added context really drives the OP's point home.
After all, it's relatively easy to be objective when the facts are overwhelming and communicated by a sensible and rational messenger. Under these circumstances, only the most pig-headed caricature would cling to preconceived, faulty notions.
However, it's much harder to take seriously criticism that originates from a volatile and unpopular source. Doubly so if their own theory or notion is flawed. Under this circumstance, it's extremely easy to avoid self-observation.
For instance, in this case it should not have mattered too much that Semmelweis's theory was demonstrably incorrect. What should have mattered was the clear causality linking disinfectant hand washing and a reduced rate of maternal mortality. They had an easy out, and they took it. That reaction is all too common.
The doctors of Semmelweis' day definitely missed an opportunity. (Note that it was contended that chlorinated hand washes were common in England at the same time.)
That he named the things killed by lime "cadaveric particles" and not "bacteria", at a time when nobody knew that bacteria existed, can't be considered wrong.
How he named them was irrelevant. His explanation was good enough -- there WAS something on the doctors' hands, too small to be visible, that lime was able to destroy. Why was it hard to take seriously then? Certainly not because others knew better -- nobody had "bacteria" in their language. He had to call whatever was being neutralized something.
You're rationalizing what Semmelweis said with a modern understanding of medicine. Of course, today, it's obvious that thoroughly clean hands help eradicate pathogens, and so it seems obvious that to note in 1840 that handwashing in chlorinated lime lowers death rates is to come immediately to the crux of the problem.
In fact, doctors in the 1840s were well aware of the concept of contaminants. They had already assumed a regime of handwashing. Moreover, Semmelweis himself was not content to lobby attendants to wash their hands. He was instead fixated on the idea of cadaveric particles, going so far as to invent new vectors for their creation in cases where no contact with dead bodies could have occurred. Semmelweis wasn't even correct about the mechanism of action in cleaning hands; he believed the chlorinated lime more thoroughly removed particles, when in fact the key was to kill the pathogens.
A simple way to sum the problem up: Semmelweis advocated handwashing... for staff who had been conducting autopsies. Semmelweis was on to something, but he himself seems to have missed it by a wide margin. Things could have been different if Semmelweis himself had stuck with the evidence, rather than seizing the first bit of it that confirmed his theory and running away with it.
I hope you recognize that when you say "A simple way to sum the problem up: Semmelweis advocated handwashing... for staff who had been conducting autopsies" you also confirm that there actually were doctors who did autopsies and didn't disinfect their hands. So he was obviously right. You can claim that he set his goals too narrowly, but even that much was not accepted by others.
Not at all. Semmelweis thought something associated with death was what was killing people (e.g., he chose chlorinated lime, which he found best removed the stink of death).
I suspect the focus on death and dead people meant people focused on that, and "proved" to themselves he was "wrong".
I can imagine scenarios where a doctor attended one woman who had a good birth experience, skipped the Semmelweis handwashing method before the next delivery (because the first woman didn't have the death particles), and then the next woman got infected and died. To many (unfortunately), that would prove his theory was wrong.
Citation needed, since you claim that doctors actually used the results of experiments to disprove him but at the same time didn't heed the results of the experiments showing that using lime was obviously beneficial.
No I didn't. I was very careful to make clear this was my opinion only ("I suspect" and "I can imagine scenarios").
As I noted, lime wasn't obviously beneficial, because Semmelweis claimed it removed the "cadaveric particles", and yet people were still dying when there should have been no "cadaveric particles" around (ie, no one had died).
It's like the story of how scurvy started happening again in the early 20th century with Scott in the Antarctic[1]. In the 18th century scurvy had been defeated by drinking (fresh) lime juice on long sea voyages, without a correct understanding of the mechanisms involved.
I think you've mistaken the point of my comment; you appear to think I'm sticking up for 1840s surgical hygiene. If you read my comment all the way through, you'll see that that's not at all my point.
The error was this: instead of making a thesis to prove there was a correlation, he made a thesis trying to explain (poorly) why it happened.
His peers, instead of investigating further, chose to just disregard it completely, and keep killing women.
This history is a perfect example of how being correct doesn't mean you're right. It's the difference between being logical and being wise. Sadly, I see this reaction all too often.
All knowledge in the world is worth nothing if people can't reason, use intuition, be empathic and take the ego out of the equation. Western culture prides itself on scientific feats and economic progress, but it still has much to evolve in developing human beings. That was in the 1800s, but it can easily happen today - just look at global warming theory.
It's not about drawing conclusions from intuition - you need to validate your hypothesis - but about following your intuition.
See Einstein: his findings started as hunches and creative exercises. Nobody thought about light being both a particle and a wave, or about combining space and time, because they were limited to preconceived notions ("it's either a particle or a wave", "time as conceived by Newton").
It's often necessary to start from intuition to get to new places.
That's a good lesson too, but the problem wasn't just that Semmelweis was an asshole. It is admittedly hard to come up with a coherent lesson from the narrative Swartz provided here, except to confirm truisms ("don't be an asshole", "be open minded", "listen to the evidence, especially when it contradicts you") that we obviously already know.
I'm sure that Semmelweis's bumbling associates would have agreed with you. But the fact remains that empirically he was saving lives; there was more truth in Semmelweis's claims than in theirs, and it was a matter of life and death.
While overall, I like this article, there is real danger in advice like this:
>Look up, not down. [...] to do that you need to look at the people who are even better than you.
While this is great for those who exaggerate their skills, I doubt those would even read this piece. For people with self-esteem problems and a tendency towards depression, however, this is about the worst advice you can give, because they tend to look at the top 1% already and therefore experience their life as a complete failure.
>But people will feel more comfortable telling you the truth if you start by criticizing yourself, showing them that it’s OK.
In theory this is fine, but if you start seriously criticizing yourself in front of others, you anchor this critique in the minds of your listeners.
Cordelia Fine, in "A Mind of Its Own," makes a similar point:
Fine says, "that your unconscious is smarter than you, faster than you, and more powerful than you. It may even control you. You will never know all of its secrets." So what to do? Begin with self-awareness, Fine says, then manage the distortions as best one can. We owe it to ourselves "to lessen the harmful effects of the brain’s various shams," she adds, while admitting that applying this lesson to others is easier than to oneself. Ironically, one category of persons shows that it is possible to view life through a clearer lens. "Their self-perceptions are more balanced, they assign responsibility for success and failure more even-handedly, and their predictions for the future are more realistic. These people are living testimony to the dangers of self-knowledge," Fine asserts. "They are the clinically depressed."
Is this merely a correlation, or are you (or Fine) attempting to posit causation here? That is to say, is this merely pointing out that viewing "life through a clearer lens" is seen in connection with higher frequency of clinical depression, or is the inference here that this clearer lens causes clinical depression?
I find it a hard case to make the latter statement, and the former statement doesn't imply (to me) that one ought to eschew the clearer lens. I wonder if there is some analysis that digs into what mental processes, emotions, thoughts, etc. occur in the wake of viewing the world through the clearer lens that might lead to clinical depression?
I suspect a meaningful part of this is not having a balanced view of, or method for responding to, one's failures, which would lead toward depressive thought patterns. I don't think self-esteem issues are caused by viewing oneself more objectively and rationally, but instead by placing unwarranted weight on the (potentially errant) conclusions one draws from a more objective view of oneself. That is, if one has a more balanced and nuanced objective view of the world, but has an equally unbalanced and un-nuanced view of oneself--say, that one's abilities or potential for improvement are strictly circumscribed and not easily changed--I would hedge my bets toward that person having a greater likelihood for depression than someone who does not draw such conclusions.
I think I tend toward viewing myself and others fairly objectively, and actively work to do so. I also compare myself to others I subjectively and objectively find better in some ways. But then I respond by improving the parts of me that I have found to be weaker than those who are a level above my own. I don't experience depressive thoughts (that I am aware of). (shit, that's just anecdotal and can be dismissed. nevermind.)
[edit: typo and calling out my own anecdotal evidence that is unhelpful]
The theory that depressed people have more realistic perceptions of the world is interesting, and apparently popular enough now for authors to wax poetic about it, but the only actual evidence I've heard of can be just as easily explained by the experimental design as by the theory.
> For people with self-esteem problems and a tendency towards depression, however, this is about the worst advice you can give, because they tend to look at the top 1% already and therefore experience their life as a complete failure.
And yet that piece of advice is situated quite firmly in the midst of other practices that are intended to help balance drawing such erroneous conclusions--reverse your projections, take the outside view, etc.
These bits of advice really work in tandem with one another. They're not a "if you could do just one thing, pick something from this option list" set of prescriptions. Reversing your projections includes reversing your projections about yourself, as well. Especially when they are non-objective projections. How do you find these non-objective projections? By taking the outside view of yourself. How can you possibly get toward a better objective view of yourself from the outside? By criticizing yourself for your non-objective conclusions, finding honest friends, etc.
> In theory this is fine, but if you start seriously criticizing yourself in front of others, you anchor this critique in the minds of your listeners.
This strikes me as a relatively dubious assertion. Is there some citation showing this happens frequently enough to be wary of it? I've never experienced such anchoring (consciously) toward others who are self-criticizing (in an objective, non-deprecating way).
I would really like to know what his objective analysis of the situation is, unless he's already done a post that I've missed or he can't talk about it because of a court case.
I'm fairly sure that the federal case against Aaron is still being litigated (http://en.wikipedia.org/wiki/Aaron_Swartz#JSTOR), which means that he almost certainly can't discuss details publicly.
Personally, I enjoy his writing and feel like it's none of my business how the JSTOR case is going.
It does occur to me, though, that a post of his about legitimate self-critique might actually be somewhat related to regrettable past deeds such as you mention.
I've always told people that many of my best conversations have started out with "Chuck, I think you are wrong and here's why ..." This is especially useful when you are "important" (like someone's boss), because getting honest feedback when you are the boss can be nearly impossible at times.
Steve Bourne (yes, that Bourne) told me once that you should try to cultivate people who can give you a different view on the world. That is helpful stuff, but you do have to also get people to be honest. That is hard to do if they think you're 'hot headed' or likely to shoot the messenger.
>saying “You were right, I was wrong.” It didn’t destroy her reputation; it rescued it. [...] Wayne Hale took full responsibility: “The bottom line is that I failed to understand what I was being told…I am guilty of allowing Columbia to crash.” He was promoted. When JFK admitted the responsibility for the Bay of Pigs fiasco was “mine, and mine alone,” his poll numbers soared
I would guess that the survivorship bias is at play here. How many people we will never hear about took responsibility and were demoted, fired or prosecuted?
Not only that, but Aaron seems to be skating perilously close to the position that admitting that you screwed up is, in itself, sufficient reason for you to be promoted. It's not.
I think the biggest challenge of this is finding friends that will tell you when you are wrong. And no wonder, it's really hard.
A friend of mine sent me a short story that they wrote not too long ago, and it was not very good. But I could not bring myself to say that; instead I made a few minor criticisms and did not comment on the writing as a whole.
The tricky aspect here is that most people believe that taste, and the 'quality' of artistic work, is mostly a subjective matter, but when they criticise art they treat their views as objective truth. Thus when you criticise a friend's work, or vice versa, the temptation is to think of it as "(Your|their) work sucks!" instead of "Oh well, (I|you) didn't like it, maybe someone else will."
Semmelweis, on the other hand, wasn't dealing with a subjective matter. He had the statistics showing that as soon as doctors started washing their hands, maternal mortality went way down. It was just the doctors' pride which made them think he was accusing them of regularly killing their patients.
Of course there is bad art which everyone agrees is terrible, and telling people that truth is hard. In the end, though, it's better to be honest because it's probable that your friend would respond with a determination to do better next time. The trick is to be tactful.
I'm not overly familiar with this story, but it is easy to recast as a fight between the empiricist and the theoretician. To the empiricist, a man shows up with statistics showing a correlation, maybe even causality, and that's all you need. To the theoretician, if the man with the statistics says it's due to little green men that live in his head, the numbers aren't enough.
In hindsight, we can see that he was right and it doesn't matter why. But there are also plenty of other stories where acting too soon, before having a real, theoretical understanding of what's going on, would have been a lot more detrimental in the long run.
Could not agree more. I have a design-based startup where we do digital home staging for real estate clients, and this is our most difficult problem: getting past the subjectivity of interior design.
It's funny to see the exact same furniture used for various different clients and see the wide range of feedback we get, from "I love it" to "that looks terrible", and the negative feedback is almost always expressed as definitive comments like "that does not work" or "this NEEDS to be changed". Someone recently even went so far as to say that a mug should be next to a coffee machine instead of a cup.
Even though we state that we are appealing to the majority of home buyers and not individual preferences I have yet to find a good way to get past design subjectivity. I think it is just human nature to inject our preferences into artistic things and express them as if it is the other person's work that sucks.
>It's easy to criticise a minor detail of a friend's work, but harder to say "you don't have the talent for this, give up."
I would question whether most people honestly have the expertise to make a judgement like 'you don't have the talent for this, give up'. I think people routinely overestimate their own abilities and discernment. Taking a hasty/faulty analysis and then telling someone to 'give up' seems like a terrible idea.
Curious, how do you know your friend is wrong? Was your friend factually wrong (a Holocaust denier) or did you just not like their writing?
With that said, I've critiqued writing I've not liked, but I rarely say "I didn't like this"; rather, I point out everything I think needs improvement, with at least one specific example to back up my point. Giving vapid criticism is easy, and producing works is typically much harder, so I attempt to put some effort into my feedback.
And always be aware that there's a good chance that what you might view as an improvement may not be. I'm no more offended that one doesn't incorporate a given piece of my feedback than I'd expect them to be in getting the feedback.
The best feedback I've gotten is typically tough but fair. Some characteristic elements I appreciate help with:
- Benchmarking (to a peer group)
- Clarity (what does/does not come across)
- Technical (pro-tip: this is how to fix)
Of course, some of the onus is on the submitter on how s/he selects the reviewer. This will be a function of how serious they are, and what their purpose is.
An expert reviewer will typically benchmark you to an attainable but more advanced peer group. A lay reviewer is usually useful for feedback on clarity, direction, or purpose. If something is truly lacking, a reviewer may ask about the intended goals of the project. This typically segues to a discussion on a sub-section of the work that has some merit (hopefully). This is still useful, and the reviewers avoid painting the whole thing with a broad brush.
Edit: Just wanted to elaborate, re: Benchmarking.
For creative work, typically, an exchange is "This part is strong. This part needs work." And then you are referred to somebody else's work: "Take a close look at what these guys are doing [list XYZ]". It then falls on the reviewee to follow up. And this is where the real feedback takes place, where you see just what is expected of you and get a sense of how hard or easy the next step is. But the act of direction, support, and (hopefully) inspiration is what leads to success.
* Critique the story, not the writer.
* Remember that it's your *opinion*, and phrase it as such.
So, for example, if your friend's story had terrible dialog, you might say, "I felt like the dialog was wooden. It seemed to me like Joe Maincharacter used a lot of clichés -- maybe you did this intentionally for effect, but I found it tedious after a while."
This, of course, assumes that your friend specifically asked for a critique when sending you the story.
I wish I knew. I have trouble with this even with my closest friends. I have a difficult time saying "you know, this girl probably isn't right for you", and then after they break up they say they wish I'd said something.
The only thing I can think of is to make a concerted effort to be honest. Not too honest. I have noticed that if you're too honest it comes off as rude and abrasive even if it's true. Hopefully that will encourage my friends to be honest with me which is the kind of honesty I value the most from all the people I know in life.
While this, on its face, looks like good, morally sound advice, I worry about the practicality of it for a certain class of individuals.
The people I see in leadership positions are far more likely to be narcissistic sociopaths than reflective mediators.
If you're not in a position of power / individual freedom, this is excellent advice to make sure that you can, and are seen to, play nice with others. It will probably help your blood pressure and promotion prospects. I'm just not convinced that this strategy is one that leads to the top more often than brutal myopia and conceit.
I don't think he meant "top" meaning (necessarily) leaders of a hierarchy. I think it meant more like look to people who are performing "better" than you -- for whatever thing you want to measure.
Apparently the idea that diseases can spread through contagion is a prime example of something that was previously considered to be magic that later came to be accepted by science.
Sympathetic magic was one of the first forays into actual science. It sought explanatory power for phenomena. That they did this ass-backwards and propagated through superstition and rumor is why it's not actually science, but when you start talking about controllable and manipulable mechanisms, you've taken the first important step.
"Magical thinking" doesn't refer to this. It refers to an utter ignorance of the mechanism, leaping from observation to conclusion without isolating any variables experimentally.
>"people will feel more comfortable telling you the truth if you start by criticizing yourself, showing them that it’s OK."
I disagree with this assertion. Some people criticize themselves in hopes that their listeners will contradict them. For these self-criticizers, affirming their critical statements greatly upsets them. I have not found much correlation in my personal life between people who criticize themselves and their ability to listen to criticism. There are other attributes for which I have found correlations, such as being easygoing and thoughtful. I still often try to gauge a person's willingness to listen to criticism by starting with very small criticisms and working my way up.
>This is what we’re taught: make five compliments for every criticism, sandwich negative feedback with positive feedback on each side, the most important thing is to keep up someone’s self-esteem.
>But, as Semmelweis showed, this is a dangerous habit. Sure, it’s awful to hear you’re killing people—but it’s way worse to keep on killing people!
Semmelweis did exactly what you're advocating, and he was marginalized for it. Maybe he would have fared better if he had been more considerate of their feelings?
That was what I hated about my previous job. Damn management was so protective of making everyone feel good about themselves that there was no negative feedback and no way you could improve. I think they wanted us to work for them for a low wage and never get a better job. But as a matter of fact we were bad at that job; I felt it, and I felt it again when I went to interviews. I've got a new job now and we'll see how it turns out.
The thing about new ideas being rejected (better/different hand washing) reminds me of how everyone thinks that a web page or web application UI must be hand coded.
I think that is very stupid and eventually we will use graphical tools to create the UIs for web pages and web applications and look back at the days when every single web page had to be hand coded in HTML and CSS and laugh.
I'm always up for an Ignaz Semmelweis [1] reference. I read his story for the first time when my woman was taking a Microbiology course. Incredible stuff.
edit: Specifically I believe I recall this (from the linked Wikipedia page) in Thinking Fast and Slow:
One of the first and most classic examples of effort justification is Aronson and Mills's study.[2] A group of young women who volunteered to join a discussion group on the topic "Psychology of Sex" were asked to do a small reading test to make sure they were not too embarrassed to talk about sexual-related topics with others. The mild-embarrassment condition subjects were asked to read aloud a list of sex-related words such as "prostitute" or "virgin". The severe-embarrassment condition subjects were asked to read aloud a list of highly sexual words (e.g. "fuck", "cock") and to read two vivid descriptions of sexual activity taken from contemporary novels. All subjects then listened to a recording of a discussion about "Sexual Behavior in Animals" which was dull and unappealing. When asked to rate the group and its members, control and mild-embarrassment groups did not differ, but the severe-embarrassment group's ratings were significantly higher. This group, whose initiation process was more difficult (embarrassment = effort), had to increase their subjective value of the discussion group to resolve the dissonance.
The cited study is: Aronson, E., & Mills, J. (1959). The effect of severity of initiation on liking for a group. Journal of Abnormal and Social Psychology, 59, 177–181.
There are so many studies like this it is hard to say which one specifically he may be referring to. There is a mountain of literature on the topic; as usual, Wikipedia is a good place to start: http://en.wikipedia.org/wiki/Cognitive_dissonance
It frustrates me a little when people conflate cognitive dissonance with effects caused by dissonance reduction, like in this article. Cognitive dissonance is simply the uncomfortable experience of believing two conflicting things. Dissonance reduction is the class of biases that we use to reduce that discomfort.
Semi OT, but I signed up for email alerts at the bottom of the previous article in this series, and haven't received an email since. Am I alone with this?
These posts have been reminding me of Malcolm Gladwell, in the unfortunate way: a collection of interesting anecdotes that together would make an interesting story is instead blown up into a broader statement that doesn't seem to follow.
"It has been contended that Semmelweis could have had an even greater impact if he had managed to communicate his findings more effectively and avoid antagonising the medical establishment, even given the opposition from entrenched viewpoints."
> It has been contended that Semmelweis could have had an even greater impact if he had managed to communicate his findings more effectively
This is the point. Anyone who is properly introspective is already following the good advice given here. But how about the total ramrods one has to deal with on occasion, those people who are crazy and deluded, whose self-confidence totally exceeds their grasp of the subject matter? Blessed is the man who can say to his line manager "I won't deal with N.N. any longer, he is incompetent and unaware of it." But sometimes the constraints of work do not permit not dealing with some people, or the boss is the crazy one. What to do in such situations?
Let's see, the previous post is based on published psychological research, and this post points out a failing of humans that is, dare I say it, obvious to anyone that has critically observed any person, ever.
He isn't exactly making controversial points here, but rather trying to raise the awareness of such ideas.
He also isn't trying to sell you a pop-intellectual book.
Agreed, I enjoyed "Believe You Can Change" when it was posted on HN and was happy to find out this article was next in line in his "Raw Nerve" series. Thanks and can't wait for the next one