This quote from the article sums up why the book resonated with me:
> Kuhn is a realist, in that he believes in some external, material reality beyond our language and cultural constraints, but he is simultaneously a relativist in that he has no access to nor can say anything definitive about that outside world independent of language and the conceptual categories that lead us to think this or that about external reality.
The gist of his thesis, if I recall correctly, is that scientists are indeed progressing towards a deeper understanding of the universe, but the process gets messy sometimes. When we hit an inflection point, and the current paradigm for a field (such as physics) breaks down, multiple theories emerge to restore the internal logic of the field, and the proponents of each camp compete with each other for dominance. It gets messy when some scientists, who have dedicated their lives to the old paradigm, are confronted with the idea that their paradigm is outdated, and therefore their life's work is kinda outdated. Obviously you can look at it another way and say that we're all in it together and their work led us to discover the limits of the paradigm, but in practice this is a harsh thing to live through, and the old-paradigm scientists get reactionary and sometimes actively obstruct the paradigm shift. But the truth eventually comes out when the new paradigm's theories correctly explain the problems that the old paradigm couldn't, and these new theories get backed by experimental evidence. I think Kuhn goes so far as to say that sometimes the old paradigm can only be laid to rest when its last proponents die.
Interpreting Kuhn is an art unto itself, but I rather find that he is saying progress can't be measured and therefore can't be inferred. He wouldn't be much of a relativist otherwise! I'm not sure, but I think he uses the word "incommensurability", which implies an absence of indicators of progress (if not him, then certainly some interpretations of his book use that very word).
This is not to say that I agree, but it's the cold logic of it that makes him hard to refute. Instead, pragmatists like me have resolved to ignore him and enjoy our iPhones and lasers.
His critique is more of the way the history of science is presented (as an inevitable succession of triumphs on the way to a single objective truth, each building on the last), than a rejection of objective progress or the scientific method.
And of course Kuhn isn't opposed to the idea of progress, he's simply raising the question "what really is progress and how can we know". This was necessary at the time, but the trivialization of "advanced" technology has made his point outdated in my opinion (if still perfectly valid in a logical sense). It's almost unthinkable (to me) that Kuhn would've written his famous book in the current era.
There is this contingent of people who instead offer an interpretation of the book that it's somehow about proving that scientific progress doesn't happen (different from: we can't technically prove that it is, or measure it precisely).
My guess is that those are people with an axe to grind against the idea of reliable scientific progress to begin with, who perceive some kind of technical (philosophical) justification for their view in something Kuhn said in SoSR—despite the fact that the main content of the book isn't even any sort of abstract philosophy, but practical comments on the social and psychological conditions of working scientists, backed by historical examples.
From memory it's pretty straightforward, short, and not a difficult read. The themes dealt with are timeless IMO, and just as relevant today. I'd encourage anyone who hasn't read it to spend a few hours doing so.
Kuhn is never bad.
The data is closest to reality, but without interpretation, we can't take useful action from it.
It's not even that one is right and the other is wrong.
Both can have right and wrong points we can adopt (even if either of them would want us to take all of their points wholesale).
1. Kuhn could have authored a critique (of any actual substance) of the way scientists regard the field as progressing, compared to the actual historical record on uptake of new theories & contests between competing theories.
2. Kuhn's criticisms become irrelevant or inapplicable because our society has created iphones and lasers.
If (2) is true, Kuhn surely wasn't saying anything of any gravity and (1) is false. Conversely, if (1) is true, (2) is hard to swallow.
I think many comments are missing the point, though: people care about Kuhn because of his epistemological innovation and the implications of his work for the very idea of science, not because of its reception by science historians, however great that may have been.
An iPhone and an original IBM PC are essentially the same class of device. The iPhone is miniaturised, refined, and improved, but the principles of operation are recognisably similar. They both have similar technological roots.
You can have steady technological progress without a revolution in the fundamentals - which was what Kuhn was interested in.
Fundamental research tends to become math before it becomes technology. Maxwell and Heaviside are more fundamental than the transistor amplifier, the dynamic memory cell, or the valve radio, and you don't get to have any of the latter without the former.
After an explosion of change in the late 19th and early 20th century, the pace of that kind of fundamental research has slowed right down. Existing models and techniques are being refined to create smaller and faster devices, but there have been no revolutions that could lead to new kinds of devices.
There are some prospects for invention - like quantum computing - but there doesn't seem to be any immediate likelihood of new insights on the scale of the revolutions created by electromagnetic theory, thermodynamics, evolution, the periodic table, relativity, quantum theory, and the standard model.
Familiar principles like Ohm's law stop working on small scale devices. Fundamental research in nanoscience was required to learn how things behave on such small scales. Some of that research led to Nobel Prizes and new kinds of devices like MEMS.
But we do seem to have reached a limit in physics; not the end, but the limit of what we can learn with the resources available to us. The revolutions we can look forward to in the near future (e.g better machine learning, implantation of 3D printed organs, safe and effective gene therapy, quantum computers with more qubits) are mostly about new ways of building up rather than learning down. Opportunities for engineers, not scientists.
But then again, a scientific revolution might strike without warning.
You are making a mistake that Kuhn tried to correct in the postscript to the second edition of SoSR (1970), where his choice of terminology sometimes had two senses, causing critics to focus on one sense when he intended the other.
In any case, you seem to use science and technology interchangeably which of course is wrong.
Reading Kuhn forced me to review what we collectively regard as science, engineering and technology. It turns out these are all different things!
From my notes on SoSR + review of Wikipedia articles, I distilled the following summaries as to how they differ:
Science: systematic body of knowledge of the physical world (observations, know-how)
Science can broadly be:
- applied (urgent solutions), or
- theoretical (non-urgent solutions)
Engineering: applied sci. & tech (know-how & tools)
Technological progress is how new paradigms get expressed. No laser disc without QM, no GPS without relativity, etc., and in some sense that's exactly his point.
The iPhone isn't truth, it's just one way to express our understanding of the scientific paradigms.
See also James Burke's "Connections" series, as well as his (amusingly named) "The Day the Universe Changed":
Well, Kuhn was concerned with science and epistemology, so not progress in the general sense if I'm not mistaken.
Technological progress (from simpler to more complex technologies) could hardly be refuted, even from Kuhn.
But whether iPhones and lasers are progress in the value sense, that (with a logic similar to Kuhn's thoughts on science) depends on our value framework. They're not irrefutable progress even for all of us in today's cultural climate (some e.g. consider them regressions regarding experiencing the world directly, intimacy with others, privacy, etc, and think we'd be better off without them, even if we lose portable social media apps, global internet access from our pocket, etc).
Even less so for other cultural contexts (and we don't even have to go to the past; would the Amish, for example, consider an iPhone "progress"?)
Or the shorthand version: "Science advances one funeral at a time."
I thought there were, famously, several dozen slightly different (incommensurable?) definitions (or interpretations) of "paradigm" alone in the text.
Popper's standards exclude sociology, psychology, theology, economics, and most of the "soft" sciences. This annoyed many people who claim to be doing science. Kuhn was willing to consider them science, but had to redefine science to make that work.
Falsifiable theories allow reliable predictions. So they lead to engineering, and stuff that works. Although Popper's position is currently unpopular, he wasn't wrong.
Richard Feynman called "soft science" a pseudoscience, which I completely agree with.
Humanities departments around the country essentially appropriated the good name of science to add more legitimacy to their "soft science". It's similar to how "Christian Science" or Scientology appropriated the good name of science to legitimize themselves in the eyes of the public. Creationists have even created "creation science" to make themselves look more legitimate.
>Yes, Kuhn could be a bully, especially towards anyone who challenged him, such as Errol or our late colleague Harold Dorn. But for someone like myself, non-threatening and interested in science and the Enlightenment, Kuhn was kind and took his responsibilities seriously as a teacher
seems like an accurate description of the worst teachers I had in college, the ones I truly despised: those who couldn't care less about an enthusiastic student while enjoying the feeling of superiority that comes from teaching the weakest students.
This reads like a serious case of Stockholm syndrome or an abused spouse.
As a student you realize that instructors aren't infallible and that you're not only free but expected to accept or reject what you're told on its merits. And, frankly, this is much easier to do when an instructor isn't equivocal--when they state plainly and with conviction their beliefs. If everybody is mature enough, it's implicitly understood that people can be and are wrong. Unfortunately, not everybody is mature enough.
This is true of leadership more generally. The job of a leader is to put themselves on the line. By being direct and forceful there's never any doubt about where responsibility for an idea lies, and there's no backpedaling if they didn't equivocate. It's harder to claim that an idea wasn't given a proper defense when the defender literally puts it all on the line, leaving no reason to doubt they explored every reasonable criticism.
It's the same thing with writing. As an undergraduate I hated writing papers because the thought of committing to paper an idea I was unsure of seemed wrong. But that's not the point of writing; the point is to commit to paper the best defense you can think of at that very moment, and to do so without equivocation. The criticisms can and will come, as they should. If you equivocate, however, you're softening the blows of subsequent criticisms, which is a disservice to everybody, but especially yourself. Making yourself vulnerable to criticism is a very difficult thing to do.
Especially once you get into graduate and professional studies, the professor's job is to present their ideas firmly. To do anything less is a disservice to students and the truth. This is easier for some people than others. People who are legitimately unsympathetic a-holes will, of course, have an easier time being firm. You can usually differentiate the non-a-holes, as they're more likely to invest time in thoughtful, personalized criticism (e.g. 1 1/2 pages of comments). But that takes tremendous effort and patience, and even the non-a-holes don't always have the time or ability to do that. The best gift they can give you is serving themselves up as a punching bag.
cf Pai Mei in Kill Bill or Sufi marabouts and their talibés in Senegal.
This was all tied into Morris's views on epistemology more generally. He directed The Thin Blue Line, which was the first crime documentary with dramatic reenactment. Reenactment can be argued to be inauthentic, but he argued that it is no more problematic than taking someone's verbal account, as was common in documentaries at the time. In some sense, by putting pictures to words, the director can ensure that the victim's account is misinterpreted as little as possible.
Wish I still had that essay. I'd love to know if I had magically discovered Kuhn's thesis independently at the age of 17 - or as is more likely - spouted some vagueness that resembled his ideas if you squinted enough...
There are few truly novel ideas, and yet too, few which truly emerge into public consciousness, and many of those are themselves wrong (or at least gravely flawed), or greatly misunderstood. Often because they've been misrepresented.
Keep on thinking and reading!
I like that.
(Though ... it's also got its pitfalls. And there's the whole notion of explicit vs. tacit knowledge and learning.)
Though the notion that bad theory, uninspected or unquestioned, can lead to bad practice.
Alchemists and astrologers were both exceptionally empirical. Vitruvius's De Architectura is a fascinating mix of pragmatic advice and really shitty causal explanations.
Also on HN:
So Kuhn seems to believe that the biologists are all wrong when they claim our brains have been formed by evolution to look at reality in certain universal ways. Instead he seems to be taking the Cartesian position that human minds exist outside the material realm. Most philosophers today would disagree.
In backend development microservices really is revolutionary.
React/redux is equally revolutionary.
Changed everything. And then it holds stable for a bit while new ways of doing things percolate but not yet really take over.
Of course SOA is like microservices, but it is very different from monoliths.
The biggest issue with scientific revolutions theory, like this discussion, is that you can break it down so that it doesn't seem like change, even when it is.
Again, it's a great step forward, but largely a great step forward for a realm that was quite far behind. I don't consider that to be revolutionary unless you're wearing blinders.