In situations where the Münchhausen trilemma applies, people tend to use a version of abductive reasoning: finding the simplest narrative or explanation that fits the known facts.
It's important to keep the Münchhausen trilemma and abductive reasoning in mind when discussing controversial issues where one side feels it has proof and the other side is ignorant or duped. Often the difference comes down to something like whether emergent behavior or a conspiracy feels like the simpler explanation.
> abductive reasoning, which is finding the simplest narrative or explanation that fits the known facts.
The problem there is figuring out what "simplest" means. To many people, "God" seems like the simplest explanation for the universe, even though what the word implies is very complex. It's the same phenomenon as people who say, "Scientists can't explain magnets, the liars!" Magnets seem like a magical extra, but it turns out that electromagnetism is fundamental. It's the rock sitting on a tabletop that's complex and turns out to need explanation.
We should teach abduction and Bayesian reasoning as part of a normal education. Epistemology overall is more of a collegiate/post-grad subject, but we have to provide some kind of foundation in it. Otherwise we end up with a lot of kids who have tied their self-worth to knowing "science", but for whom it's more of a slogan and catch-all phrase than something they can continue to build on. (Aside from the actual practice of whatever scientific field they end up in. Even then, we see epistemological problems rearing their heads in several fields -- with the leading practitioners clueless about what to do about it, usually re-inventing the wheel in some ad-hoc fashion.)
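To make "Bayesian reasoning" concrete, here's a toy update in Python. The hypotheses and all of the probabilities are invented purely for illustration:

```python
# Toy Bayesian update: repeatedly revise a belief as evidence comes in.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Start with a 1% prior that a conspiracy (rather than emergent behavior)
# explains some observed coordination.
belief = 0.01

# Suppose each piece of evidence is twice as likely under "conspiracy".
for _ in range(5):
    belief = posterior(belief, 0.2, 0.1)

print(round(belief, 3))
```

Even with evidence that consistently favors one hypothesis, five updates from a skeptical prior still leave the belief well under 50% -- the kind of quantitative intuition the comment above argues should be taught early.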
Really good point on controversial issues. I think opposing sides of current hot-topic issues (take your pick: gun control, abortion, pronouns) start with a different set of axioms. Axiomatic differences can lead to vastly different conclusions, each reached with sound logical reasoning.
E.g., if you take as axioms that it is always wrong to end a life and that life begins at conception, then "abortion is wrong" is a sound logical conclusion.
> E.g., if you take as axioms that it is always wrong to end a life and that life begins at conception, then "abortion is wrong" is a sound logical conclusion.
Do note that in these kinds of controversial topics, the axioms are often not held consistently by a given side (or the actual axioms are hidden and differ from the stated ones). For example, "it's always wrong to end a life [except when bombing abortion clinics | except when it's a death row inmate]". Then the ad hoc provisions start: "oh, I meant an innocent life" ("but how can a nonsentient being be called innocent in any meaningful way?"), etc., etc. The debate then becomes anything but grounded in well-defined axioms.
It's not even that they aren't held consistently. Honestly, nobody (outside of philosophy papers) lists out their axioms and derives beliefs. They have beliefs and "reverse engineer" the supporting axioms, and don't particularly care if the axioms required for one position conflict with the ones required for another.
Very true. In fact, I don't think people even start out with "beliefs". They start out with feelings. The emotional response is the equivalent of an axiom, and is abstract and formless. They must use their life experiences to form a reasoning to justify why they feel a certain way, and that is their "belief". This justification is rarely entirely accurate; it's just an approximation filtered through their own biases and reasoning capabilities.
When I have an argument, I find it most productive to first seek to understand how the other person feels. People's feelings are rarely "incorrect", and usually align with my own feelings at some fundamental level, although perhaps with different intensities. I then try to work with them to put our brains and experiences together to find a more solid understanding of why we feel that way.
What is a belief but an axiom? A prior that must be assumed, as it cannot be proven within the system.
I will definitely concede that the logic people use on a day-to-day basis is much fuzzier than an analytic philosopher might prefer, and that there's much greater allowance for contradiction (see the phenomenon of cognitive dissonance). But I do think that most people argue from their own first principles, consciously or not. There are rare cases where you can convince someone of something when you discover and correct a logical flaw they have made. More often than not, though, a 'logical argument' is a veiled battle of dogma; both participants are moderately rational, and share the same rules of logic, but their axioms are so alien to each other that what is plain as day to one is a violation of human decency to another. I'd reckon that in reality, it's rare that two people are actually even arguing about the same thing, even though they may be using some of the same words.
Agreed, I don't think people really debate emotionally charged issues from first principles. I think it's an error to think of this as a matter of logical axioms. That's not how people (usually) debate.
It's been out of favor since Derrida, but I think Structuralism is due for another pass in the limelight: an extreme subjectivism about reality, shaped by the stimuli you are exposed to. These external stimuli are what define your axioms, and from there, you use the same (fuzzy, bad, and prone to inducing contradictions) logic as everyone else. It provides a compelling model for why people have difficulty reconciling their opinions; they're not even talking about the same things, just using the same words, the same signs.
> "but how can a nonsentient being be called innocent in any meaningful way?"
This makes me imagine putting people through a trolley problem where you have a serial killer on one side and an 8 week old fetus [1] on the other. That said, many of the people I've known simply argue against the death penalty instead. This may be biased by knowing Catholics, though.
Well, the real mistake is believing that people think or behave in a consistent manner. All attempts at "rational" debate start from this flawed premise, and thus always fail.
"All" and "always" are too strong. It's definitely possible to engage in rational debate with an intent to be consistent, and update one's own views when a conflict is discovered. It's just not universally done.
> E.g., if you take as axioms that it is always wrong to end a life and that life begins at conception, then "abortion is wrong" is a sound logical conclusion.
Yes, and you'd also end up with an outright ban on civilian ownership of automatic weapons, amongst other safety measures. But you don't often see those two together. I think that the axioms are a layer or two below that, and might involve concepts like "individual responsibility" and "greater good".
Agreed; it seems likely that the axiom system of the typical pro-life defender[0] is closer to "It's always wrong to end an innocent life"; this requires the inclusion of a set of axioms defining innocence (and thus probably invoking some combination of free will, and an absolute moral axis. A convenient framework for this includes a soul). This way you preserve the ability to defend yourself, to take the life of an enemy soldier, etc, while still being able to justify a pro-life position. It also allows for the death sentence for murderers.
---
0. there are, of course, different sets of axioms that arrive at superficially similar conclusions. For example, the category of pacifist that believes that any loss of life is not justifiable; this kind of person will reject both abortion and ownership of weapons, war, etc.
It could also be that both sides of these debates are unknowingly using abductive reasoning to explain how they feel about a conflicting stance. Perhaps the proposed axioms come afterwards, to explain how people feel about controversial issues.
Funny enough, this conversation could also be a form of abductive reasoning. We're inferring that people are using abductive reasoning because people often hold conflicting moral views, though using abductive reasoning wouldn't necessarily guarantee conflicting views.
First, inductive reasoning is different than abductive reasoning.
Second, the goals of abductive reasoning are different from those of philosophical or mathematical proof. Often we need to make decisions on incomplete information, and abductive reasoning is a philosophy to help one do so.
For example, juries are supposed to make a decision based on a preponderance of the evidence (in civil cases) or proof beyond a reasonable doubt (in criminal cases). Convictions would be rare if the standard were incontrovertible truth.
Of course, but the whole point of the trilemma is that absolute knowledge of absolute truth is impossible, not that we cannot axiomatically accept the utility of partial, approximate knowledge.
Also, while abductive reasoning is (a bit) different from inductive reasoning, they have the same underlying epistemological justification. In discussions of epistemology you often don't see abductive reasoning mentioned, as it falls under the general framework of inductive reasoning.
Well, if absolute knowledge of absolute truth is impossible, are you able to suggest a truth that could even be considered absolute? If not, that's a bit of a dangerous statement, isn't it?
In spirit, yes. However, it gets more interesting if we try to make Occam's Razor more precise with something like Solomonoff induction; then it gets trickier to decide in practice what "simpler" actually means.
Even if we're able to reduce our competing models to Turing machine implementations, we're only able to calculate their complexity relative to some defined collection of models, which in turn depends on what kind of low-level logical formalism we're using.
In practice, of course, we can't parse out all these formal details for every decision, but I think it's helpful to at least be aware of the inherent problems with naive Occam's Razor.
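A toy sketch of why "simpler" is relative to the underlying formalism: description length depends on the description language. The two "models" and the bit costs below are entirely made up for illustration:

```python
# Two hypothetical models, each described as a list of primitive notions.
model_a = ["field", "field", "coupling"]
model_b = ["particle", "particle", "particle", "particle"]

# Two made-up "formalisms", each assigning a bit cost to every primitive.
encoding_1 = {"field": 10, "coupling": 10, "particle": 4}
encoding_2 = {"field": 3, "coupling": 3, "particle": 8}

def description_length(model, encoding):
    """Total cost of describing a model in a given encoding, in bits."""
    return sum(encoding[op] for op in model)

# The complexity ranking flips depending on which encoding you pick:
print(description_length(model_a, encoding_1), description_length(model_b, encoding_1))
print(description_length(model_a, encoding_2), description_length(model_b, encoding_2))
```

Under the first encoding the particle-based model is "simpler"; under the second, the field-based one is. This is the invariance problem in miniature: Solomonoff-style complexity is only defined up to the choice of reference machine.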
Occam's razor is better thought of as a method of choosing between alternative explanations of the same observed phenomena. It suggests choosing the one that requires the least unproved theoretical scaffolding to support it. It is not really "accept the simplest explanation" but more like "do not create complex theories unless they explain the facts better than the simple ones."
To be clear, this trilemma is only a problem for rationalists, like Hans Albert. This problem doesn't exist for German Idealists like Kant or Schelling, or even for the empiricists.
You escape the problem of infinite regression by acknowledging that the only thing real is experience, which leads you to the haven of relative objectivity, which is what modern science has become.
You would think that the conjurer of the paradox would have shifted to empiricism or idealism—maybe my faith in people betrays me.
> You would think that the conjurer of the paradox would have shifted to empiricism or idealism—maybe my faith in people betrays me.
What does this mean?
> Karl Popper argued that a way to avoid the trilemma was to use an intermediate approach incorporating some dogmatism, some infinite regress, and some perceptual experience.
> One example of an alternative is the fallibilism of Karl Popper and Hans Albert, accepting that certainty is impossible, but that it is best to get as close as possible to truth, while remembering our uncertainty.
The only difference I see between "This is true to the best of my knowledge and experience" (fallibilism) and "This is true to the best of my knowledge and experience, therefore it is certainly true" (empiricism) is how conceited the thinker is about their own infallibility. (https://en.wikipedia.org/wiki/Fallibilism)
No. Experience is forced upon you; it is neither an assumption nor something discovered through reason.
Additionally, words like "real", "exists" or "reality" only have meaning in this way, otherwise it would be impossible to understand them since understanding is only through terms of experience.
However, it's important to note that idealists aren't naive realists, they acknowledge that what they see is created by their own sensory organs. That is, there is the not-object that cannot be discussed or learned about further than the knowledge obtained from the negative predicate of the object—hence why you can only see one side of a coin or why your experience can change through hallucinatory drugs.
The not-object is also necessary for communication between individuals, where the inter-subjective experience must be mapped to do so.
> Experience is forced upon you; it is neither an assumption nor something discovered through reason.
Why? That claim must be justified by reference to other truths, or taken itself as an assumption. So doesn't the trilemma still apply, but at one more remove?
Note that 'assumption' does not necessarily mean an arbitrary statement, but includes statements that we consider self-evident, e.g. 'something exists'.
It doesn't have to be explained why; it simply is. The why would be a separate question requiring reasoning. Things can be true without an explanation of why they happen. It's not an assumption, as experience is not a choice; in fact, experience is a prerequisite for being able to assume (as well as to reason).
There is no possible reality in which the observation of anything can be explained by absolute nothingness. We know that existence exists, and that within that existence is consciousness capable of perceiving it. These are self-evident axioms upon observing anything at all. I think it's fair to say that we don't know why reality exists, but we know that it is true that something exists at all.
Yes, but so many tech people see themselves as rationalists or at least rational utilitarians, and I'm going to guess most of this forum does as well. Given that so many tech people see themselves this way, diving into the trilemma (and discussing how Idealists or empiricists moved past it) is important and probably eye-opening for many.
Let M = {circular, regressive, axiomatic} be the set of "unsatisfying" argument forms that may be used to prove any truth.
Let J: powerset(M) -> <schools-of-thought-in-meta-epistemology> map each subset of M to a theory of justification such that believers in that theory find only arguments built from that subset "acceptable".
Then:
J({}) is in {Skepticism}
J({circular}) is in {Coherentism}
J({axiomatic}) is in {Foundationalism}
J({regressive}) is in {Infinitism}
J({axiomatic, circular}) is in {Foundherentism}
J(M) is in {Quietism}
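The table above can be written directly as a lookup (same labels as in the parent comment); the two subsets of M with no school listed fall out as the remainder:

```python
from itertools import combinations

# The three "unsatisfying" argument forms of the trilemma.
M = frozenset({"circular", "regressive", "axiomatic"})

# The mapping J from accepted argument forms to a school of thought.
J = {
    frozenset():                          "Skepticism",
    frozenset({"circular"}):              "Coherentism",
    frozenset({"axiomatic"}):             "Foundationalism",
    frozenset({"regressive"}):            "Infinitism",
    frozenset({"axiomatic", "circular"}): "Foundherentism",
    M:                                    "Quietism",
}

# Enumerate the full powerset of M and collect the subsets J leaves unnamed.
powerset = [frozenset(c) for r in range(len(M) + 1) for c in combinations(M, r)]
unnamed = [set(s) for s in powerset if s not in J]
print(unnamed)
```

Running this shows exactly two gaps in the table: {regressive, axiomatic} and {regressive, circular}.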
Is there an x in the powerset of M such that J(x) is {}? Or can literally every possible position be the foundation of a philosophical paper?
Well, the regular formulation of "Infinitism" is that S is justified to believe P_1 on the basis of P_2 and P_n on the basis of P_n+1.
J({regressive, axiomatic}) just defines a limit i.e. S is justified to believe P_1 on the basis of P_2 and P_n on the basis of P_n+1 such that lim_(k->infinity) P_k = P_x where x is not a natural number.
You have to say x is not a natural number, because if it were, J's output would have been "foundationalism" and P_x then would be a "self-evident" belief - to use Chisholm's terminology.
P.S. If anyone thinks that this is an abuse of mathematics, I agree. But, its usage is compatible with that of philosophers e.g. Goldman's Causal theory of knowledge literally uses a recursive formulation of belief formation where the base case is "self-evident", I just unwrapped the tail call and wrote it as a loop!
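To illustrate that P.S.: a toy justification chain with a "self-evident" base case, written both recursively and with the tail call unwrapped into a loop. The beliefs and the support relation here are invented placeholders:

```python
# A foundationalist-style chain: each belief rests on the next one,
# bottoming out in a belief marked None, i.e. "self-evident".
support = {"P1": "P2", "P2": "P3", "P3": None}

def justified_recursive(belief):
    """Goldman-style recursion: justified if supported by a justified belief."""
    parent = support[belief]
    if parent is None:          # base case: self-evident, needs no support
        return True
    return justified_recursive(parent)

def justified_loop(belief):
    """The same chain with the tail call unwrapped into a loop."""
    while support[belief] is not None:
        belief = support[belief]
    return True

print(justified_recursive("P1"), justified_loop("P1"))
```

Infinitism, in this picture, is what you get when the chain never reaches a None.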
To me, this is a great illustration that there's no other coherent place to start any form of discussion, reasoning, or belief than with axioms (or assumptions). Regressive and circular arguments only shift the problem.
I believe that if one is looking for truth in a big-picture sense (religious or philosophical), the best approach is to evaluate various existing systems of axioms/assumptions and apply a twofold test to each: (1) correspondence to reality and (2) internal consistency. It's a given that applying the first test requires stepping into the subjective, since one must honestly apply it to personal experience.
Combine it with Hume's Guillotine and you realize that all "should" conclusions really just start with moral "value" axioms.
So then you can make a distinction between people that reason correctly from values that are different than yours, and people that reason badly from their axioms.
And you can also make a distinction between philosophies that are based off of unprovable beliefs (which is fine), and philosophies that are anti-factual (which isn't). I like to define the former as religion and the latter as ignorance, but unfortunately there are a lot of the latter that claim it's religion.
I heavily disagree; axioms are never a good place to start, nor, historically, were they ever a starting place.
Working mathematicians do a lot of mathematics and constructions before stopping to look and say: "OK, we can distill this rich theory down to this small set of axioms".
Geometry had been quite developed before Euclid's axioms, and number theory took its sweet millennia before Peano's axioms were developed.
On a more modern side, a lot of topology has been produced before people decided to write down a list of axioms for homology, which all the different homology constructions satisfied.
The point is that understanding almost always precedes formal reasoning, language follows ideas, and rigid systems arise on a fertile soil of messy experiments - mental or physical.
If you look at the history of science and mathematics, nothing was ever built up from the formulas and axioms (criminally contrary to the way these subjects are taught).
There is an underappreciated beauty in our ability to argue about, reason with, and make use of concepts that we haven't even really defined. Everyone knew Newton's Calculus was full of holes, and people used it for centuries before coming up with solid definitions of limits, derivatives and integrals - the most fundamental concepts of the subject!
And on that note, people used real numbers for a long time before having the formal machinery to define them. Zeno didn't have the axioms to resolve his paradox. Didn't stop anyone from using these weird objects in reasoning and practice.
I'm talking about different sorts of axioms than those that science and math work towards.
For example, any discussion between 2 people depends on mutually accepting basic assumptions such as:
- I and you are persons. We exist.
- The words we speak and hear have meaning.
- The words you hear are the words I spoke, and vice versa.
We can't even have a meaningful, coherent discussion without assuming things like that.
We also can't live a meaningful, coherent life without basic assumptions about origin, meaning, morality, and destiny. Everyone has them, and unless we're careful, they're easy to catch, like a cold, from the folks we hang around with.
A coherent, truthful worldview is essential. The test for truth (correspondence to reality and internal consistency) does provide us a way of comparing religions and philosophies (and ruling out the obviously false ones). Once we do arrive at a coherent worldview that best corresponds to reality and is internally consistent, we have to assume it and derive our beliefs and practices from it without proof (i.e. faith). (BTW, we all have a worldview, but we haven't all subjected our worldview to the truth test.)
Everyone has faith: it's required to accept the axioms of one's worldview.
I wonder why they argue that way, when it seems obvious that you need some common ground to start reasoning at all.
As we all start living in this world without any knowledge, all knowledge has been formed by the things we experience. And therefore, many of us agree to argue based on our (measurable) experiences (e.g. scientists). Others prefer to add beliefs which are wildly subjective (hard to reproduce) and therefore cause a lot of inconsistencies.
But in the end, it always comes down to the point where you need a common set of truths to start an argument. Otherwise there is no point in arguing at all.
First, we don't start from zero. If you were to build a robot with all human senses but an erased hard drive, it wouldn't do anything, ever.
Second, when applied in real life, science might be labeled "objective" and therefore superior for, let's say, building a bridge. But on a very basic level, we're subjectively agreeing on what "scientific" is, or rather basing it on unprovable paradigms, which is what the article is about.
I found it simply fascinating when I learned that Blue Wildebeests are born knowing how to stand within minutes of their birth, and within a day can outrun adult hyenas:
"Extremely precocial species are called "superprecocial". Examples are the Megapode birds, which have full flight feathers and which, in some species, can fly on the same day they hatch from their eggs. Another example is the Blue Wildebeest, whose calves can stand within an average of six minutes from birth and walk within thirty minutes; they can outrun a hyena within a day."
Very interesting concept, thanks for providing this link!
We could also apply this terminology to startups: precocial startups are profitable and self-sustaining soon after their launch. Superprecocial startups would be so right away.
Aside from drug dealers there are few superprecocial businesses. The only reliable way to form such a business is to provide something everyone needs and nobody has, like water in the desert.
You can accept an axiom for the sake of argument and do some reasoning from it without really agreeing with it. You can evaluate the form of an argument without necessarily thinking that it's true or false.
The Munchhausen trilemma explains why you need axioms at all. If you could put forward an argument that didn't need further support (interrupting the "ad infinitum"), then you wouldn't necessarily agree with the argument but you could at least accept it as a self-contained idea. But we can't even get that far. Proofs that aren't either circular or endless must rest on some other statements that we can't prove.
I believe that point is nicely encapsulated in Gödel's incompleteness theorem (which is primarily about formal number systems, but analogous here nonetheless).
Those three options are not "equally" unsatisfactory. In fact, the axiomatic option is almost always entirely satisfactory. The marvel and utility of an axiomatic foundation cannot be categorically compared to the absurdity of regressing ad infinitum or going in circles.
Much like how theorists could adopt the axiom of choice after the Gödel incompleteness theorems.
Sure, it can't be proven from the other axioms, but you can't get anywhere without such assumptions.
It recently occurred to me that epistemology is essentially a hermeneutics of experience, which is to say: all of our non-trivial beliefs about ourselves and the world (including this very statement) are simply interpretations of what we perceive. This realization, while surely obvious to some, was very clarifying for me.
Edit: Judging from the downvote I got, I guess some people don't share my interpretation of epistemology. C'est la vie.
The question "why should claim X be trusted?" indeed runs into infinite regress and has no well-founded answer. But the modified question "why should claim X be trusted by creature Y?" can have a well-founded answer, based on facts about creature Y.
I find the concept of paradigms, as described by Kuhn, to be a much better model for truth and knowledge than axioms or series of "proofs" chained one after the other.
The problem with using math to address this is that it very subtly ends up begging the question. Math has an answer to the trilemma: math is based on axioms. Whether math corresponds to anything in the real world is a separate question from the mathematics itself. Gödel's theorem is thus based on mathematical axioms, but the trilemma questions the axioms themselves. No structure built on the axioms can prove the axiomatic formulation correct. Gödel's incompleteness theorems are an important part of the history of how mathematics came to deeply accept the non-uniqueness of axioms, but again, this is all happening at a level too high to help us with the trilemma.
So no, the trilemma can not be solved or even particularly refined by feeding it to mathematics. "From whence mathematics?", it asks.
If you are going to play philosophical games with the trilemma, a better approach is simply to feed it to itself. If the trilemma is true, then we can't be confident that we've properly read it, because it cuts away the foundation we are using to communicate. If any of us have successfully extracted even a portion of understanding about the trilemma from the reading of this text, we must be sharing a certain very basic amount of rationality to get that far, even if we can't nail down where that rationality came from exactly. You could then proceed to say that this basic shared rationality is de facto usable as a foundation. But that still doesn't answer the question of where it comes from.
That's not a true trilemma. A trilemma is a "choose two out of three". Therefore a trilemma is always depicted as a triangle where the vertices are the things you would like to have and the edges are the possible combinations.
A trilemma is an extension of the concept of a dilemma, which is "choose one out of two" .
Example: [good, fast, cheap]: you can have good and fast, but it's going to be expensive. Or good and cheap, but it's going to take longer. Courtesy of Jason Kottke[0], some more simple trilemmas:
Elegant, documented, on time.
Privacy, accuracy, security.
Have fun, do good, stay out of trouble.
Study, socialize, sleep.
Diverse, free, equal.
Fast, efficient, useful.
Cheap, healthy, tasty.
Secure, usable, affordable.
Short, memorable, unique.
Cheap, light, strong.
----
Ultimately the generalization of a trilemma is a (3-)budget. I will show this.
An n-lemma is "choose (n-1) out of n". Geometrically/topologically, the things you would want in an n-lemma correspond to the vertices of an (n-1)-simplex. A dilemma is represented as a 1-simplex, i.e. a line segment with each alternative as one end. In that case the alternatives are points and so are the choices; in the 3-case the alternatives are points but the choices are segments. I'll leave you to imagine 4- and 5-lemmas geometrically.
An (n-1)-simplex can be defined in an n-dimensional vector space as the region of points with nonnegative coordinates summing to 1. So for example a 2-simplex (a triangle) is the set of points (x,y,z) with x,y,z >= 0 such that x+y+z = 1. In an n-lemma, only vertices are allowable choices -- i.e., each of x, y, z must be either 0 or 1.
In an n-budget, we can have partial allocations -- rather than having to choose two out of study-socialize-sleep (i.e. sleep deprivation, loneliness, or failing classes), I can spend my finite time as P% study, Q% social, R% sleep. Note: we can still represent budgets as points inside (n-1)-simplices.
Of course, some trilemmas are not easily seen as extremizations of budgets. In some cases this will be because certain alternatives are fully binary (either you have floating exchange rates or you don't); in others, it's because we need to think a little more about the trade-offs (cheap, light, strong) -- we need to think harder about how lightness works against strongness.
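One simple formalization of the budget view: a budget is a point on the (n-1)-simplex, and the classic "all-in" picks are its vertices. The allocation numbers below are arbitrary:

```python
# A valid budget: nonnegative coordinates that sum to 1 (a point on the simplex).
def on_simplex(point, tol=1e-9):
    return all(x >= -tol for x in point) and abs(sum(point) - 1.0) < tol

# A degenerate budget: everything allocated to a single alternative (a vertex).
def is_vertex(point, tol=1e-9):
    return on_simplex(point, tol) and any(abs(x - 1.0) < tol for x in point)

study_social_sleep = (0.4, 0.2, 0.4)  # a partial allocation of finite time
all_in_on_study = (1.0, 0.0, 0.0)     # a pure "pick one" extreme

print(on_simplex(study_social_sleep), is_vertex(study_social_sleep))
print(on_simplex(all_in_on_study), is_vertex(all_in_on_study))
```

The strict n-lemma only admits the vertices; the budget relaxation admits every point of the simplex in between.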
---
We can next imagine how to model slightly more complicated decision structures as n-lemmas. For example, I might have to decide between being married or single (each with its delights) and single people might have to decide between intelligence, looks and agreeableness. (This is terribly sexist but presumably very beautiful people don't have the same pressures to be smart and carve a space for themselves; and people who are intelligent and hot are full of themselves. Bear with me please). In this case we've connected a 1-simplex (married or single) to a 2-simplex. This is known as a simplicial complex, and it turns out we can (roughly oversimplifying) rebuild topology from the ground up by chaining simplices like that.
So the choose-one-out-of-three "trilemma" in the OP isn't a trilemma at all. It's some decision structure that arises out of a complex of dilemmas. Possibly because every choice is connected to every choice, it even has the geometric contours of a triangle, but a hollow one, one where the segments are not connected. Therefore it doesn't generalize as a budget. What is the trade-off structure then? What is the finite thing that makes having everything impossible? What are these people talking about?
You're welcome to use the word "trilemma" however you prefer, but it is definitely not wrong to use it to mean "choose one out of three" as here. E.g., here is the OED's definition: "A situation, or (in Logic) a syllogism, of the nature of a dilemma n., but involving three alternatives instead of two." (The OED gives a number of citations, all of which appear to be pick-one-of-three rather than pick-two-of-three.)
The generalization to budgets seems to me to fit better with pick-one-of-three than with pick-two-of-three, too. You're picking p study, q social, r sleep -- and the restriction is p+q+r=1. Picking one out of three corresponds to points like (1,0,0) which are in that simplex. On the face of it, picking two out of three corresponds to points like (1,1,0) which are not. (You could take (1/2,1/2,0) if you like, but that's no longer an extremization; if that, why not (1/3,1/3,1/3)? "X, Y, Z, pick any three.")
Separately, I'm having trouble seeing how your married/single/smart/hot/nice thing actually has the structure of a simplicial complex. In a simplicial complex there are three ways (aha! a trilemma!) in which a 1-simplex can relate to a 2-simplex. They can simply be disjoint; it seems clear you don't intend that. They can share a vertex; that would mean identifying "married" (from the 1-simplex) with one of {smart,hot,nice} (from the 2-simplex), and again surely you don't want that. Or the 1-simplex can be a face of the 2-simplex, so that {single,married} correspond to two of {smart,hot,nice}. Again, surely you can't mean that. But what do you want? It seems like you want to identify one end of your 1-simplex with the whole of your 2-simplex, and the dimensions are wrong.
https://en.wikipedia.org/wiki/Abductive_reasoning