Two Envelopes Problem (wikipedia.org)
112 points by atomon on Aug 6, 2010 | 88 comments



Isn't the expected value of both envelopes, prior to picking either, 0.5 * X, where X is the sum of the envelopes' value?

So one way to argue that this is a fallacious problem is that you can't take an unknown (the actual value of the envelope that you picked) and pretend it's a known. The value of the envelope you are holding, prior to opening it, is 0.5 * X. The value of the other envelope is also (0.5 * (2/3 * X)) + (0.5 * (1/3 * X)) == 0.5 * X.

This may be too ill-expressed -- but there's something wrong with treating the value of an unopened envelope as anything other than completely probabilistic.

Put differently, don't you have to say

0.5 chance I picked the bigger envelope (call its value A). In that case, the other envelope is A/2, so if I switch, I lose A/2.

0.5 chance I picked the smaller envelope (value A/2), so if I switch, I gain A/2.

Which means if I switch, half the time I gain A/2 and half the time I lose A/2, for an expected gain of 0 from switching.
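
A quick sanity check, as a Monte Carlo sketch in Python (the $100 base amount and the trial count are arbitrary choices for illustration):

  import random

  def trial(x=100):
      # Envelopes hold x and 2x; pick one uniformly at random.
      envelopes = [x, 2 * x]
      random.shuffle(envelopes)
      kept, other = envelopes
      return kept, other

  n = 1_000_000
  avg_gain = sum(other - kept for kept, other in (trial() for _ in range(n))) / n
  print(avg_gain)  # hovers near 0: switching neither gains nor loses on average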


That is the right answer. The flaw is that the "A"s in the formula are different actual values depending on which envelope you have chosen, but the analysis has them as the same.

In other words the paradoxical analysis is done relatively, which is wrong. You have to analyze probability from an absolute frame of reference.


Exactly, A doesn't equal A for all parts of the formula, so it's silly to add it up.


Well, of course you're right, as is yanowitz. But there is something unsatisfying about this explanation. Intuitively it is obvious, but why is the relative reasoning incorrect?

There is no formal distinction between absolute and relative quantities, and no theorem that says that expected values can only be taken from absolute quantities. There are just random variables, and these have some distributions, and they can be independent or not. Nothing prevents you from taking an expectation of a random variable that is a ratio of two other random variables.

Another angle - I could say that our definition of expected value, based on weighted arithmetic average, is completely arbitrary, and instead define my own expected value G[X], based on the geometric average. Suddenly, the relative approach becomes correct: sqrt(2 * 0.5) = 1, so the expected relative improvement from switching is 1. What the hell is going on?


> Intuitively it is obvious, but why is the relative reasoning incorrect?

Because in that reasoning the symbol A is used to denote "the expected value of a random variable representing the amount in the envelope you picked", but later also "the expected value of that amount provided that you picked the envelope with more money" and "the expected value of that amount provided that you picked the envelope with less money".

The error comes from splitting the reasoning into two cases and failing to factor the condition on which you split into the subsequent calculation for each case.

If you solve some equation and you have to split your reasoning into two (or more) cases, then while reasoning through those cases you have to remember the condition that you assumed for each case and factor it in (possibly tossing away some solutions).

I'd like to see, some day, a less one-dimensional way of writing down mathematical reasoning, so one could see how information flows through the course of a proof; errors like this would then show up more easily.


OK, now I see there are two slightly different possible formalizations of the problem:

1. You are told that the two envelopes contain amounts A and 2A, but you aren't told what A is. After you pick one envelope, you are allowed to open it, and then you're given the choice to switch. Here the optimal move depends on the distribution of A, and if you don't know it, you can't do much other than pick randomly. After some googling, this is the more common formalization, and it is analyzed in several math papers and blogs.

2. (The version I was assuming.) You are told that the envelopes have, say, $100 and $200. You pick one and you aren't allowed to open it yet. Now you're given the option to switch one last time. There is no problem with undefined priors and weird conditional probabilities in this version. However, the freaking paradox still holds! The expected value you get by switching is $150, no question about that. But the expected relative gain you get by switching is 1.25, there's also no question about that! This is the real paradox to me. Taking an expectation of a relative quantity is intuitively wrong, but why exactly?
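
To see both quantities side by side, here's a minimal sketch with the amounts fixed at $100 and $200 as above (the trial count is arbitrary):

  import random

  n = 1_000_000
  abs_gain = 0.0
  rel_gain = 0.0
  for _ in range(n):
      envelopes = [100, 200]
      random.shuffle(envelopes)
      kept, other = envelopes
      abs_gain += other - kept  # absolute gain from switching
      rel_gain += other / kept  # relative factor gained by switching

  print(abs_gain / n)  # ~0: the expected absolute gain is zero
  print(rel_gain / n)  # ~1.25: E[other/kept] = (1/2)(2) + (1/2)(1/2)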


Ad 1. If in the reasoning A is a fixed number, then you can't assume that you have a 1/2 probability that the other envelope contains more money. Probability depends on the distribution, and if you don't know the distribution it cannot be calculated. The fact that it cannot be calculated does not entitle you to assume it's 1/2.

If you have a problem with that, imagine a stack of cards. You and your opponent each pick one card; whoever picks the higher card wins. Until you see your card you have a 1/2 probability of winning. But after you see that you just picked a 3, your probability of winning changes sharply (depending on what cards are left in the stack).

Ad 2. When you don't open your envelope, you must treat the amount in it (A) as a random variable. Then everything I wrote in the post above applies.

It doesn't matter that you did not look into the envelope. Whether you got the higher or the lower amount is significant, and since you don't know which occurred, you must split your reasoning and consider both cases. But upon splitting you must remember that the condition you assumed for each branch of your reasoning has to be taken into account. You can do this by conditioning your random variable A on the appropriate event, which changes its expected value. So by all means you can sum up expected values, but not before appropriately adjusting them for the conditions you assumed in the branches of your reasoning.

Consider solving the equation x^2 + yx + 1 = 0. While solving it you may split your reasoning into three cases depending on whether y^2 - 4 is larger than, smaller than, or equal to zero (even though you don't know the value of y). But when summing up the solutions you must remember that you assumed something about y in each branch, so for the x you've found, y can no longer be any real number.


Ad 1 - Not sure what you're trying to say, all I said is that you don't know the distribution of A - if you knew it, you could make an informed decision on whether to switch.

Ad 2 - If you carry out the analysis you're trying to teach me, you'll find that the expected relative gain from switching is 1.25. Now explain that. (You'll also find that the expected absolute gain is 0 - that's not a paradox.)


Let's denote the expected value of the money in the envelope I picked by A.

If I want to switch I have to consider two cases. Let's label them e1 and e2: e1 means I picked the envelope with the smaller amount of money, e2 means I picked the envelope with the larger amount. The probability of each case is 1/2.

What is the expected value of money I'll have after switch?

Let's consider first case:

S(e1) = 2 * A(e1)

which means two times the amount of money I had, PROVIDED THAT I PICKED THE ENVELOPE WITH LESS MONEY.

A(e1) is not equal to A. If I denote the exact amount of dollars in the envelope with less money by X, then A = 1/2 * X + 1/2 * 2 * X = 3/2 * X, but A(e1) = X.

Similarly, considering case e2, I get an expected value of money after the switch equal to

S(e2) = 1/2 * A(e2)

which means half of the amount of money I had, PROVIDED THAT I PICKED THE ENVELOPE WITH MORE MONEY.

Again, A(e2) is not equal to A but to 2 * X. If you prefer not to use X, then A(e1) = 2/3 * A and A(e2) = 4/3 * A.

To sum up: the probability of each case is 1/2, so S, which I use to denote the expected value of the money I could have after the switch, is equal to:

S = 1/2 * 2 * X + 1/2 * 1/2 * 2 * X = 3/2 * X = A

or, if you want, without using X:

S = 1/2 * 2 * 2/3 * A + 1/2 * 1/2 * 4/3 * A = A

So there is no gain or loss from switching. You can switch zero or more times without changing the expected value of the money you will get when you open your envelope.

Similar reasoning can be conducted for any number of envelopes and any probability distribution, and I guess it will lead to the same conclusion. I'm not sure whether the observable symmetry of the situation can be taken as a proof of this conjecture, but I sincerely hope so.

The paradox comes from using the same symbol A to denote three different expected values of random variables, which I denote above as A, A(e1) and A(e2).

If you have further doubts about the above "relative" reasoning, I'll gladly try to clear them up.


You just painstakingly carried out the absolute analysis, which I know works.

Let X be the value in the envelope you have, and Y in the other one (X and Y are both random variables with well-known distributions). Then E[Y/X] = 1.25. That's what I wanted you to explain. You just keep saying that E[X] = E[Y], which I know.

Note that this paradox would not arise if X and Y were independent, since then E[Y/X] = E[Y] / E[X] = 1.


What reasoning leads to the claim that E[Y/X] = 1.25?

It can't be the reasoning from the Wikipedia article about the two envelopes problem, because that has an error in steps 4 and 5, so the results from step 7 onward are nonsensical.

I painstakingly showed how this reasoning should look in order to be right, and that in fact E[Y/X] = E[Y] / E[X] = 1.


You're combining two worlds and then adding up probabilities. But there's no world where all three possible values are in play at the same time. There's only two, and you either picked high or low. It's a 50/50 shot.


Which three worlds? After you have picked one envelope, there are two possible relative outcomes: 2 and 0.5. The average is 1.25 - there is nothing wrong with the math; the only problem is that taking an average of two relative quantities is intuitively not useful for decision making. Putting that intuition on a formal footing is not something you've succeeded at.


"You can't take an unknown and pretend it's a known" is reasonable. However, the paradox still stands if you are allowed to open your chosen envelope and count the money inside before deciding whether to switch.

The problem genuinely does have to do with the use of an impossible probability distribution. It’s not just a straightforward mistake (as in the wrong answer to the Monty Hall problem, say).


Let's say you open the envelope. And there's $20. That means the set of two envelopes is either $20 and $10 or $20 and $40. In one case, the amount of money in play is $30, in the other, $60.

Now, you have no idea which it is. And the fact that you're holding a $20 doesn't tell you anything.

The $20 is fake information. It hasn't revealed any real info, as you knew you'd open something. The key is not to combine a world where the total envelope value is $30 with one where the total envelope value is $60, which is what happens if you add up probabilities.

Instead, there's a 50% chance you're holding 1/3 of the total value, and a 50% chance you're holding 2/3. So if you switch, half the time you add 1/3 and half the time you lose 1/3. Net expected gain from switching: 0.


"use of an impossible probability distribution" - this sounds interesting. Which exact distribution do you mean? To me it seems (though I might be wrong) that the distributions are perfectly valid.

You have a pair of random variables (X, Y) that take values (100, 200) or (200, 100) with equal probability. Then E[X] = E[Y] = 150, there is no question about that. Also, E[X/Y] = 1.25, there is also no question. The only question is why E[X] is useful and E[X/Y] is not useful for our decision making - and I honestly don't know why.


If it were (100, 200) or (200, 100) with equal probability, you'd switch iff the first envelope contains 100. Nothing interesting about that scenario.

The problem states that (and is only interesting because) one of the envelopes contains twice as much as the other, but not how much, so that the amount in the first envelope tells you nothing.

This is actually impossible because there is no information about how the amount (let's say the bigger of the two) is distributed, which is usually taken to imply a uniform distribution. But a uniform distribution is only possible if you assume an upper limit (otherwise, what's the expected value?). If there is an upper limit, the question again becomes quite easy (you switch if the first envelope contains less than half the upper limit).


"Nothing interesting about that scenario." OK, but let's say you're not allowed to see the contents of the envelope you picked.

In this case, the argument that by switching you get 125% on average still holds!

I'd claim that this is even more interesting and paradoxical than the scenario you're talking about, where your explanation is correct (the distribution of the bigger value is crucial for your decision, and you know nothing about it).


"Isn't the expected value of both envelopes, prior to picking either, 0.5 * X, where X is the sum of the envelopes' value?"

Only if the value of X is determinate. As matters stand for the person picking, it is not. It is simply undefined.

The solution to the "paradox" is simply that expected value doesn't always mean what we'd like it to mean.


Yanowitz defines X as the sum of the value of both envelopes. That amount is not undefined and not indeterminate. The amount inside an individual envelope isn't undefined either: it has a probability distribution.


Sorry, but no.

No probability distribution has been defined for the amount inside an individual envelope, and you can't just pull one out of thin air. That's not how mathematics works. If you had a probability distribution for that, you could make all sorts of arguments from it. But we're not given one, and we're given no information from which we can infer the existence of one. Therefore there is no such distribution. Period. (Interestingly, assuming such a distribution gives useful information even when it is the wrong distribution, but that is a complicated variant on the problem.)

Now you can argue until the cows come home about what probability "really" means, or what is "really" true in a made up problem. Plenty of philosophers are willing to argue that with you. But the way that mathematicians look at this is quite simple. You can only work with the information you are given, and you can't make up information that you weren't given. Arguing about the fine points of reality within a piece of fiction is, not to put too fine a point on it, just plain silly. After all it is a made up example. So don't do that.

We are given certain information. We look in the one envelope if we wish, and we get a dollar value. Based on that we can draw inferences about what might be in the other envelope. We can create an expected value for said envelope. All that is fine.

But our intuition about what the expected value has to say about what we should do is wrong. Nothing is wrong in the math - it is our intuition that is wrong here. Our intuition is based on what happens if you encounter large numbers of similar, but independent, events. However in this case we have a singular, and extremely unlikely to be repeated, event. Therefore the basis for our intuition is lacking here. And we are misled.


"No probability distribution has been defined for the amount inside an individual envelope"

I don't understand this assertion. It seems to me that both envelopes have the trivial distribution:

  P(x=A) = 1/2, P(x=2A) = 1/2, P(x) = 0 for all other x.
and that this is clear from the phrasing of the question. Extracting this distribution from the text is no different than solving any other problem not stated in rigorous mathematical terms. With this distribution, with A properly defined (see the next point) upfront, the whole paradox disappears.

  --
Nothing is wrong in the math [..]

I think the statement in the article that

  4. If A is the smaller amount the other envelope contains 2A.

  5. If A is the larger amount the other envelope contains A/2.
is wrong, because it tries to define A twice with different concrete values and goes on to treat them as the same value. If your point is that saying 'A is the quantity in the envelope you just chose' makes A ill-defined, then I guess you are right, but that doesn't mean that A cannot be defined properly, and it doesn't mean the intended problem can't be solved.


> > No probability distribution has been defined for the amount inside an individual envelope

> I don't understand this assertion. It seems to me that both envelopes have the trivial distribution:

What I mean is that there is no a priori distribution in terms of actual numbers. What are the odds that the amount in the envelope before you open it is in the range $5 - $500? We are given no such information. Before you open the envelope you are given, A is a variable, but not a random variable. Its possible values ideally would have a uniform distribution on an infinite set, which is mathematically impossible. There is no such distribution. (There is an alternate approach which is to claim that that the sum of the two envelopes has an unknown distribution, which creates a distribution for the values of each envelope. This leads to a very different mathematical idealization of the problem. But the other distribution, being unknown, is not available to you. I will trace the reasoning for this through as well.)

If you open the envelope in front of you, A then becomes a number, and there is indeed now a probability distribution for the contents of the other envelope. And that distribution is now, as you said

  P(x=A) = 1/2, P(x=2A) = 1/2, P(x) = 0 for all other x.
(Note, the alternate approach would now assert that the number in the envelope provides unknown information on whether it is larger. Therefore the other envelope now has a different unknown distribution that depends on the actual magnitude of A.)

But now we arrive at a different point. Now we have a probability distribution for the other envelope, but our intuition fails for the simple reason that our intuition of the meaning of expected value is based on what happens after repeated opportunities at similar random events. (Insert the strong law of large numbers, etc.)

(The alternate solution finds that the other distribution causes the other envelope, on average over all possible values you could see, to have the same value as your own. However this is of no obvious direct use.)

From the point of view of the standard mathematical idealization, what you guys have been doing is trying to train your intuition to create arbitrary distinctions that avoid conclusions that bother you. But you're drawing a distinction that doesn't make sense. If you've opened the envelope and you have $10 sitting there, then A is $10. Period. It isn't one of two different numbers, it is the number you see in front of you. That has now become a provided fact. At this point you can draw a distribution for the other envelope. But your intuition about expected value fails. Why? Because this scenario, with this set-up, is not something that can become subject to repeated trials.

(The alternate approach comes to a different conclusion. Your intuition, they claim, is thrown by the fact that you do not have the critical information about the actual distribution of the other envelope, and so cannot come up with the correct numbers to decide. Interestingly in a bizarre twist, it turns out that if you make up a distribution and pretend it is the unknown one, as long as that distribution has some probability of answers falling in every possible range, your decision winds up being right better than half the time. However how much more you are right depends on the unknown distribution, and is therefore unknown to you.)


That is the clearest, most succinct explanation I've heard. Thanks.


So basically the fallacy is in counting gains only, without counting opportunity cost [of the switch].


Your math is right, but there's more to the story here, specifically to understand why the switching argument is wrong - this is commonly misunderstood, and has more to do with the implied "fairness" of the putting-numbers-in-the-envelope process than the calculation of the odds afterwards.

The super quick version: the number drawing process can't be fair, and that makes those .5/.5 hi/low probabilities wrong in general.

The real impossibility is the idea that we can "pick a random number" such that all numbers are equally probable, with no constraints on the size of that number. If this isn't intuitively obvious, then just try to normalize a uniform probability distribution over the positive real line, you'll see the problem.

In order to select the two amounts, the original problem statement implies that we need to randomly pick A (which we can take as the bigger envelope) in exactly such a way, so basically, the "paradox" is dead-on-arrival, because the true prior distribution that A was picked from must be different - for example, there might be an upper bound, say $100, in which case if we see that our envelope has any number over $50 on it, we are 100% sure that we hold the larger envelope. Otherwise, without some bounds or tapering of the distribution, we can't normalize it, and we can't use it to draw numbers from, so we have no numbers in envelopes to even discuss.

tl;dr version: If you can write me an algorithm that "randomly selects a number" (let's even make it an integer to make it even simpler - of course I mean an Actual Integer, not those short bit sequences that we pretend are integers when we program), such that no integer is more likely to be chosen than any other, then I've got two envelopes that I'd like to show you. :)
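
The point that the .5/.5 hi/low odds can't survive a proper prior can be made concrete. A sketch, assuming a geometric prior on the smaller amount (the powers-of-two dollar amounts and the particular prior are entirely my choices):

  from fractions import Fraction

  def p_small(k):
      # Assumed prior: the smaller amount is 2**k dollars with prob (1/2)**(k+1).
      return Fraction(1, 2 ** (k + 1))

  def p_holding_smaller(k):
      # Posterior P(my envelope is the smaller one | I see 2**k dollars).
      # I could hold the smaller of the pair (2**k, 2**(k+1)) or, unless
      # k == 0, the larger of the pair (2**(k-1), 2**k).
      as_smaller = p_small(k) * Fraction(1, 2)
      as_larger = p_small(k - 1) * Fraction(1, 2) if k > 0 else Fraction(0)
      return as_smaller / (as_smaller + as_larger)

  for k in range(4):
      print(2 ** k, p_holding_smaller(k))
  # Seeing $1 means you certainly hold the smaller envelope (prob 1);
  # seeing anything larger gives prob 1/3 under this prior -- never 1/2.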

On another note, the truly insane might notice that in the same way that we can formally sum series like 1+2+3+... = -1/12 (http://en.wikipedia.org/wiki/1_%2B_2_%2B_3_%2B_4_%2B_%C2%B7%...), and 1+1+1+... = -1/2 (http://en.wikipedia.org/wiki/1_%2B_1_%2B_1_%2B_1_%2B_%C2%B7_...), we might consider the probabilistic analogues of these formulas, and "formally normalize" a PDF that is uniform over the positive integers, with a constant probability of -2 for every integer (whatever that means). Similar things could be tried over the real numbers, though it gets messier because the integral analogues of those formal sums tend to come out to zero (for instance, for any integer n, the integral of x^n from 0 to infinity, in a very specific sense, is zero - perhaps I'll do a layman's post on this phenomenon at some point, it's quite useful in some cases), so normalization doesn't really work out so well.

Of course, probabilities can't be negative under the usual rule set, so all of that would be pointless, wouldn't it? :)


Heard on the street?


Under a frequentist interpretation of probability that's correct (and the paradox doesn't arise), but Bayesian decision theory does allow you to reason in the way the paradox proposes, starting in a situation where you've already picked an envelope, and estimating the utility of switching to the other one by multiplying the probability that you're in the high->low configuration with the value of a high->low switch, and likewise with a low->high switch. So you have 50% chance of halving your money, 50% of doubling, and thus estimated 1.25x return.

Frequentist probability doesn't let you do that, because it doesn't let you say "there's a 50% likelihood I'm in this state, and 50% that I'm in the other". Instead, it says you must be in one or the other state (non-probabilistically), and the probabilities are only attached to a previous process that generated that outcome. But Bayesian probability does let you interpret the probabilities as belief in each current state, so you have a 50% belief you're in one state, and a 50% belief you're in the other, and the value of any decision is thus 0.5 * value_if_I_was_in_situation_A + 0.5 * value_if_I_was_in_situation_B.

(If you don't allow that kind of computation, Bayesian decision theory has to be revised in some other cases as well.)


No, Bayesian decision theory does not allow you to reason in that way.

Suppose the amounts in the envelopes are A and B. A Bayesian would say:

Prob(I have A) = 1/2 and Prob(I have B) = 1/2

The expected gain from switching is:

Prob(I have A)x(what I'd gain from switching = B-A) + Prob(I have B)x(what I'd gain from switching = A-B)

which is: 1/2 x (B-A) + 1/2 x (A-B) = 0


Ah yes, this one can be solved that way; oops. My understanding of the literature is that variants are much more problematic, though, and require more complex restrictions on typical Bayesian probability frameworks to exorcise them.

This is one good review, with a proposed solution (from a Bayesian perspective): http://books.google.com/books?id=14_ykEOAZ6AC&pg=PA49


The wiki page doesn't explain the flaw in the logic. I think it is this:

Let X denote the value of the larger envelope, and A the value of the envelope we have in hand.

If we have the larger envelope, the other envelope is worth A/2 (where A=X).

If we have the smaller envelope, the other envelope is worth 2A' (where A' = X/2).

So far so good, and this is all legal. The error comes when we try to add:

0.5 * (A/2) + 0.5 * (2A') /= 5/4 * A

The A's are not the same. You can't add them as if they were.


Yes, I think you are correct. I would word it this way:

Let:

{X, 2X} := the amount of $$$ in each envelope

A := 1st envelope has $2X

B := 2nd envelope has $2X

The flaw in the paradox is to assume that events A and B are independent, when in fact they are mutually exclusive.

Now, we can rewrite the whole "0.5 * (A/2) + 0.5 * (2A') /= 5/4 * A" nonsense as:

P(A|B) * 2X + P(B|A) * X = 0 * 2X + 0 * X = 0

Which is zero because it only covers half of the probability space; the other half is where all the density of the probability function lies.

P(A|B') * 2X + P(B|A') * X = 0.5 * 2X + 0.5 * X = 3X/2, which is what we would expect by intuition: halfway between the two prizes.


The wiki page used to have the solution. Apparently there is some arguing going on about sources, but this explanation is correct: http://en.wikipedia.org/wiki/Talk:Two_envelopes_problem#Solu... Also, yanowitz in this HN thread has a good (equivalent) explanation.


And yet, that explanation isn't correct.


i've realized my math was wrong and removed my comment :(

i hate this problem


I like Raymond Smullyan's version even better: it removes all probability issues and is so simple you feel it must be easy to figure out.

The setup is the same: two envelopes, one with twice as much money as the other; you pick one, and before you open it you are offered the chance to switch. If you do switch you might either win money or lose money relative to what was in the envelope you chose first. Let's compare the possible gain to the possible loss (without worrying about the probability of a gain or a loss):

1. The possible gain is equal to the possible loss.

Proof: The two envelopes contain x and 2x, so you'd gain x if switching from x to 2x, and you'd lose x if switching from 2x to x. The possible gain and the possible loss are both equal to x.

2. The possible gain is larger than the possible loss.

Proof: Say the envelope you chose first has y. Then the other envelope has either y/2 or 2y. So if you gain money by switching you gain 2y - y = y, but if you lose in the switch, you lose y - y/2 = y/2. Clearly y > y/2.

This version kept me up at night for days when I first read it. I never figured it out --not that I had any chance of doing so, given that Smullyan couldn't figure it out either--, I just got used to not understanding it...


This doesn't eliminate probability, and it's really the same flaw, just with different wording. The flaw is still that the variable is undefined (or infinite probability) in one part of the word problem, but treated as defined in another. So, start with when you first chose the envelope which has y. At this point "y" is undefined, and can be the higher or lower amount. That's the key. Yet, you then continue to propose that by switching you gain 2y - y = y. That equation is not okay, because "y" could have been the higher amount, and the equation breaks down. Once you assert the equation holds true, you have thus defined the original value of "y" from that first envelope choice (i.e. "y" was lower). The same flaw surfaces by asserting your switch to lose equation is true, as well.

Edit: BTW, the reason #1 works okay is because you define upfront the variable value ("x") as being either the higher or lower of the two possibilities, before constructing your equation(s).


You mention this doesn't eliminate probability but don't explain why you think that might be the case. I just meant that the argument Smullyan presents does not involve probabilities, random variables, distributions, expected values, etc. (This can be verified by reading it again and checking those concepts do not appear. ;)).

I don't find your argument convincing either; maybe it helps to remove the "undefined variable" you complain about:

The first envelope chosen has some definite amount of money, let's say $10 (but the same argument can be adapted to any amount). Then the other envelope must have either $5 or $20. If it is $5, switching loses you $5. If it is $20, switching gains you $10.

Imagine actually doing this experiment with the two envelopes, opening the first and seeing $10. Wouldn't you say the other contains either $5 or $20?


It doesn't eliminate probability because it's the exact same envelope experiment. ;) In other words, there is still a 50% probability of gaining or losing. My post above isn't an argument, it's an explanation. The "undefined variable" is established by your own proposal, i.e., in #2 you first choose an envelope which has an amount we call "y". At that point "y" is undefined, because you specifically explain it could be either the higher or lower amount.

I think your confusion lies in not considering carefully enough the meaning of your wording. In other words, you're performing a play on words to get a specific result, akin to the "heads I win, tails you lose" word play. Let me try to explain further below:

"Imagine actually doing this experiment with the two envelopes, opening the first and seeing $10. Wouldn't you say the other contains either $5 or $20?"

No. The other envelope contains the amount the person who prepared the envelopes placed in it.

Try defining both values explicitly and this issue may become clearer. Imagine one envelope contains $10 and the other contains $5. Go through your above proposed equations using strictly those values. Remember, in a real experiment that would be completely valid.


  I just meant that the argument Smullyan presents does not
  involve probabilities, random variables, distributions,
  expected values, etc. (This can be verified by reading it
  again and checking those concepts do not appear. ;)).
The fact that he doesn't use those words does not mean he doesn't use those concepts. The Smullyan version is simply less precisely stated, and anyone solving it would first have to go to the trouble of converting the phrasing into the usual, well-defined mathematical terms. Solving it any other way is just handwaving.


Statement 1 isn't true. If you have $10 in the envelope, and it's the smaller envelope, you gain $10. But if it's the larger envelope you don't lose $10. The fallacy is that "x" refers to different things in each case.

Statement 2 is true but it isn't relevant, since you don't know the posterior probability given y. If the envelope contains the smaller amount of money then there is no way you can lose -- it's not a 50-50 chance.


That Wikipedia article is really bad. Here is a good article about the Two Envelopes Paradox, by Keith Devlin: http://www.maa.org/devlin/devlin_0708_04.html


While not exactly the same problem this is what cleared it up for me:

First pick a positive integer at random. Now pick a second integer at random. What is the probability that the second number is larger? The answer seems to be 1, because after the first number is chosen there are finitely many numbers smaller but infinitely many larger. But what if two people (A and B) choose two positive integers at random and don't tell anyone just yet? What is the probability that A>B? By symmetry we can see it should be 0.5. So A tells his number. Now the probability that A>B seems to jump to 0, because there are finitely many numbers smaller than A but infinitely many larger. But why should the probability change when we don't even know what A is, just that it is known?

The problem lies with the phrase "choose a positive integer at random". However you choose the number, you will be hopelessly biased toward 0, because every number you choose will be finite. Now think about the envelope problem, but add the condition that the largest amount any envelope can contain is X. Then if your envelope has N in it, there is a chance that 2N is larger than X. That cuts down on the chance that switching gets you more. Magically (because I'm too lazy to actually write it all out) it turns out it cuts it down just enough that the probability of getting more money by switching works out to .5
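
Here's a sketch of that bounded version (the $100 cap on the smaller amount is an arbitrary assumption):

  import random

  LIMIT = 100  # assumed cap on the smaller amount
  n = 1_000_000
  gain = 0
  switch_wins = 0

  for _ in range(n):
      small = random.randint(1, LIMIT)
      envelopes = [small, 2 * small]
      random.shuffle(envelopes)
      kept, other = envelopes
      gain += other - kept
      switch_wins += other > kept

  print(switch_wins / n)  # ~0.5: blind switching wins exactly half the time
  print(gain / n)         # ~0: and gains nothing in expectation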


"First pick a positive integer at random."

Wrong already. There exists no equiprobable probability distribution for integers, so you cannot just pick a positive integer at random, you need to define a distribution and it will definitely be biased - some ints will get chosen with larger probability than others.


"So A tells his number. Now the probability that A>B seems to jump to 0 because there are finitely many numbers smaller than A but infinitely many larger."

This is not correct. You said B chose a positive integer. Therefore, there are not infinitely many larger numbers B's choice can be.


Sure there are. No matter what finite positive integer you choose there are infinitely more finite positive integers greater than that number.


That's true, but what te_platt said was "two people (A and B) choose two positive integers at random"

If two people, A and B, have chosen -- that's "chosen", in the past tense -- two positive integers, then it's not the case that there are infinitely many positive integers that B's number could be. Why? Again, because B has already made a choice.


Actually, it doesn't matter. If you rephrase the problem so that B picks after A and you ask the question "is B likely to pick a larger number than A" before B picks then the answer should be "yes".

However, this is another example of being misled in reasoning. But not related to timing. The important revelation is understanding that there's no such thing as a method for generating truly random numbers across the entirety of the positive integers. Since there are infinitely many the probability of picking any one integer is 1/infinity. Any concrete random algorithm for positive integers will be hopelessly biased toward 0 (since it will almost certainly have a maximum value, and thus an infinite number of positive integers above that maximum). As long as both A and B use the same algorithm then the chances are equal that A or B will pick the larger number.


"If you rephrase the problem so that B picks after A and you ask the question 'is B likely to pick a larger number than A' before B picks then the answer should be 'yes'."

I agree, but that wasn't what was stated. You've "moved the goal posts" so to speak. ;) The OP said both A and B had chosen a number (but only one had revealed it). That introduces TWO finite values into the experiment. The timing for when B makes the choice makes all the difference.


First a random observation. People use "expected value" to evaluate risky decisions far too often. In many investment situations a more appropriate measure is "expected value of the log of my worth". (See http://elem.com/~btilly/kelly-criterion/ for an explanation of why.) That measure, in this problem, says that you should be indifferent about switching.

Admittedly that is coincidence - the investment reasoning for that rule has little to do with why that rule works for this problem.

But it turns out that there is an even crazier twist to this problem. Change the problem to say that you're allowed to look at the envelope you receive before deciding whether to switch.

Would you believe that there is an algorithm for deciding whether to switch that results in your getting the larger envelope more than 50% of the time? The catch is that those odds depend on the unknown amounts in the envelopes. But, no matter what is in the envelopes, you are guaranteed to get the larger one more than 50% of the time.

See http://www.perlmonks.org/?node_id=39366 for an explanation. Read it very carefully, because the problem is very subtle. Even a slight change in what is meant tends to render the problem ill defined and indeterminate. (And our intuition is very bad.)
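
The linked strategy boils down to comparing what you see against a randomized threshold. A minimal sketch (the exponential threshold and the dollar amounts are my assumptions, not the article's exact choices):

  import random

  def chance_of_ending_with_more(a, b, trials=200_000):
      # Envelopes hold a < b. Open one at random; keep it iff its value
      # beats a threshold drawn fresh from an exponential each round.
      wins = 0
      for _ in range(trials):
          seen, unseen = (a, b) if random.random() < 0.5 else (b, a)
          t = random.expovariate(0.01)  # mean-100 threshold, can land anywhere > 0
          final = seen if seen > t else unseen
          wins += final == b
      return wins / trials

  print(chance_of_ending_with_more(10, 20))       # noticeably above 0.5
  print(chance_of_ending_with_more(5000, 10000))  # still above 0.5, but barely

The win probability is 1/2 plus half the chance that the threshold lands between the two amounts, which is positive for every pair but can be arbitrarily small.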


That blog post is wrong. If N is the difference between the values in the two envelopes, then the chance that you will guess a value in this range is N/∞ which is 0.


Not so. The only stipulation was that you must be able to guess a number in any range (which is not possible in practice). It doesn't matter that the probability of picking any particular number is 0, as long as the probability of picking one between the two amounts in the envelopes isn't 0. Any time your guess hits the correct range, you win. Any other time, it depends on whether you got the big envelope or the small one.

So knowing just one endpoint of a range is, oddly, evidence as to which way the other endpoint lies - as long as it might be either one. You just take a guess that depends on the number you know in such a way that the smaller the number you see, the more likely you are to guess it's the smaller of the two, but are never certain. Presto, guaranteed you'll get more than 50% right, as long as you get fed both the small and the large envelopes in equal proportion.

If you can't be sure the other guy is playing fair and might favor giving you the low envelope, you need to look at something like the Monty Hall problem.


You did not read carefully enough.

If N is the difference between the values in the two envelopes, then the chance that you will guess a value in this range depends on the actual values. If you average over all possible intervals it averages out to 0, but for every single interval the odds are greater than zero.

The non-intuitive (yet mathematically straightforward) fact that you can average an infinite set of positive numbers and get 0 is part of why our intuition is so badly misled in this case.


I think the problem is that they effectively change a problem that has 2 possibilities (either you get $X or you get 2 * $X) into a problem that has 3 possibilities, A/2 (one possible amount in the other envelope), A (the amount in your envelope), and 2A (the other possible amount in the other envelope), by fixing A relative to the hypothetical amount you could have in your envelope.


Yeah - it's not really (0.5A + 2A)/2, it's (0.5A1 + 2A2)/2, where A1=2*X (as you have above) and A2=X.

Substituting, your expected value upon changing is 3X/2, which is, surprise surprise, the same as the expected value for the envelope you already have.


The flaw in the argument for switching comes in its second line:

"1. I denote by A the amount in my selected envelope.

2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2."

If we're gonna be taking expected values we need to assume that the monetary amounts in the two envelopes are generated from a probability distribution on the nonnegative reals. Then the probability distribution for the amount of money in the smaller envelope is going to be some kind of curve (call it S), and obviously the probability distribution for the amount of money in the bigger envelope is going to be the same curve only "stretched out" and "squashed" by a factor of two (we'll call it B).

Now if you say "my envelope contains exactly A dollars," you can tell what the probability is that your envelope is the smaller envelope by comparing the relative heights of the two curves at the value A; let's denote these heights by S(A) and B(A). It is certainly possible that the two heights are the same, in which case it's fifty-fifty that you have the bigger or smaller envelope, the rest of the logic holds, and you should indeed switch, getting an expected return of 5/4 * A.

But it is obviously impossible that in general the two heights are the same for any given A, because then the two probability distributions are the same -- and that can't be true, because B is a squashed, stretched-out version of S, and for a variety of fairly obvious reasons you can't squash and stretch out a finite curve on the positive reals and get the same curve (unless that curve is 0 everywhere). And so we can't conclude that in general you should switch, which is good because if you're not allowed to look at the money before you switch it obviously doesn't matter whether you do or not (unless the people running the game are messing with you).
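
For a concrete instance of those two curves, here's a sketch assuming the smaller amount is exponentially distributed (my choice of prior; nothing in the problem implies it):

  import math

  lam = 0.01  # assumed rate: the smaller amount averages $100

  def s(a):
      # Density of the smaller amount, Exponential(lam).
      return lam * math.exp(-lam * a)

  def b(a):
      # Density of the bigger amount = 2 * smaller: s "stretched and squashed".
      return (lam / 2) * math.exp(-lam * a / 2)

  def p_smaller(a):
      # P(the envelope I opened is the smaller one | it contains a).
      return s(a) / (s(a) + b(a))

  crossover = 2 * math.log(2) / lam  # where the two densities intersect
  for a in (10, crossover, 500):
      print(round(a), round(p_smaller(a), 3))
  # Below the crossover you are probably holding the smaller envelope,
  # above it probably the bigger one. It is never fifty-fifty everywhere.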


There's a similar problem with three closed doors, one of which has a prize behind it. You are asked to choose a door, and then shown one of the two remaining doors which does NOT have the prize. The question is: should you switch doors, given the opportunity? What is the probability of getting the prize if you switch, and if you don't?

To illustrate, let A, B, C be doors, and door C be the door with the prize. You choose door A; the host tells you the prize is not in door B. What is the probability you will get the prize if you switch your door?


No, this problem is different, and has a different solution. In your problem it makes sense to switch when given the choice, whereas in this one it does not. With the three doors you're given additional information once you choose a door, which allows you to choose a door with 1/2 probability of being right, rather than the 1/3 chance you started out with. In this problem you are given no additional information, so the probability of choosing the right envelope is the same when given a choice to switch as it is at the beginning (i.e., 1/2).


Actually, your probability of choosing the right door is 2/3 if you switch. I agree though that this problem is not the same, it just reminded me of it -- I guess you could say it's a similar class of problems.


It's similar in that it's easy to fool people into thinking about the problem incorrectly and coming up with the wrong conclusion.

People are very touchy about the monty hall problem and will refuse to believe that it's better to switch, even when confronted with numerical simulations of the problem.

As for this problem, it's more of a riddle than a paradox, the reasoning is erroneous but you may not realize that if you're not careful.

It's more similar to the 1 = 0 riddle or the 3 guys paying $10 each for rent riddle, tricking you into making a mathematical error that isn't immediately obvious.
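
For what it's worth, the numerical simulation the refusers won't accept takes only a few lines (a sketch):

  import random

  def monty(switch, trials=100_000):
      wins = 0
      for _ in range(trials):
          prize = random.randrange(3)
          pick = random.randrange(3)
          # The host opens a door that is neither your pick nor the prize.
          opened = next(d for d in range(3) if d != pick and d != prize)
          if switch:
              pick = next(d for d in range(3) if d != pick and d != opened)
          wins += pick == prize
      return wins / trials

  print(monty(switch=False))  # ~1/3
  print(monty(switch=True))   # ~2/3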


The problem with this is very subtle.

The problem is the claim that it is equally likely that the other envelope contains half, or double, the money of the envelope we have chosen.

It is impossible to choose a value from an infinite series (the list of possible monies), with every value having an equal opportunity of being chosen.

Therefore the probability distribution of the money must be non-uniform in some way, and the argument falls to pieces.

It is not reasonable to say "Well, I don't know the probability, so I'll treat it as 50/50".


No, that's not the problem. You don't know how much money is in either envelope, but it needn't have been selected randomly. The only randomness is whether the envelope you choose has the greater or the lesser value, which is just a uniform distribution on two points.

That being said,

>It is impossible to choose a value from an infinite series (the list of possible monies), with every value having an equal opportunity of being chosen.

this is also incorrect. You're right to say that you can't have a uniform distribution over the possible values of money, but that's not just because it's infinite. You can have a uniform distribution over [0,1] for example, which is larger than the set of possible money values.


>You don't know how much money is in either envelope, but it needn't have been selected randomly.

This is exactly wrong, at least in the mathematical formalization of this problem that everyone is assuming. The definition of "don't know how much money" must be taken to be "has a uniform distribution".

If you want to use some reasoning that treats the amount of money as an "unknown in an equation" then you are instead giving a solution that differs for each possible value of the money and does not capture the random choices we are trying to model.

By the way, no one said that uniform distributions on infinite sets don't exist. He said "infinite series (the list of possible monies)", a slightly mis-used term that clearly excludes [0,1], your chosen example.


This is precisely the correct answer — and it's disappointing to see a less good answer getting so many votes when this one hasn't had any.


The Hacker News community has a strange and sad relationship with actual math, that is, math as mathematicians define it. Math formalizable in ZFC and not arguments of A/B test effectiveness and VC funding.


The swapping indefinitely does not work. For the first iteration, if I have A in my envelope, then yes, the other envelope will either have 1/2A or 2A, meaning the expected value is indeed 5/4A. The switching argument wrongly assumes this is again the case after I switch. After I have switched, the value of the original envelope is unchanged (meaning it is A with probability 1), so I should not switch back.


Are you seriously arguing that you should switch exactly once?


I believe there's some problem with assuming that, given an envelope containing X dollars, the other envelope has a 50% chance of containing 2X and a 50% chance of containing X/2. This step is clearly what leads us into the paradox.

Suppose you open the first envelope, find $100, and are given the option to switch. If there were truly a 50/50 chance of the other envelope containing $200 or $50, we could easily model this problem a million times and find that switching averages $125.

The problem here is that the numbers have not been pre-determined in the start. If a naive player repeatedly played a game where the numbers are guaranteed to be 100 and 200, then the always switching strategy would be exactly the same as the always staying strategy.

In this case, let the difference between the two envelopes be x. When you switch, you have a 50% chance of losing x and a 50% chance of gaining x. There is really no problem if you define the two envelopes as differing by a fixed number, rather than one being a multiple of the other. When the two numbers have fixed, predetermined values, the two problems are the same.


Interesting... here's my "gut reaction" as a non-statistician. The problem is that 'A' has a probabilistic component as well. If you let X be the lesser of the two dollar amounts and let A = (1/2) * X + (1/2) * 2X, then you come out with equal expected values for switching and not switching.


I believe the root of the paradox is that there exists no distribution of two variables A, B for which:

a) (A, B) always fulfills A = 2B or A = 0.5B, and

b) for any value of A, the conditional probability that B = 2A equals the conditional probability that B = 0.5A.


A friend of mine told me a similar problem that he got at a D.E. Shaw interview.

Person A writes random real numbers in two envelopes. Person B picks one envelope at random (50/50) and sees the number X written there. Then B has to guess whether X is the bigger of the two numbers or not. Show that B has a strategy such that, even if A knows the strategy, B guesses correctly in more than 50% of cases.

The mathematical beauty of this problem is that you can derive what are the non-trivial steps towards the solution.
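
One solution shape, for the curious: it's the same randomized-threshold trick as the envelope strategy mentioned elsewhere in this thread. A sketch (the normal threshold and its spread are my choices):

  import random

  def says_bigger(x, spread=10.0):
      # Guess "x is the bigger number" iff x beats a random threshold.
      # A normal threshold has positive density on all of R, so any pair
      # A writes down has some chance of being straddled by it.
      return x > random.normalvariate(0.0, spread)

  def win_rate(a, b, trials=500_000):
      # a < b are the two numbers A wrote; B opens one at random.
      wins = 0
      for _ in range(trials):
          x, x_is_bigger = (a, False) if random.random() < 0.5 else (b, True)
          wins += says_bigger(x) == x_is_bigger
      return wins / trials

  print(win_rate(-3.0, 7.0))  # 1/2 + P(threshold in (-3, 7))/2, i.e. > 1/2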


For me, what is interesting in this problem (besides the obvious flaw that people here have discussed) is that the answer changes if the value A is observed. I.e.:

If there are two envelopes, one of which has twice the money of the other, and you pick an envelope AND SEE the money inside, then it becomes better to switch (in expected money).


I found this paradox so compelling that I blogged about it a while ago. I tried to tease out the "intuitive" explanation without resorting to much statistics. http://techiferous.com/2010/06/the-two-envelope-paradox/


Because you're negating a random choice and it's not like Monty Hall where a bad choice is removed for you.


I didn't get that Monty Hall problem for a while after seeing it, but then it clicked and I got it. It's really beautiful to think about:

http://en.wikipedia.org/wiki/Monty_Hall_problem


You denote two different things by A, and this is not correct. Suppose you have a bag with two animals, a dog and a fox. You call one animal A ten times, and then the other animal A ten times, so you have 20 A's. The question is: do you have twenty dogs or twenty foxes?


Ok, I'm not a huge math whiz, and after reading this article twice I'm still at a loss as to what the answer is. So after we have the nice breakdown in the form of a mathematical proof, should we keep switching envelopes forever? I'm the kind of guy who isn't as interested in the joy of solving the problem. I just want the money!

If I were to break this down into layman's terms, would it be appropriate to say the following?

"There's equal odds whether you keep the envelope or exchange it for the other one. There's no difference either way, so flip a coin and decide."


The example seems to miss something critical. If the dollar amount of the first envelope is odd, then it can't be the doubled sum...


Key takeaway:

In the words of [David] Chalmers this is "just another example of a familiar phenomenon, the strange behaviour of infinity".


  for all $X > p, (price to play)

  myLifeQuality(p,X) > myLifeQuality(X,2X) > myLifeQuality(X,X/2)
easy


I got that in a tech interview and hadn't seen it before. Dirty.


In what context was the interview (i.e., what type of position)?

I'm all for logical problem solving questions in interviews, even ones that can't be solved but are interesting to see how people tackle them.

However, this one doesn't seem to lend itself much to programming or true problem solving - just probability theory.

In other words, I'm not sure what useful info I'd get out of a candidate by asking them to work on this (other than their understanding of probability, which is not that useful for most programming jobs)


And that's Numberwang!


can't eat your cake and have it too


I think this is better posed as two randomly drawn sets of numbers for the first drawing of an as-yet-unplayed lottery. Then there's no value in considering extraneous data such as the mass of the envelopes, looking through them, or other potential evidence.

Instead, you're left with two identical chances. Since there is no data to suggest that either has an advantage over the other, you have a probability of exactly 0.5 that you have the better chance of winning.

On the question of the switch, it's important to consider that you have no new data about the one you picked, so the odds are _exactly the same_ as they were at the beginning: there is no conditionality; nothing has changed since you picked it.

The probability then of the winning ticket is still 0.5. This is where the infinite probability problem kicks in.

I don't think this is an actual problem needing to be solved, unless we can find a way to mathematically express the probability modifiers of our gut feeling, which is impossible, as it comes down to our own personal experiences.

Still, Gladwell might have something to say about it :)


Trouble is, we are making it a mathematical problem. It need not be. The same logic applies to human [as opposed to mathematical] problems like: 1. Don't you wish your girlfriend was hot like me? 2. How much money is in the first envelope to begin with? Whatever there is, I can put it to better use in my business than some class-8 formula. 3. How many birds are there in the bush? 4. What if...

All right. Comments notwithstanding. Seriously - how much money?


This was a seriously crass comment. What was I thinking? My apologies for trivializing the original inquiry of the thread.


The 'paradox' that is presented leads back to the right conclusion: there is no way to tell which envelope is better, so just pick one at random.

These are non-causal random events, and the probability is always 50/50.

There is a twist to this problem, which is more interesting, where you open the first envelope, then decide whether or not to switch to the second envelope.

Believe it or not, there is a trick to beating 50% in this variant.

I'm too tired to try to fully explain it, but the conclusion comes from the fact that once you look at an envelope, you gain information that can be used in deciding whether to switch. It becomes a causal system that does not follow a random probability density function.

But think: if the envelope you opened had $7 million in it, you probably wouldn't and shouldn't switch, and in scenarios like this you'd probably be able to skew the odds in your favor.



