Two envelopes problem (wikipedia.org)
42 points by tchajed on Sept 14, 2013 | 88 comments

You can even explicitly define the distribution as P($2^i) = 1/(2^i) and the 'paradox' remains. The problem is that you are reasoning about the expected gain of swapping, but that expectation is a sum over a series which is not absolutely convergent (http://en.wikipedia.org/wiki/Absolute_convergence), so its value depends on the order in which you sum over the different cases.

In fact, if you repeatedly simulate the problem (make sure you use bignums) you will find that the mean gain from swapping is not well behaved at all and refuses to converge. The law of large numbers only applies if the expectation is well-defined.

The lesson here is: when in doubt, explicitly write out the probability space over which you are working. Problems like this and the Monty Hall problem are trivially solvable on paper. There is a reason that mathematicians get all hot and bothered about formalism - it gives you a solid base from which to build correct intuitions.

EDIT Let's write this down properly.

    i | gain from swapping if I have the smallest | gain from swapping if I have the largest
    1 | $2                                        | -$2
    2 | $4                                        | -$4
    3 | $8                                        | -$8
There are two arguments.

The first is that the situation is symmetric so you can't possibly gain. That is:

    E(Gain) = (1/4 * $2 + 1/4 * -$2) + (1/8 * $4 + 1/8 * -$4) ...
            = $0 + $0 + $0 + $0 ...
            = $0
The second argument is that swapping from small to large is a bigger gain than the loss of swapping from large to small. That is:

    E(Gain) = (1/4 * $2) + (1/4 * -$2 + 1/8 * $4) + (1/8 * -$4 + 1/16 * $8) ...
            = $0.5 + $0 + $0 + $0 ...
            = $0.5
Adding up an infinite series is tricky :)

Well, yeah :) The problem is still in assuming something about the distribution without any good reason to. In the original "paradox" reasoning it was an impossible uniform distribution. What you show is that even assuming a possible distribution can still lead nowhere. It still doesn't mean there is any reason to assume it's P($2^i) = 1/(2^i). I might just as well assume the amounts are exactly $100 and $200 and that switching from $100 gives me a guaranteed payoff. That would be as baseless as assuming a uniform distribution or the one you gave.

I think the main lesson from this puzzle is that you can't assume stuff without a good reason to.

I'm not assuming that that is the distribution. I'm saying that the problem as presented is underspecified, but even if you give this specific distribution in the problem you can still cause confusion. Assuming stuff is bad, but the original 'paradox' still doesn't go away if you nail everything down. It's a useful problem for educating people about the subtleties of infinite sums.

Well, there are distributions which don't have a well-defined expected value, and I agree that that's a valuable lesson. I would argue, though, that this is not the best example to show it. The point of the two envelopes problem comes much earlier: specifying the distribution you gave would make it a completely different problem, and assuming it in the original form would already be a mistake.

I think there are better examples to show how infinite sums, and relying on expected values based on them, might lead to problems. Like this one for example:


The expected value in the St Petersburg game is actually well-defined - it converges to positive infinity. That is a subtly different result than not converging at all. Both problems are useful.

This is either way over my head or not at all clear. More explanation would be appreciated.

Sorry, I rushed it. Did the edit make it clearer?

No, quite the opposite unfortunately :(

I guess I skipped a lot of background knowledge :)

The quantity everyone is arguing about is 'the expected gain from swapping envelopes' (http://en.wikipedia.org/wiki/Expected_value). Informally, you might say 'the average gain from swapping envelopes'. The way you work this out is you take all the possible things that could happen and for each one multiply the probability of it happening by the amount you gain if it happens. For example, suppose we play a game where I toss a coin. If it's heads I give you $2 and if it's tails you give me $1. Your expected gain from playing this game is:

    E(Gain) = P(Heads)*Gain(Heads) + P(Tails)*Gain(Tails)
            = 1/2 * $2 + 1/2 * -$1
            = $1 - $0.5
            = $0.5
If you play the game repeatedly, your average gain will converge to the expected gain, ie if you played the game a million times you would win pretty close to half a million dollars. That's called the law of large numbers (http://en.wikipedia.org/wiki/Law_of_large_numbers). That's why we care about expectation - it tells us what would happen on average over large numbers of repeated trials. It's also very simple to calculate and quite intuitive to reason about.
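That convergence is easy to check with a quick simulation (a sketch in JavaScript, runnable in Node; the payoffs are the coin game's +$2/-$1):

```javascript
// Simulate the coin game: heads pays +$2, tails costs $1.
// By the law of large numbers the mean gain settles near the
// expected gain of $0.5 as the number of plays grows.
function meanGain(plays) {
  var total = 0;
  for (var i = 0; i < plays; i++) {
    total += (Math.random() < 0.5) ? 2 : -1;
  }
  return total / plays;
}

var mean = meanGain(1e6);
console.log(mean); // very close to 0.5
```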

In the case of the two envelopes the problem comes from the fact that there are an infinite number of possible amounts in the envelope. It turns out that adding up an infinite series of numbers doesn't always behave the way you would expect eg

     1 + (-1  +  1) + (-1  +  1) ... = 1 + 0 + 0 ... = 1
    (1 +  -1) + (1  +  -1) + (1 ...  = 0 + 0 + 0 ... = 0
In the example above we are adding up the same numbers in both cases, but depending on how we group them we get different answers. The same thing is happening in the envelope problem. The different arguments in this thread are just different ways of adding up all the possible cases and they get different results. The actual problem here is that the theory of expectation only applies when the sum is well behaved (http://en.wikipedia.org/wiki/Absolute_convergence).

Because the expectation is not well-defined for the envelope problem the law of large numbers does not apply either. If you play the game millions and millions of times your average win per game will not settle down but will keep jumping around forever.
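You can watch that misbehaviour directly (a sketch using the P($2^i) = 1/(2^i) distribution from earlier in the thread; plain doubles are fine at this sample size because huge values of i are vanishingly rare):

```javascript
// Draw a pair ($2^i, $2^(i+1)) with probability 1/2^i, hand out one
// envelope at random, and track the running mean gain from swapping.
// Unlike the coin game, this mean never settles down: occasional huge
// pairs keep yanking it around, because E(gain) is not well-defined.
function runningMeans(samples) {
  var total = 0, means = [];
  for (var n = 1; n <= samples; n++) {
    var i = 1;
    while (Math.random() < 0.5) i++;       // geometric: P(i) = 1/2^i
    var small = Math.pow(2, i);
    // gain is +small if we held the smaller envelope, -small otherwise
    total += (Math.random() < 0.5) ? small : -small;
    means.push(total / n);
  }
  return means;
}

var means = runningMeans(1e5);
console.log(means[means.length - 1]); // different on every run; does not converge
```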

Like the Monty Hall problem (http://en.wikipedia.org/wiki/Monty_Hall_problem), it's interesting because even expert mathematicians often get the wrong answer if they don't carefully work it out step by step. It shows the value of having a formal system of probability to back up intuition.

If you want to learn more about this sort of thing there is an excellent textbook which teaches probability theory using randomised algorithms (eg uniform hashing, load balancing, queueing theory etc).


It's one of my favourite textbooks and it's full of powerful methods and intuitions for any programmer.

NINJA EDIT PLUG: If anyone wants to work through that book I would be more than happy to help out and answer questions (jamie@scattered-thoughts.net). I wouldn't mind a refresher myself and I've always found that explaining things to other people helps clarify my own thinking.

Sorry but right after you finish the basic probability coverage it stops making sense.

-> In the case of the two envelopes the problem comes from the fact that there are an infinite number of possible amounts in the envelope.

Right there.

Well, with the distribution I gave it could be $2 (with probability 1/2), or $4 (with probability 1/4), or $8 (with probability 1/8) etc. That distribution doesn't have a maximum.

For a simpler example, imagine tossing a coin and counting the number of tosses it takes you to get a head. There is no limit on the number of tosses it might take, it just gets less and less likely as the numbers get bigger. We could get any positive integer and there are an infinite number of integers.
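That coin example is easy to sanity-check in code (a sketch; the support is unbounded, but the mean number of tosses is still a perfectly finite 2):

```javascript
// Count tosses until the first head. Any positive integer can occur --
// the support is unbounded -- but the expected count is finite (2).
function tossesUntilHead() {
  var tosses = 1;
  while (Math.random() < 0.5) tosses++;
  return tosses;
}

var trials = 1e6, sum = 0, max = 0;
for (var t = 0; t < trials; t++) {
  var n = tossesUntilHead();
  sum += n;
  if (n > max) max = n;
}
console.log(sum / trials); // close to 2
console.log(max);          // occasionally surprisingly large
```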

Similarly, the envelope problem has an infinite number of possibilities because there is no maximum amount that could be in the envelope. The article doesn't specify the exact odds of any given amount showing up in the envelope so I gave some specific odds that demonstrate the problem.

Probability theory still works when you have an infinite number of possibilities but it has a few subtleties that aren't commonly taught until undergrad. One of those is that the expected/average value of a random variable doesn't always exist so you have to be careful when reasoning informally.

This is one of my favorite problems. Here is one variation that can help clarify thinking about it:

"There's two envelopes. One contains twice as much money as the other. No envelope contains more than $N."

This changes the problem dramatically. If your envelope contains more than $N/2, of course you should not switch: the other envelope necessarily contains less. If it contains $N/2 or less, perhaps you should switch: the other envelope may or may not contain more.

Say your envelope contains $x, and you don't know what $N is. There's two possibilities:

1. $x <= $N/2. If you switch, your expectation value is $3x/2, by the original argument.

2. $x > $N/2. If you switch, you will get $x/2, since no envelope contains more than $N.

If we assume the distribution is uniform on the range [0, N], then these possibilities are equally likely. Therefore the total expectation value is the average of the EVs of the two possibilities (3x/2 and x/2), which is $x. So we recovered the naive expectation of "it doesn't matter" from this variation. Now we can take the limit as $N goes to infinity, and while the EV of $x approaches infinity, the fact that switching does not matter does not change.

As others have said, the underlying problem is the assumption that a uniform probability on an infinite set makes sense, which it does not. However, we can instead take the limit for finite sets, in which case we recover the intuitive result that switching does not matter.
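A quick check of the bounded variant (a sketch assuming the smaller amount is drawn uniformly from [0, N/2], so that no envelope exceeds $N):

```javascript
// Bounded variant: the smaller amount s is uniform on [0, N/2], the
// envelopes hold s and 2s, and we receive one of them at random.
// The mean gain from always switching hovers around zero.
function meanSwitchGain(N, trials) {
  var total = 0;
  for (var i = 0; i < trials; i++) {
    var s = Math.random() * (N / 2);
    // holding the smaller envelope, switching gains +s; otherwise -s
    total += (Math.random() < 0.5) ? s : -s;
  }
  return total / trials;
}

var g = meanSwitchGain(100, 1e6);
console.log(g); // close to 0
```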

Exchanging limits and expectations is not always valid (eg http://www.stanford.edu/class/msande322/docs/app_B.pdf). In this case the expected gain is not convergent in the limit. See https://news.ycombinator.com/item?id=6387344

This is one of my favourite problems too :)

That is a very long article based on a flawed argument.

Given no other information, if someone gave you 2 envelopes and told you one has $40 and the other $20, common sense dictates: choose one randomly and walk away - with no other information it is illogical to reason any other way.

The chance you choose the lower value is 1/2.

Now, if you are allowed to look inside the envelope (which gets introduced further down) then it becomes a different game.

Get $20...well by swapping you may get $10 or $40...you should probably swap.

Get $2000...well by swapping you may get $1000 or $4000...you should probably swap.

I think this works all the way up...someone with a bit more background on game theory may be able to formalise it, but the realisation that swapping forever leads to $0 nullifies this "paradox"

Yes, it's not a paradox, it's just seductive flawed reasoning. At any point the EV of picking an envelope at random is 3n/4 (n being the higher of the two amounts). That is all there is to it. The "paradox" is introduced by the silent assumption that the distribution of amounts put in the envelopes is uniform, which is impossible (because you can't pick numbers from an infinite set uniformly, even if there were an infinite amount of money at the "adversary's" disposal). The assumption is then used for conditional probability calculations: "if we see $10 there is a 50% chance the other envelope contains $20" - BEEP, ERROR, THINK AGAIN.

Perhaps a good exercise in clear thinking, but not really a paradox. A good analogy is this: "If we pick a random building and climb to its roof, there is a 50% chance the first building we see is higher than the one we just climbed." This is obviously true. Now, following the "paradoxical" reasoning, we get: "If we climb a random building and see it's the Empire State Building, there is still a 50% chance the first building we see will be higher."

This is an exact analogy to the reasoning about the two envelopes problem which is supposed to lead to a paradox.

You can explicitly state the distribution and still run into the same problem: https://news.ycombinator.com/item?id=6387344 .

The underlying problem is basically that probability theory in non-finite spaces has some gotchas - one of which is that the expectation of a random variable does not always exist.

Interesting point and a nice read. Still, the problem is in the assumption about the underlying distribution of amounts in the envelopes (in the original case, an impossible uniform distribution). The reasoning is based on this assumption and leads to nonsense. What you are saying (I think) is that assuming some other distribution (a possible one, instead of an impossible one) could still lead to nonsense, or not lead anywhere at all.

Not so much that it leads to nonsense as that naively applying expectations doesn't always work. This is a contrived example, but it's not uncommon in eg random walk theory to hit upon cases like this where the expectation does not exist at all.

People commonly think of mathematics as being purely about formal proof but the reality is an interplay between proof and intuition. Usually when a mathematician encounters a problem in a familiar area they immediately know the answer by intuition which then guides the production of a correct proof. When you first enter a new area of mathematics your intuitions are all completely wrong and you have no idea where to start with a proof. Good teachers will introduce edge cases like this problem to refine your intuition until it is useful enough to be a guide.


Code that shows the EV stays at 3x/2 (if x is the lower amount) or 3n/4 (if n is the higher amount):
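A minimal sketch of such a simulation, assuming the two envelopes hold x and 2x:

```javascript
// Pick one of two envelopes (holding x and 2x) at random many times.
// The mean payoff sits at 3x/2 -- equivalently 3n/4 for the higher
// amount n = 2x -- and no switching strategy changes that.
function meanPick(x, trials) {
  var total = 0;
  for (var i = 0; i < trials; i++) {
    total += (Math.random() < 0.5) ? x : 2 * x;
  }
  return total / trials;
}

var m = meanPick(10, 1e6);
console.log(m); // close to 15, ie 3x/2 for x = 10
```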


Common sense in the Monty Hall problem says just pick a door and stick with it. The chance you chose the goat is 1/2, right?

No. Common sense is often wrong.


Except that in the Monty Hall problem something has changed after the initial choice: one of the previous options is proven to be a goat. No such change occurs with the two envelopes, no new information is made available, thus no basis for switching.

There is additional information in the Monty Hall problem: the host never opens the door with a car. In the two envelope's problem, the "host" merely restates the question no matter which envelope was chosen.

Folks, it's an analogy. An analogy compares two different things... trust me, I know they're different and I know in which ways.

All I was saying is common sense doesn't get you far in the Monty Hall problem. It really doesn't:

>[Vos Savant] received thousands of letters from her readers; 92% of the general public, 65% of universities, and many with PhDs, were against her answer.

Thus, it's kind of silly to say we're wasting time by going past a common sense analysis of something. It's also revisionist to say common sense does help you solve Monty Hall.

Sorry for misunderstanding you. I guess I lacked a bit of common sense...

The Monty Hall problem can be restated in a way that makes it amenable to common sense: Door A, or (Door B and Door C).

Or, restate the problem with 100 doors.

I'm not sure every problem like this one can be reduced to a common sense analog, but I'm suspicious of problems that can't be.

Suppose you restate it. You own a $100 stock. It has a 50% chance of doubling, and a 50% chance of going down 50%.

Should you sell it, or hold onto it? Expected value of holding is ($200 + $50)/2 = $125 . So it seems like you should hold on!

If you repeat that wager indefinitely, the standard deviation of the net wins (number of ups-downs) goes up according to a square root law.

And the value of your stock goes up exponentially as the number of net wins goes up.

So over time the EV goes to infinity like (2 ^ (n/2)). The EV of say the -1SD outcome goes to zero. The EV of the +1SD outcome goes to 1/that. So the average EV overall goes to infinity.

And yet the expected value of the growth rate is 0. For every 16x win there's a loss down to 1/16th. But the average of those 2 outcomes diverges to infinity.

All this to say, when you're looking at exponential returns (or other processes), you need to measure growth rates, not average outcomes. And it has nothing to do with log utility.

Javascript simulation (might freeze your browser so would be a good idea to run it in Node):

    var t = 100;
    for (var i = 0; i < 1e6; i++) {
      // each step the stock doubles or halves with equal probability
      t = (Math.random() > 0.5) ? t * 2 : t / 2;
      console.log(t);
    }
It's interesting how quickly the return goes from extremely small to extremely big amounts.

yeah. Now I wonder what happens when you have $10,000, and always put 1% of your portfolio in that stock? I think that turns it into a positive growth expectation. You do it 10 times and outcomes are 50/50, 5 times you made about $100, 5 times you lost about $50.

On the other hand, if you bet your whole stack each time, you doubled up 5 times and lost half your stack five times, you're even.

if you bet your whole portfolio every time, I think your long run growth rate is 0. if you bet a small amount each time, I think your growth rate is positive.

The Kelly Criterion or gambler's curse in action. In the first case you're taking a positive EV bet and turning it into a long run no-growth situation by overbetting.
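The difference between the two strategies can be made concrete with expected log growth rates (a sketch; `expectedLogGrowth` is a hypothetical helper, and f is the fraction of the portfolio staked on the 50/50 double-or-halve bet):

```javascript
// Expected log growth per round when betting a fraction f of the
// portfolio on a 50/50 double-or-halve outcome:
//   win:  wealth -> wealth * (1 + f)      (the staked f doubles)
//   lose: wealth -> wealth * (1 - f / 2)  (the staked f halves)
function expectedLogGrowth(f) {
  return 0.5 * Math.log(1 + f) + 0.5 * Math.log(1 - f / 2);
}

console.log(expectedLogGrowth(1));    // betting the whole stack: zero growth
console.log(expectedLogGrowth(0.01)); // betting 1%: small positive growth
```

Maximising this expression over f gives the Kelly fraction, which for this particular bet works out to f = 1/2; betting everything (f = 1) is exactly the zero-growth case described above.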

The way I see it the flaw is in the first sentence of the "example":

> Assume the amount in my selected envelope is $20.

You can't just pull that assumption out of your ass. You can only assume what the problem states, which is that the values in the envelopes are X and 2X and you had a 50% chance of choosing either.

If you run the math without adding assumptions, it works out that swapping makes no difference, statistically.

Though it is a very clever trick :).

Taking out the specific dollar amount doesn't change the math at all. If X designates the amount in the envelope I've selected, then there's a 50% chance the other envelope contains .5 * X and a 50% chance it contains 2 * X, so the expected value of the other envelope is .5 * .5 * X + .5 * 2 * X = 1.25 * X, which is greater than X.

But you are implicitly turning the scenario into one with three values, 1/2X, X, and 2X, so something went wrong with what you're doing. There was only ever a universe of two values.

Sure, these are variables, so they correspond to different possibilities. X can be anything, and given X, the amount in the other envelope is one of two different possible values - 1/2X OR 2X.

So there are far more than three possible values, but only two actual values.

Exactly, that's why your model is broken.

I think the key assumption is that there is a 50% chance of getting double and a 50% chance of getting half in the swap scenario. Once you pick an envelope initially, the chance disappears; you can't parlay the chance into the new context.

> If you run the math without adding assumptions, it works out that swapping makes no difference, statistically.

I beg to differ :)


All the probability math therein, for a problem whose solution is highly intuitive (if you swap, you'd be just as inclined to swap envelopes indefinitely, is all you need to realize), reminds me of this quote:

"The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift." - Albert Einstein

Shameless plug for a blog I like that has little math, yet uses much intuition to solve five of the biggest outstanding problems of physics: http://finbot.wordpress.com

Theoretical resolutions of "intuitively obvious" paradoxes such as these are important.

If we cannot find a theoretical resolution, it can indicate a flaw in our theories, and fixing that will likely provide a more accurate set of theories.

The problems around the speed of light gave rise to Einstein's theories of relativity. Another of those flaws ("the set of all sets that are not members of themselves") gave rise to modern set theory and formal logic.

Still a nice quote from Einstein though :-)

Terence Tao has some interesting writing on the interplay between rigour and intuition:


Nevertheless, this problem is interesting because people tend to trip up on the maths. The actual behaviour, even in simulation, is very unintuitive (the mean gain from swapping envelopes does not converge to zero over time, and in fact does not converge to anything at all but oscillates wildly). It's a good introduction to the subtleties of probability theory in non-finite spaces.

I don't see the paradox... If you have 2x and swap, you lose x. If you have x and swap, you gain x. 0.5(-x) + 0.5(x) = 0, so you should be indifferent to swapping.

Am I missing something? Not to say I don't make mistakes, but I have a BS in mathematics, so maybe this is only obvious to people with a background in math?

EDIT: No need for dollar values.

I don't have a BS in maths, but the important bit, as I read it, is that you don't know the values involved. I.e. if you have 20, there is either 10 in the other envelope, or 40. You have no way of knowing which is the case, so it's in your interest to swap since the benefits outweigh the risks.

This is totally counterintuitive, though, so I'm fully willing to accept I'm missing something! And I don't buy the 'indefinite swapping' argument, since the second swap must surely reverse any advantage gained.

Once you have the second envelope in hand, since you don't actually know what's in it, the exact same expected-value argument applies to switching back: there might be $10 or $40 in the other (first) envelope, "so it's in your interest to swap since the benefits outweigh the risk".

While the second swap would reverse any advantage possibly gained, it would also reverse any possible harm sustained. You don't know. The situation is perfectly symmetric; picking and switching is the same as just picking the second one at first. So just pick one.

Env A has $20 (we have it and know). Env B has $10 or $40 (equal chance).

We swap. Now we hold Env B, which has $10 or $40 in it. We know the other envelope (Env A) is $20. Why would we switch again?

The original statement of the problem says you're offered the choice before you open the envelope. I guess I was sloppy about the $20 assumption too. As for the case you do know, I still feel wrong about the conclusion that you should switch, but I can't formalize it.

Here's my contribution:


Approach 1: Absent new information, we cannot improve our outcomes.

In the Monty Hall problem there is either obscure new information, or an obscure change in the rules between the first choice and the second choice.

Monty Hall collapses to an initial choice of the prize behind door A or the prizes behind both door B and door C. When the true, collapsed choice is revealed, the common sense reasoning is correct.

In this problem there is no new usable information.

In the two envelopes problem the new information appears relevant but is actually not, on its own, any more useful for reasoning about expected value than knowing that there is a red or green piece of paper in either envelope.


Approach 2: Keeping the quantities symbolic.

There are two envelopes with quantities x and y inside. We are told that 2x = y. After choosing, it is revealed that our envelope has quantity z. It is not revealed whether z=x or z=y.

Let's consider the universe of possibilities.

Possibility one (50% chance): z = y. Value of switching: -x

Possibility two (50% chance): z = x. Value of switching: +x

Therefore the value of switching is: .5 * -x + .5 * +x = 0

The red herring here is that knowing value z feels like it is information about values x and y, but it isn't.

The sleight of hand is in trying to say that the other envelope is worth either .5z or 2z. This is false because there is an unknown but fixed value x. We don't know from the available information whether we are in the universe z=x or the universe z=y.

In short: the two envelopes paradox mistakes an unknown constant for an unknown variable. Knowing that z = 20 doesn't change the constants x and y.

Approach 1: There is in fact new information when you look into the envelope. It's also very valuable, because it allows you to make a judgement taking into account your knowledge about the world and the specific situation (who puts money in the envelopes, what the general preferences of people in such situations are, etc.).

Approach 2: You made the same mistake. Seeing the money is actually valuable and very real information. The problem is that the original reasoning makes wrong use of it. It doesn't mean there is no information, or that we can/should ignore it.

If the new information is relevant, how is this simulation code wrong? https://gist.github.com/tedtieken/6567112

The code is wrong because there is an assumption that the distribution is 50% for $10 and 50% for $20. There's no basis for this (how do you know you won't get $40 if you see $20?). See my other posts, I think this point is very well worth thinking about.

The paradox depends on the possibility of always being able to double the amount. This isn't true.

Swapping once will either double or half the amount; swapping back will just do the opposite.

Even better: if I put $10 into one envelope, and $20 in another envelope and get you to choose one, what is the probability that the other envelope contains $40? Zero.

Thanks, that is a wonderful way to put it. The swaps are not independent in their effects. That satisfies me.

This is a fantastic and subtle paradox, and not at all amenable to quick resolution. A resolution follows; first to spot a problem (if there is one) gets a cookie.

Consider the amount in the lower to be f(x), the higher to be x, given f(z) such that f(z) < z for all z. In this way we generalize to all distributions, as the problem clearly applies to all distributions. You open the first envelope, which contains A. The second envelope contains B. The challenge is to calculate the value of B.

First we must calculate x. The chance that A = f(x) is 1/2, the chance that A = x is 1/2. The average value of A is (x + f(x))/2, so the average value of x is [f+I]^{-1}(2A), where I is the identity function and [g]^{-1} denotes an inverse function.

Now we calculate B. The trick lies in this: both calculations must lie in the same reference frame. So B = x with probability 1/2, and B = f(x) with probability 1/2, giving us B = (x + f(x))/2.

The rest is plug-and-chug: B = ([f+I]^{-1}(2A) + f([f+I]^{-1}(2A)))/2 --> B = [f+I]([f+I]^{-1}(2A))/2 --> B = 2A/2, thus B = A.

Therefore, over all probability distributions that can be defined, we have the average value of B equal to A in every case. A purely mathematical resolution is satisfying, but I am not in any case an epistemologist, so it may not satisfy people who take a different interpretation of math than me. It works, and I like it.

The problem here is that you are adding together two values for a specific case of x and then adding together the results for all values of x (is that clear? probably not).

It's analogous to another age old problem: what is the value of 1 + (-1) + 1 + (-1) + 1 + (-1) ...

You could argue that (1 + -1) + (1 + -1) ... = 0 + 0 ... = 0.

You could also argue that 1 + (-1 + 1) + (-1 + 1) = 1 + 0 ... = 1.

There are actually ways to group the numbers in that series to get any integer answer you want :D

Since we haven't actually defined a distribution over x, the problem is not well-defined anyway. But let's pick, say, P(x=2^i) = 1/(2^i) for all i>=1.

Then the expected value of A is

    E(A) = (1/2 * $2) + (1/4 * $4) + (1/8 * $8) ...
         = +infinity. 
Similarly, E(B) = +infinity.

Now for E(B-A)

    E(B-A) = (1/2 * $1) + (1/2 * -$1) + (1/4 * $2) + (1/4 * -$2) ...
Like the example above, we can add this up in different ways:

    E(B-A) = ((1/2 * $1) + (1/2 * -$1)) + ((1/4 * $2) + (1/4 * -$2)) ...
           = $0 + $0 ...
           = $0

    E(B-A) = (1/2 * $1) + ((1/2 * -$1) + (1/4 * $2)) + ((1/4 * -$2) ...
           = $0.5 + $0 + $0 ...
           = $0.5
When adding up infinite numbers of things you have to be very careful. I go into more detail in this thread - https://news.ycombinator.com/item?id=6387344
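Every product in that series works out to +/-$0.5, so the two bracketings can be reproduced numerically (a sketch; `term` and `groupedSum` are hypothetical helpers that just index the series above):

```javascript
// Every term of E(B-A) = (1/2 * $1) + (1/2 * -$1) + (1/4 * $2) + ...
// has magnitude $0.5, so the series is 0.5 - 0.5 + 0.5 - 0.5 ...
// Pair the terms two different ways and the "sum" disagrees.
function term(k) {
  var i = Math.floor(k / 2) + 1;      // which pair of terms we're in
  var sign = (k % 2 === 0) ? 1 : -1;  // +gain then -gain
  return sign * Math.pow(0.5, i) * Math.pow(2, i - 1);
}

function groupedSum(offset, groups) {
  // Sum `groups` adjacent pairs of terms, starting at term `offset`.
  var total = (offset === 1) ? term(0) : 0; // leading unpaired term, if any
  for (var g = 0; g < groups; g++) {
    total += term(offset + 2 * g) + term(offset + 2 * g + 1);
  }
  return total;
}

console.log(groupedSum(0, 1000)); // pairs (+,-)(+,-)... -> 0
console.log(groupedSum(1, 1000)); // 0.5 + pairs (-,+)(-,+)... -> 0.5
```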

Cookie please :D

In my original version of the post I had intended to argue for a geometric mean, which was sort of a joke, as it only works when f(x) is a constant k*x. So the statement "A false resolution follows, first to spot the problem..." appeared in the post. Upon seeing this, an enterprising and intelligent person might have clicked "reply".

While writing this I realized that inverting the definition of A gave a different resolution based loosely on operator theory, and posted that instead. A couple minutes later I realized the mistake and changed the beginning to refer to the first to spot "a" problem.

The argument you've presented is good, but you have attacked the well-foundedness of the problem itself rather than the logic I've applied. I'm not sure if you get the cookie.

Goddamn, what does it take to get a cookie around here? :D

I read through the common resolution but I just can't see why you need all this. The problem seems to be at the start: you select one of two amounts, then you say the other amount is either double or half what you chose.

That means you have 3 amounts in the equation: 0.5x, x and 2x. But in reality there are only 2 amounts: x and 2x.

So you have to state the problem like: you choose an envelope. The other envelope either contains x or 2x. It can't contain 0.5x because that amount never existed in the first place.

Unless you picked 2x to start with, in which case the other envelope does contain half of what you chose...

i.e. there are two envelopes, x and 2x, and you choose one. There is a 50% chance that the envelope you didn't choose is half the value of the one you picked (you pick 2x, and the other is x), or there's a 50% chance the envelope you didn't choose is twice the value of the one you picked (you pick x, and the other is 2x). I think it's misleading to introduce this "third" quantity of 0.5x... but perhaps I'm missing something subtle, in which case I'd be happy to be corrected!

I think it's misleading to introduce this "third" quantity of 0.5x

Yes, it is misleading, that's my point :)

Put it another way: you have two quantities in two envelopes, A and B. If you choose B, what is the probability that the other envelope contains a third quantity, C? The probability of that is zero, because C was not present in the initial two envelopes.

Still another way of putting it: I will place $10 in an envelope, and $20 in another envelope, then ask you to choose one. What are the chances that the other envelope contains $40? The chance of that is zero, because the only two amounts present at the start of the trick were $10 and $20.

Arbitrarily label the two envelopes A and B. Assume that no envelope is empty. Let «X» denote the amount of money in envelope X.

We are told that either «A» = 2«B» or «B» = 2«A». Because these propositions are exhaustive and mutually exclusive, we know that

    P(«A» = 2«B») + P(«B» = 2«A») = 1.      (1)
Since we have no knowledge that would make either proposition more plausible than the other, we also have (by symmetry) that

    P(«A» = 2«B») = P(«B» = 2«A»).          (2)
From (1) and (2) we can solve for the individual probabilities of the propositions:

    P(«A» = 2«B») = P(«B» = 2«A») = 1/2.    (3)
Therefore, from the initial conditions, we have no reason to prefer either envelope.

Now we are given the information that we have picked one of the envelopes at random (let's say it's A). We are further given the information that A contains $20, that is «A» = $20. How does this new knowledge affect the probabilities?

It has no effect because we can't say anything more about either proposition without also knowing «B» as well, and we don't know it. That is,

    P(«A» = 2«B» | «A» = $20) = P(«A» = 2«B»)

    P(«B» = 2«A» | «A» = $20) = P(«B» = 2«A»).
Therefore, our probability assignments from (3) remain unchanged, and we have no reason to prefer one envelope to the other, let alone swap A for B.

This is a really cool problem. I don't claim to understand it, but here's one way of thinking about it. The paradox assumes that when we see the value of money in the envelope we open, we get no information about whether that's the smaller or larger amount of money. Now, in practice, is that possible?

Let X,Y be i.i.d. draws from some distribution with CDF F, and say we see Y=y when we open our envelope. If the distribution satisfies the constraints of the problem, then we have to have that, for all possible values of y: P(X > Y | Y = y) = 1/2 = P(X < Y | Y = y)

In practice, this is impossible, because P(X > Y | Y = y) = 1 - F(y). Since F is a CDF, F(y) tends to 1 as y goes to infinity, so 1 - F(y) tends to 0 and cannot equal 1/2 for every y. That is, the probability that the other envelope contains the larger or smaller value is not independent of the value we observe in the envelope we open. And, more importantly, it can't be: such a distribution doesn't exist. You have to consider the sampling distribution to get a meaningful calculation.
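To make this concrete, here's a rough empirical sketch (my own code; the exponential distribution and all names are assumptions, not from the post): with i.i.d. exponential draws, the estimated P(X > Y | Y ≈ y) visibly depends on where the observed y falls.

```python
import random

# Sketch: estimate P(X > Y | Y ≈ y) for i.i.d. exponential(1) draws.
# If the paradox's assumption held, this would be 1/2 for every y.
random.seed(0)
pairs = [(random.expovariate(1.0), random.expovariate(1.0))
         for _ in range(200_000)]

def p_other_larger(lo, hi):
    """Fraction of pairs with X > Y, among pairs whose Y landed in [lo, hi)."""
    hits = [x > y for x, y in pairs if lo <= y < hi]
    return sum(hits) / len(hits)

low = p_other_larger(0.0, 0.5)   # small observed values
high = p_other_larger(2.0, 5.0)  # large observed values
print(low, high)  # low lands well above 1/2, high well below it
```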

This sounds like a special case of the secretary problem http://en.wikipedia.org/wiki/Secretary_problem, where instead of envelopes you swap secretaries and additionally you have a time constraint.

I find it very interesting how these statistical problems can be projected on our own life (swapping jobs, finding a better partner etc ..)

The paradox is interesting to discover and discuss errors in reasoning, but we should agree from the start that any proof that one should always switch (or that any behaviour could increase the expected gain) is flawed.

This can be easily proven by:

1. Taking at face value that choosing an envelope is random.

2. Adding a second player who is given the other envelope.

In that scenario, any reasoning that leads one player to always switch would apply to the other player too. Both players would swap their envelopes. Clearly, the expected value for both cannot go up. Thus any reasoning that mandates switching /has/ to be wrong.

(If you have issue with adding another player, instead assume a parallel universe where the player chose the other envelope.)

The error in reasoning is usually to assume that the envelope contains X and then reason about X/2 and 2X. In reality, the envelopes always contain X and 2X. When you actually use the parallel universe version, clearly one has X and the other has 2X. Nobody has X/2 (nor 4X, nor any other expected value your theory may come up with).

Many explanations here contain the same mistake: some chain of reasoning that ends in "seeing the money doesn't change anything" and/or "doesn't count as information".

If you arrive at this point you've made a mistake. Seeing the money is valuable information. The trick is not in ignoring it but in making good use of it, like this:

- use your best knowledge about the person who put the money in the envelopes, general human tendencies, and how the world works to approximate the distribution of money in the envelopes

- pick according to your personal utility of money, given that distribution

What the wrong reasoning in the original puzzle suggests is to assume a uniform distribution, which is impossible but, more importantly, baseless. As another poster shows, there are other baseless distributions you could assume that lead you nowhere as well.

Just don't let it detract from the main point: there is new information but making use of it is not that easy.

If it is possible to switch and gain on the average we could write a 10,000 or so iteration monte-carlo simulation of this switching algorithm to demonstrate the gains.

Each envelope has either x or 2x. The chosen envelope has amount z, but the agent doesn't know whether z = x or z = 2x.

If it were true that not switching leads to an EV of 3x/2 but switching leads to a higher EV, then a single switch each round should come back with an average value greater than 3x/2.

In the Monty Hall problem, we can code up such an agent to demonstrate that the naive intuition is wrong. However, in this case, regardless of how many times we switch, we still end up with an average value of 3x/2.

Therefore there is an error in the math that says that switching has a higher EV than staying.
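Here is what such a Monte Carlo check might look like (a sketch only; the fixed amount x = 10 and all names are mine):

```python
import random

# Simulate the game with fixed amounts x and 2x and compare the
# always-stay and always-switch strategies.
random.seed(42)

def play(switch, x=10, rounds=100_000):
    """Average payout over many rounds for a fixed strategy."""
    total = 0
    for _ in range(rounds):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)
        picked = 0
        if switch:
            picked = 1 - picked  # take the other envelope instead
        total += envelopes[picked]
    return total / rounds

stay = play(switch=False)
swap = play(switch=True)
print(stay, swap)  # both hover around 3x/2 = 15
```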

Go ahead and run a few million simulations of the problem and graph the running mean over time. You might be surprised :)

Be sure to use exact arithmetic (eg http://docs.python.org/2/library/fractions.html)


Run the simulator and you'll see, the ev is 3x/2, regardless of switching behavior.

Feel free to modify it if you think I've inaccurately conceptualized the problem.

Also, please excuse some non pythonic names, I'm writing the code on an iPhone.

Sorry, I should have been more specific :S

You have assumed a fixed amount in each envelope. The article leaves the amount unspecified but implicitly assumes that there is no maximum amount that could be in the envelope. The problem only becomes interesting for certain distributions.

Try this code:


If you run it with the uniform distribution the running mean will eventually converge to 0. If you comment that out and uncomment the exponential distribution it is much more interesting :)

With the exponential distribution the expected value of switching does not even exist.
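A self-contained sketch of that experiment (my own code, not the original gist): take the smaller amount to be 2^i with probability 2^-i, as in the top comment. The gain from switching then has no finite expectation, so the running mean has nothing to converge to.

```python
import random

random.seed(1)

def sample_gain():
    """One round: sample the smaller amount 2^i with P(i) = 2^-i,
    pick an envelope at random, switch, and return the gain."""
    i = 1
    while random.random() < 0.5:
        i += 1
    small, large = 2 ** i, 2 ** (i + 1)
    if random.random() < 0.5:
        return large - small  # held the small envelope: switching gains
    return small - large      # held the large envelope: switching loses

# Python ints are arbitrary precision, so no bignum worries here.
total = 0
for n in range(1, 1_000_001):
    total += sample_gain()
    if n % 200_000 == 0:
        print(n, total / n)  # the running mean keeps jumping around
```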

EDIT Doh, the original gist was totally wrong. That'll teach me to argue on the internet at 3am. The updated gist is correct.

Fixed amount vs variable amount is irrelevant, variable amount just requires higher n.

Either way, the gain from switching approaches zero as n approaches infinity.

It does make a difference but I messed up the code :S

If you try the updated gist you will find that the second distribution appears to converge for a while but always jumps away again. I've run it now for 20540000 rounds and it's further away than when it started.


Here's how I see it - the important unstated variable is how big a budget the host of the game has. If you know that you're playing with a billionaire who loves this party trick, and is probably willing to give away a million dollars on this, then the $20 you found in your envelope is probably the smaller value. If your envelope contains $800,000 it's probably the bigger one.

The host has chosen a number randomly between 1 and N, and they're not necessarily going to tell you what N is. But in the long run you'll find that values closest to N/2 are more common than values further from N/2.
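That intuition is easy to check with a toy model (entirely made up here: a host budget N with the smaller amount chosen uniformly): any amount above N/2 can only be the larger envelope.

```python
import random

random.seed(7)
N = 100  # hypothetical host budget: the larger amount never exceeds N
seen_large = 0
for _ in range(100_000):
    x = random.randint(1, N // 2)      # smaller amount, so 2x <= N
    seen = random.choice([x, 2 * x])   # the envelope we happen to open
    if seen > N // 2:
        assert seen == 2 * x           # above N/2 it must be the larger one
        seen_large += 1
print(seen_large)
```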

However, it says in the problem that you are offered the switch before you open the envelope. So, you can't know if your original choice was $20 or $80,000.

Since you don't know what is in either envelope, surely both theoretically contain both x and 2x (à la Schrödinger's cat). Only upon opening an envelope will we know the amount; which one you take or how many times you swap makes no difference.

Also, swapping, in my understanding, doesn't increase your probability. In the first choice you have a 1 in 2 chance of getting the higher amount. In the second choice (the ability to swap, which is fundamentally the same as choosing between the two envelopes) you have a 1 in 2 chance of getting the higher amount.

Correct me if I'm wrong.

Correct me if I'm wrong, but here's my analysis.

There are two envelopes. One contains a value x, and the other has 2x. You pick an envelope at random. Let's call the envelope you choose z. The variable z is a random variable that has an equal chance of being x or 2x, so the expected value of z is 1.5x.

If you switch, there are two possibilities. In one case, you picked x, so switching gains you .5x over the expected value of 1.5x.

In the other case, you picked 2x, so switching loses you .5x from the expected value of 1.5x. So, the expected gain from switching should be zero.
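That analysis checks out with exact arithmetic (a sketch using Python's fractions module, which another commenter also suggests; x = 1 is an arbitrary stand-in):

```python
from fractions import Fraction

half = Fraction(1, 2)
x = Fraction(1)  # arbitrary stand-in; everything scales linearly in x

ev_stay = half * x + half * (2 * x)             # keep whichever you picked
ev_switch = half * (2 * x) + half * x           # picked x -> get 2x, and vice versa
gain = half * (2 * x - x) + half * (x - 2 * x)  # expected gain from switching
print(ev_stay, ev_switch, gain)  # 3/2, 3/2, 0
```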

Just take both envelopes and run.

I agree. Plus, both have money. Win win.


    $1500308 500308/1000000 0.500308 never switched
    $1500301 500301/1000000 0.500301 always switched


This reminds me of a bizarre version of the Monty Hall problem: http://en.wikipedia.org/wiki/Monty_Hall_problem

The flaw here is assuming that the value being higher on average in the other envelope is a sufficient criterion to make swapping the best choice. This is not how probability works: if I have a million empty envelopes and one with a billion dollars in it, the average value of each envelope being $1,000 doesn't mean anything in the face of the 1-in-a-million chance of actually picking the right one.

Actually, that's exactly how expected value works.

I think the point he is getting at is that expected value (or even expected utility) is not actually a good measure of human preferences.

Additionally, in this problem there are certain distributions over the amounts in the envelope for which the expected value is not defined, making it an interesting problem to study.

http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow is a good book exploring mathematical models for decision making.

Just read it... Here is what I came up with. I start work at 9:00 am; before work I always go to my favorite coffee bar, called "Muazam's Coffee", a 10-minute walk from my workplace. So I arrive there at 8:30 am to have a morning mocha with breakfast while reading the newspaper. One day a man approached me right before the coffee bar's entrance and asked me if he could try a problem on me; he promised I would get enough cash for a coffee.

Well, why not I thought. He presented me Two Envelopes and said "One contains twice as much as the other. You may pick one envelope and keep the money it contains. You pick at random, but before you open the envelope, you are offered the chance to take the other envelope instead."

Without giving it a single thought, I picked the one on the left, and began to unfold it. He looked at me and said "don't you want to swap?!". I had just unfolded the envelope and pulled out $20, "too late for that I guess" I said. I could tell by the look on his face that he was disappointed; he unfolded the other envelope and pulled out $10. He left me without saying a word.

I went to the coffee bar and spent the money I'd just gotten. I took my coffee and breakfast and sat down at one of the outside tables. I saw the Envelope man again; this time he had found another guy. He had two new envelopes and asked the guy the same question. But this guy took out a pen and paper and started doing some math. It was already 8:47, and I had to go to work.

Next day, same routine. I go to Muazam's Coffee bar, and I see the Envelope man and the mathematician guy (let's call him Joe) sitting at one of the tables outside the bar. The guy is still trying to solve the problem. I was kind of impressed by how much effort Joe had put into this; he already had four A4 pages full of notes that looked like formulas.

This went on for weeks, months, years.

Everyday, before work I would see them outside. The Envelope man having two envelopes by his side and Joe trying to solve the problem.

28 years had passed...

I was going to Muazam's Coffee bar as usual when I noticed there was an ambulance outside. I went to see who they had come for; to my surprise it was Joe. I asked the envelope man what had happened, and he told me Joe had had a heart attack.

Joe was admitted to a hospital, in bad condition. The envelope man visited him and said "Joe, I am proud of you. We have spent so much time together and you still don't have the perfect formula to solve this problem. Maybe this problem can't be solved with math..".

Joe looked him in the eye and said "No, I won't give up. I will get this right and get the highest amount of cash". The envelope man said "You can have them both" and he unfolded both of the envelopes. One containing $1 and the other one $2. This is where Joe had another heart attack.

I'd like to offer my own humble solution. Maybe it's flawed. Maybe you can embarrass me. :)

Here it is: the goal is to choose a strategy which statistically maximizes our return. I.e. strategy A is superior to strategy B if it yields higher returns after, say, 1,000,000 iterations.

So there are two envelopes, X and 2X. You select one, then you're offered a chance to change your selection. What do you do?

Let's write out all the possibilities:

You select X, then you choose to stay, and wind up with X.

You select 2X, then you choose to stay, and wind up with 2X.

You select X, then you choose to swap, and wind up with 2X.

You select 2X, then you choose to swap, and wind up with X.

Those are the only four possibilities. You're forced to choose one of them randomly, because you have no information to guide your choice. Since two of them yield 2X and two yield X, and since your choice is necessarily random, all strategies will converge on the same expected value. In short, it doesn't matter what you do. You always have a 50% chance of X or 2X, regardless of your sequence of choices.
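The four possibilities above can be enumerated directly (a small sketch, with x = 1 as a stand-in value):

```python
from itertools import product

x = 1  # stand-in; the envelopes hold x and 2x
payout = {}
for pick, action in product([x, 2 * x], ["stay", "swap"]):
    other = 2 * x if pick == x else x
    payout[(pick, action)] = pick if action == "stay" else other

# Each strategy averages over the two equally likely picks.
always_stay = (payout[(x, "stay")] + payout[(2 * x, "stay")]) / 2
always_swap = (payout[(x, "swap")] + payout[(2 * x, "swap")]) / 2
print(always_stay, always_swap)  # both are 1.5x
```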

At first glance this is similar to the Monty Hall problem, but the critical difference is that information is revealed during the Monty Hall problem. No extra info is revealed here.

I assert that the envelopes could contain X and 1000X and it still doesn't matter what you do.

Ok, go, embarrass me!

No. If I understand correctly, your last two outcomes are wrong. There are 6 possible outcomes, not four. This is because when you choose the 2X correctly, you may receive 2(2X) or X.

The 6 outcomes are:

You select X, then you choose to stay, and wind up with X.

You select 2X, then you choose to stay, and wind up with 2X.

You select X, then choose to swap and wind up with 1/2X

You select X, then choose to swap and wind up with 2X

You select 2X, then choose to swap and wind up with X

You select 2X, then choose to swap and wind up with 4X

Actually, you always wind up with X or 2X, never 0.5X and never 4X. Remember, the envelopes contain either X or 2X, and you always end up with either one or the other.

I myself understood the problem through Afforess's reply. You are explicitly pinning down the values in the envelopes, but remember the other envelope has either half or double the value of the one in your hand, and this is the crux of this paradox. Hopefully I am thinking in correct terms.

But he has assigned the identifiers "X" and "2X" to the envelopes. He could have called them "A" and "B" and his logic still holds. The problem I see with the wiki article explanation is: while using actual dollar amounts as examples, it gives the impression that you get to open the envelope that you picked first to see the amount, but that you don't know if it's the smaller value or larger value.

So I choose "A" and I don't know if it contains $X or $2X. I don't even get to open it to know what amount is in the envelope. Whatever it contains, if I choose to switch, I get envelope "B" - I always have a 50% chance of choosing the larger amount or switching to the envelope with the larger amount because no information is revealed after the first choice.

for A->$X, B->$2X:

    Choose A, stay with A, receive $X
    Choose A, switch to B, receive $2X
    Choose B, switch to A, receive $X
    Choose B, stay with B, receive $2X
for A->$2X, B->$X:

    Choose A, stay with A, receive $2X
    Choose A, switch to B, receive $X
    Choose B, switch to A, receive $2X
    Choose B, stay with B, receive $X
50% chance of getting either amount, regardless.
