
Two envelopes problem - tchajed
https://en.wikipedia.org/wiki/Two_envelopes_problem
======
jamii
You can even explicitly define the distribution as P($2^i) = 1/(2^i) and the
'paradox' remains. The problem is that you are reasoning about the expected
gain of swapping but that expectation is a sum over a series which is not
absolutely convergent
([http://en.wikipedia.org/wiki/Absolute_convergence](http://en.wikipedia.org/wiki/Absolute_convergence))
so its value depends on the order in which you sum over the different cases.

In fact, if you repeatedly simulate the problem (make sure you use bignums)
you will find that the mean gain from swapping is not well behaved at all and
refuses to converge. The law of large numbers only applies if the expectation
is well-defined.
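For anyone who wants to try it, here's a minimal sketch of such a simulation, assuming the P($2^i) = 1/(2^i) distribution above (the seed and sample counts are arbitrary choices):

```python
import random
from fractions import Fraction  # exact arithmetic, no floating-point overflow

def sample_gain():
    # Draw the smaller amount: $2^i with probability 1/2^i (i = 1, 2, ...),
    # i.e. flip a fair coin and count flips until the first tails.
    i = 1
    while random.random() < 0.5:
        i += 1
    small = Fraction(2) ** i
    large = 2 * small
    # We hold either envelope with probability 1/2; gain = other - ours.
    return (large - small) if random.random() < 0.5 else (small - large)

random.seed(0)
total = Fraction(0)
for n in range(1, 100_001):
    total += sample_gain()
    if n % 20_000 == 0:
        # The running mean never settles down: rare, huge samples keep
        # yanking it around, because the expectation does not exist.
        print(n, float(total / n))
```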

The lesson here is: when in doubt, explicitly write out the probability space
over which you are working. Problems like this and the Monty Hall problem are
trivially solvable on paper. There is a reason that mathematicians get all hot
and bothered about formalism - it gives you a solid base from which to build
_correct_ intuitions.

EDIT Let's write this down properly.

    
    
        i | gain from swapping if I have the smallest | gain from swapping if I have the largest
        1 | $2                                        | -$2
        2 | $4                                        | -$4
        3 | $8                                        | -$8
        etc.
    

There are two arguments.

The first is that the situation is symmetric so you can't possibly gain. That
is:

    
    
        E(Gain) = (1/4 * $2 + 1/4 * -$2) + (1/8 * $4 + 1/8 * -$4) ...
                = $0 + $0 + $0 + $0 ...
                = $0
    

The second argument is that swapping from small to large is a bigger gain than
the loss of swapping from large to small. That is:

    
    
        E(Gain) = (1/4 * $2) + (1/4 * -$2 + 1/8 * $4) + (1/8 * -$4 + 1/16 * $8) ...
                = $0.5 + $0 + $0 + $0 ...
                = $0.5
    

Adding up an infinite series is tricky :)
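To see the order-dependence concretely, here's a quick check of the two groupings above. Every term of the series has magnitude $0.5, so it is not absolutely convergent; the cutoff of 1000 terms is an arbitrary choice:

```python
from fractions import Fraction

# The i-th positive/negative term of E(Gain): probability 1/2^(i+1) of a
# gain/loss of $2^i, i.e. a contribution of +/- $0.5 every time.
def term(i, sign):
    return sign * Fraction(1, 2 ** (i + 1)) * Fraction(2 ** i)

N = 1000  # arbitrary cutoff

# Grouping 1: bracket the + and - term for each i; every bracket is $0.
sum1 = sum(term(i, +1) + term(i, -1) for i in range(1, N))

# Grouping 2: the same terms, but keep the first + term alone and bracket
# (- term i, + term i+1); every bracket is again $0.
sum2 = term(1, +1) + sum(term(i, -1) + term(i + 1, +1) for i in range(1, N))

print(sum1, sum2)  # the same terms, regrouped, give $0 and $0.5
```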

~~~
bluecalm
Well, yeah :) The problem is still in assuming something about the
distribution without any good reason to. In the original "paradox" reasoning
it was an impossible uniform distribution. What you show is that even
assuming a possible distribution can still lead nowhere. It still doesn't
mean there is any reason to assume it's P($2^i) = 1/(2^i). I may just as well
assume it's exactly $100-$200 and that switching from $100 gives me a
guaranteed payoff. That would be as baseless as assuming the uniform
distribution or the one given by you.

I think the main lesson from this puzzle is that you can't assume stuff
without a good reason to.

~~~
jamii
I'm not _assuming_ that that is the distribution. I'm saying that the problem
as presented is underspecified, but even if you give this specific
distribution in the problem you can still cause confusion. Assuming stuff is
bad but the original 'paradox' still doesn't go away if you nail everything
down. It's a useful problem for educating people about the subtleties of
infinite sums.

~~~
bluecalm
Well, there are distributions which don't have a well-defined expected value,
and I agree that it's a valuable lesson. I would argue though that this is not
the best example to show it. The point of the two envelope problem comes much
earlier; specifically giving the distribution you defined would make it a
completely different problem, and assuming it in the original form would
already be a mistake.

I think there are better examples to show how infinite sums, and relying on
expected values based on them, might lead to problems. Like this one for
example:

[http://plato.stanford.edu/entries/paradox-stpetersburg/](http://plato.stanford.edu/entries/paradox-stpetersburg/)

~~~
jamii
The expected value in the St Petersburg game is actually well-defined - it
diverges to positive infinity. That is a subtly different result from not
converging at all. Both problems are useful.
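A quick simulation illustrates the difference, assuming the standard $2, $4, $8, ... payout schedule (the seed and sample counts are arbitrary):

```python
import random

def st_petersburg():
    # The pot doubles for every head before the first tail: $2, $4, $8, ...
    pot = 2
    while random.random() < 0.5:
        pot *= 2
    return pot

random.seed(0)
total = 0
for n in range(1, 1_000_001):
    total += st_petersburg()
    if n in (10**3, 10**4, 10**5, 10**6):
        # The running mean drifts upward as n grows, consistent with an
        # expectation of +infinity - unlike the two-envelope gain, whose
        # running mean has no limit at all.
        print(n, total / n)
```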

------
millstone
This is one of my favorite problems. Here is one variation that can help
clarify thinking about it:

"There's two envelopes. One contains twice as much money as the other. No
envelope contains more than $N."

This changes the problem dramatically. If your envelope contains more than
$N/2, of course you should not switch: the other envelope necessarily contains
less. If it contains $N/2 or less, perhaps you should switch: the other
envelope may or may not contain more.

Say your envelope contains $x, and you don't know what $N is. There's two
possibilities:

1. $x <= $N/2. If you switch, your expectation value is $3x/2, by the
original argument.

2. $x > $N/2. If you switch, you will get $x/2, since no envelope contains
more than $N.

If we assume the distribution is uniform on the range [0, N], then these
possibilities are equally likely. Therefore the total expectation value is the
average of the EVs of the two possibilities (3x/2 and x/2), which is $x. So we
recovered the naive expectation of "it doesn't matter" from this variation.
Now we can take the limit as $N goes to infinity, and while the EV of $x
approaches infinity, the fact that switching does not matter does not change.

As others have said, the underlying problem is the assumption that a uniform
probability on an infinite set makes sense, which it does not. However, we can
instead take the limit for finite sets, in which case we recover the intuitive
result that switching does not matter.
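A rough simulation of this bounded variation: the smaller amount is drawn uniformly on [0, N/2] so that neither envelope exceeds $N (the value N = 100 and the trial count are arbitrary choices):

```python
import random

N = 100  # the bound from the variation; the specific value is arbitrary

def play(switch):
    # The smaller amount is uniform on [0, N/2], so no envelope exceeds $N.
    x = random.uniform(0, N / 2)
    pick = random.randrange(2)   # choose an envelope at random
    if switch:
        pick = 1 - pick          # always take the other one instead
    return (x, 2 * x)[pick]

random.seed(0)
trials = 200_000
stay = sum(play(False) for _ in range(trials)) / trials
swap = sum(play(True) for _ in range(trials)) / trials
# Both strategies average about (3/2) * E[x] = 3N/8 = $37.50 here;
# switching makes no difference.
print(stay, swap)
```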

~~~
jamii
Exchanging limits and expectations is not always valid (eg
[http://www.stanford.edu/class/msande322/docs/app_B.pdf](http://www.stanford.edu/class/msande322/docs/app_B.pdf)).
In this case the expected gain is not convergent in the limit. See
[https://news.ycombinator.com/item?id=6387344](https://news.ycombinator.com/item?id=6387344)

This is one of my favourite problems too :)

------
EllaMentry
That is a very long article based on a flawed argument.

Given no other information - say someone gives you two envelopes and tells
you one has $40 and the other $20 - common sense dictates you choose one
randomly and walk away; with no other information it is illogical to reason
any other way.

The chance you choose the lower value is 1/2.

Now, if you are allowed to look inside the envelope (which gets introduced
further down) then it becomes a different game.

Get $20...well by swapping you may get $10 or $40...you should probably swap.

Get $2000...well by swapping you may get $1000 or $4000...you should probably
swap.

I think this works all the way up...someone with a bit more background on game
theory may be able to formalise it, but the realisation that swapping forever
leads to $0 nullifies this "paradox"

~~~
bluecalm
Yes, it's not a paradox, it's just seductive flawed reasoning. At any point
the EV of picking an envelope at random is 3n/4 (n being the higher amount of
money of the two). That is all there is to it. The "paradox" is introduced by
the silent assumption that the distribution of amounts put in the envelopes
is uniform, which is impossible (because you can't pick numbers from an
infinite set uniformly, even if there were an infinite amount of money at the
"adversary's" disposal). The assumption is then used for conditional
probability calculations: "if we see $10 there is a 50% chance the other
envelope contains $20" - BEEP, ERROR, THINK AGAIN.

Perhaps it's a good exercise in clear thinking, but not really a paradox. A
good analogy is this: "If we pick a random building and climb to its roof,
there is a 50% chance the first building we see is higher than the one we
just climbed." This is obviously true; now, following the "paradoxical"
reasoning, we get: "If we climb a random building and see it's the Empire
State Building, there is still a 50% chance the first building we see will be
higher."

This is an exact analogy to the reasoning about the two envelopes problem
which is supposed to lead to a paradox.

~~~
jamii
You can explicitly state the distribution and still run into the same problem:
[https://news.ycombinator.com/item?id=6387344](https://news.ycombinator.com/item?id=6387344).

The underlying problem is basically that probability theory in non-finite
spaces has some gotchas - one of which is that the expectation of a random
variable does not always exist.

~~~
bluecalm
Interesting point and a nice read. Still, the problem is in the assumption
about the underlying distribution of amounts in the envelopes (in the
original case an impossible uniform distribution). The reasoning is based on
this assumption and leads to nonsense. What you are saying (I think) is that
assuming some other distribution (a possible one, instead of an impossible
one) could still lead to nonsense, or not lead anywhere at all.

~~~
jamii
Not so much that it leads to nonsense as that naively applying expectations
doesn't always work. This is a contrived example, but it's not uncommon in eg
random walk theory to hit upon cases like this where the expectation does not
exist at all.

People commonly think of mathematics as being purely about formal proof but
the reality is an interplay between proof and intuition. Usually when a
mathematician encounters a problem in a familiar area they immediately know
the answer by intuition which then guides the production of a correct proof.
When you first enter a new area of mathematics your intuitions are all
completely wrong and you have no idea where to start with a proof. Good
teachers will introduce edge cases like this problem to refine your intuition
until it is useful enough to be a guide.

[http://terrytao.wordpress.com/career-advice/there%E2%80%99s-more-to-mathematics-than-rigour-and-proofs/](http://terrytao.wordpress.com/career-advice/there%E2%80%99s-more-to-mathematics-than-rigour-and-proofs/)

------
sillysaurus2
I'd like to offer my own humble solution. Maybe it's flawed. Maybe you can
embarrass me. :)

Here it is: the goal is to choose a strategy which statistically maximizes our
return. I.e. strategy A is superior to strategy B if it yields higher returns
after, say, 1,000,000 iterations.

So there are two envelopes, X and 2X. You select one, then you're offered a
chance to change your selection. What do you do?

Let's write out all the possibilities:

You select X, then you choose to stay, and wind up with X.

You select 2X, then you choose to stay, and wind up with 2X.

You select X, then you choose to swap, and wind up with 2X.

You select 2X, then you choose to swap, and wind up with X.

Those are the only four possibilities. You're forced to choose one of these
possibilities randomly, because you have no information to guide your choice.
Since two of them yield 2X and two yield X, and since your choice is
necessarily random, then therefore all strategies will converge on the same
expected value. In short, it doesn't matter what you do. You always have a 50%
chance of X or 2X, regardless of your sequence of choices.

At first glance this is similar to the Monty Hall problem, but the critical
difference is that information is revealed during the Monty Hall problem. No
extra info is revealed here.

I assert that the envelopes could contain X and 1000X and it still doesn't
matter what you do.

Ok, go, embarrass me!

~~~
Afforess
No. If I understand correctly, your last two outcomes are wrong. There are 6
possible outcomes, not four. This is because when you choose the 2X envelope
and then swap, you may receive 2(2X) or X.

The 6 outcomes are:

You select X, then you choose to stay, and wind up with X.

You select 2X, then you choose to stay, and wind up with 2X.

You select X, then choose to swap and wind up with 1/2X

You select X, then choose to swap and wind up with 2X

You select 2X, then choose to swap and wind up with X

You select 2X, then choose to swap and wind up with 4X

~~~
sillysaurus2
Actually, you always wind up with X or 2X, never 0.5X and never 4X. Remember,
the envelopes contain either X or 2X, and you always end up with either one or
the other.

~~~
iamshs
I myself understood the problem with Afforess's reply. You are explicitly
pinning the value in the envelopes, but remember the other envelope has either
half or double of the value than the one in your hand, and this is the crux of
this paradox. Hopefully, I am thinking in correct terms.

~~~
delinka
But he has assigned the identifiers "X" and "2X" to the envelopes. He could
have called them "A" and "B" and his logic still holds. The problem I see with
the wiki article explanation is: while using actual dollar amounts as
examples, it gives the impression that you get to open the envelope that you
picked first to see the amount, but that you don't know if it's the smaller
value or larger value.

So I choose "A" and I don't know if it contains $X or $2X. I don't even get to
open it to know what amount is in the envelope. Whatever it contains, if I
choose to switch, I get envelope "B" - I always have a 50% chance of choosing
the larger amount or switching to the envelope with the larger amount because
no information is revealed after the first choice.

for A->$X, B->$2X:

    
    
        Choose A, stay with A, receive $X
        Choose A, switch to B, receive $2X
        Choose B, switch to A, receive $X
        Choose B, stay with B, receive $2X
    

for A->$2X, B->$X:

    
    
        Choose A, stay with A, receive $2X
        Choose A, switch to B, receive $X
        Choose B, switch to A, receive $2X
        Choose B, stay with B, receive $X
    

50% chance of getting either amount, regardless.
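The two tables can also be checked mechanically; a small sketch that enumerates every (labeling, pick, strategy) combination:

```python
# Enumerate every labeling of the envelopes, first pick, and stay/switch
# decision; each of the 8 rows is equally likely.
outcomes = []
for a, b in [("$X", "$2X"), ("$2X", "$X")]:  # A->$X,B->$2X and the reverse
    for first in (0, 1):                      # choose A or B
        for switch in (False, True):          # stay or switch
            final = 1 - first if switch else first
            outcomes.append([a, b][final])

# 4 of the 8 equally likely outcomes yield $2X: a 50% chance either way.
print(outcomes.count("$2X"), "of", len(outcomes))
```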

------
aegiso
The way I see it the flaw is in the first sentence of the "example":

> Assume the amount in my selected envelope is $20.

You can't just pull that assumption out of your ass. You can only assume what
the problem states, which is that the values in the envelopes are X and 2X and
you had a 50% chance of choosing either.

If you run the math without adding assumptions, it works out that swapping
makes no difference, statistically.

Though it is a very clever trick :).

~~~
DoctorZeus
Taking out the specific dollar amount doesn't change the math at all. If X
designates the amount in the envelope I've selected, then there's a 50%
chance the other envelope contains 0.5 * X and a 50% chance it contains
2 * X, so the expected value of the other envelope is
0.5 * 0.5 * X + 0.5 * 2 * X = 1.25 * X, which is greater than X.

~~~
dnautics
but you are implicitly turning the scenario into one with three values, 1/2X,
X, and 2X, so something went wrong with what you're doing. There was only ever
a universe of two values.

~~~
DoctorZeus
Sure, these are variables, so they correspond to different possibilities. X
can be anything, and given X, the amount in the other envelope is one of two
different possible values - 1/2X OR 2X.

So there are far more than three _possible_ values, but only two _actual_
values.

~~~
dnautics
Exactly, that's why your model is broken.

------
fargolime
All the probability math therein, for a problem whose solution is highly
intuitive (all you need to realize is that if you swap, you'd be just as
inclined to keep swapping envelopes indefinitely), reminds me of this quote:

"The intuitive mind is a sacred gift and the rational mind is a faithful
servant. We have created a society that honors the servant and has forgotten
the gift." \- Albert Einstein

Shameless plug for a blog I like that has little math, yet uses much intuition
to solve five of the biggest outstanding problems of physics:
[http://finbot.wordpress.com](http://finbot.wordpress.com)

~~~
justinsb
The theoretical resolution of "intuitively obvious" paradoxes such as these
is important.

If we cannot find a theoretical resolution, that can indicate a flaw in our
theories, and fixing the flaw will likely produce a more accurate set of
theories.

The problems around the speed of light gave rise to Einstein's theories of
relativity. Another of those flaws ("the set of all sets that are not members
of themselves") gave rise to modern set theory and formal logic.

Still a nice quote from Einstein though :-)

------
mgraczyk
I don't see the paradox... If you have 2x and swap, you lose x. If you have x
and swap, you gain x. 0.5 * (-x) + 0.5 * (x) = 0, so you should be
indifferent to swapping.

Am I missing something? Not to say I don't make mistakes, but I have a BS in
mathematics so maybe this is only obvious to people with a background in
math?

EDIT: No need for dollar values.

~~~
oneeyedpigeon
I don't have a BS in maths, but the important bit, as I read it, is that you
don't know the values involved. I.e. if you have 20, there is _either_ 10 in
the other envelope, or 40. You have no way of knowing which is the case, so
it's in your interest to swap since the benefits outweigh the risks.

This is totally counterintuitive, though, so I'm fully willing to accept I'm
missing something! And I don't buy the 'indefinite swapping' argument, since
the second swap must surely reverse any advantage gained.

~~~
andrewflnr
Once you have the second envelope in hand, since you don't actually know
what's in it, the exact same expected-value argument applies to switching
back: there might be $10 or $40 in the other (first) envelope, "so it's in
your interest to swap since the benefits outweigh the risk".

While the second swap would reverse any advantage _possibly_ gained, it would
also reverse any _possible_ harm sustained. You don't know. The situation is
perfectly symmetric; picking and switching is the same as just picking the
second one at first. So just pick one.

~~~
colomon
Env A has $20 (we have it and know). Env B has $10 or $40 (equal chance).

We swap. Now we hold Env B, which has $10 or $40 in it. We know the other
envelope (Env A) is $20. Why would we switch again?

~~~
andrewflnr
The original statement of the problem says you're offered the choice before
you open the envelope. I guess I was sloppy about the $20 assumption too. As
for the case you do know, I still feel wrong about the conclusion that you
should switch, but I can't formalize it.

------
thatthatis
Here's my contribution:

-----------------

Approach 1: Absent new information, we cannot improve our outcomes.

In the Monty Hall problem there is either obscure new information, or an
obscure change in the rules between the first choice and the second choice.

Monty Hall collapses to an initial choice between the prize behind door A and
the prizes behind both doors B and C. When the true, collapsed choice is
revealed, the common sense reasoning is correct.

In this problem there is no new usable information.

In the two envelopes problem the new information appears relevant but is
actually not, on its own, any more useful for reasoning about expected value
than knowing that there is a red or green piece of paper in either envelope.

-----------------

Approach 2: Keeping the quantities symbolic.

There are two envelopes with quantities x and y inside. We are told that
2x = y. After choosing, it is revealed that our envelope has quantity z. It
is not revealed whether z = x or z = y.

Let's consider the universe of possibilities.

Possibility one (50% chance): z = y. Value of switching: -x

Possibility two (50% chance): z = x. Value of switching: +x

Therefore the value of switching is: 0.5 * (-x) + 0.5 * (+x) = 0

The red herring here is that knowing the value z feels like information about
the values x and y, but it isn't.

The sleight of hand is in trying to say that the other envelope is worth
either 0.5z or 2z. This is false because there is an unknown but fixed
constant x. We don't know from the available information whether we are in
the universe z = x or the universe z = y.

In short: the two envelopes paradox mistakes an unknown constant for an
unknown variable. Knowing that z = 20 doesn't change the constants x and y.

~~~
bluecalm
Approach 1: There is in fact new information when you look into the envelope.
It's also very valuable, because it allows you to make a judgement taking
into account your knowledge about the world and the specific situation (who
puts money in the envelopes, what the general preferences of people in such
situations are, etc.).

Approach 2: You made the same mistake. Seeing the money is actually valuable
and very real information. The problem is that the original reasoning makes
wrong use of it. It doesn't mean there is no information or that we
can/should ignore it.

~~~
thatthatis
If the new information is relevant, how is this simulation code wrong?
[https://gist.github.com/tedtieken/6567112](https://gist.github.com/tedtieken/6567112)

~~~
bluecalm
The code is wrong because there is an assumption that the distribution is 50%
for $10 and 50% for $20. There is no basis for this (how do you know you
won't get $40 if you see $20?). See my other posts; I think this point is
well worth thinking about.

------
jader201
The paradox depends on the possibility of always being able to double the
amount. This isn't true.

Swapping once will either double or halve the amount; swapping back will just
do the opposite.

~~~
dools
Even better: if I put $10 into one envelope, and $20 in another envelope and
get you to choose one, what is the probability that the other envelope
contains $40? Zero.

------
scythe
This is a fantastic and subtle paradox, and not at all amenable to quick
resolution. A resolution follows; the first to spot a problem (if there is
one) gets a cookie.

Consider the amount in the lower to be f(x), the higher to be x, given f(z)
such that f(z) < z for all z. In this way we generalize to _all_
distributions, as the problem clearly applies to _all_ distributions. You open
the first envelope, which contains A. The second envelope contains B. The
challenge is to calculate the value of B.

First we must calculate x. The chance that A = f(x) is 1/2, the chance that A
= x is 1/2. The average value of A is (x + f(x))/2, so the average value of x
is [f+I]^{-1}(2A), where I is the identity function and [g]^{-1} denotes an
inverse function.

Now we calculate B. The trick lies in this: _both calculations must lie in the
same reference frame_. So B = x with probability 1/2, and B = f(x) with
probability 1/2, giving us B = (x + f(x))/2.

The rest is plug-and-chug: B = ([f+I]^{-1}(2A) + f([f+I]^{-1}(2A)))/2 --> B =
[f+I]([f+I]^{-1}(2A))/2 --> B = 2A/2, thus B = A.

Therefore over all probability distributions that can be defined we have the
average value of B equal to A in any case. A purely mathematical resolution is
satisfying, but I am not in any case an epistemologist, so it may not satisfy
people who take a different interpretation of math than me. It works, and I
like it.

~~~
jamii
The problem here is that you are adding together two values for a specific
case of x and then adding together the result for all values of x (is that
clear? probably not).

It's analogous to another age old problem: what is the value of 1 + (-1) + 1 +
(-1) + 1 + (-1) ...

You could argue that (1 + -1) + (1 + -1) ... = 0 + 0 ... = 0.

You could also argue that 1 + (-1 + 1) + (-1 + 1) ... = 1 + 0 + 0 ... = 1.

There are actually ways to group the numbers in that series to get any
integer answer you want :D
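That trick can be sketched for a non-negative integer target: front-load the extra +1 terms, then append cancelling (+1, -1) pairs (a rearrangement rather than a mere regrouping; negative targets would need extra -1 terms instead):

```python
# Front-load k extra +1 terms, then append cancelling (+1, -1) pairs. Both
# +1 and -1 still occur infinitely often in the full series, so this is a
# genuine rearrangement of 1 + (-1) + 1 + (-1) + ..., and the running total
# returns to k after every complete pair.
def rearranged_partial_sums(k, n_pairs=5):
    terms = [1] * k + [1, -1] * n_pairs
    sums, running = [], 0
    for t in terms:
        running += t
        sums.append(running)
    return sums

print(rearranged_partial_sums(3))
```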

Since we haven't actually defined a distribution over x the problem is not
well-defined anyway. But let's pick, say, P(x=2^i) = 1/(2^i) for all i>=1.

Then the expected value of A is

    
    
        E(A) = (1/2 * $2) + (1/4 * $4) + (1/8 * $8) ...
             = +infinity. 
    

Similarly E(B) = +infinity.

Now for E(B-A)

    
    
        E(B-A) = (1/4 * $2) + (1/4 * -$2) + (1/8 * $4) + (1/8 * -$4) ...
    

Like the example above, we can add this up in different ways:

    
    
        E(B-A) = ((1/4 * $2) + (1/4 * -$2)) + ((1/8 * $4) + (1/8 * -$4)) ...
               = $0 + $0 ...
               = $0
    

Or

    
    
        E(B-A) = (1/4 * $2) + ((1/4 * -$2) + (1/8 * $4)) + ((1/8 * -$4) + (1/16 * $8)) ...
               = $0.5 + $0 + $0 ...
               = $0.5
    

When adding up infinite numbers of things you have to be very careful. I go
into more detail in this thread -
[https://news.ycombinator.com/item?id=6387344](https://news.ycombinator.com/item?id=6387344)

Cookie please :D

~~~
scythe
In my original version of the post I had intended to argue for a geometric
mean, which was sort of a joke, as it only works where f(x) is a constant k*x.
So the statement "A false resolution follows, first to spot the problem..."
appears in the post. Upon seeing this, an enterprising and intelligent person
may have clicked "reply".

While writing this I realized that inverting the definition of A gave a
different resolution based loosely on operator theory, and posted that
instead. A couple minutes later I realized the mistake and changed the
beginning to refer to the first to spot "a" problem.

The argument you've presented is good, but you have attacked the
well-foundedness of the problem itself rather than the logic I've applied.
I'm not sure if you get the cookie.

~~~
jamii
Goddamn, what does it take to get a cookie around here? :D

------
dools
I read through the common resolution but I just can't get why you need all
this. The problem seems to be at the start: you select one of two amounts,
then you say the other amount is either double or half what you _chose_.

That means you have 3 amounts in the equation: 0.5x, x and 2x. But in reality
there are only 2 amounts: x and 2x.

So you have to state the problem like: you choose an envelope. The other
envelope either contains x or 2x. It can't contain 0.5x because that amount
never existed in the first place.

~~~
Osmium
Unless you picked 2x to start with, in which case the other envelope does
contain half of what you chose...

i.e. there are two envelopes, x and 2x, and you choose one. There is a 50%
chance that the envelope you didn't choose is half the value of the one you
picked (you pick 2x, and the other is x), or there's a 50% chance the envelope
you didn't choose is twice the value of the one you picked (you pick x, and
the other is 2x). I think it's misleading to introduce this "third" quantity
of 0.5x... but perhaps I'm missing something subtle, in which case I'd be
happy to be corrected!

~~~
dools
_I think it 's misleading to introduce this "third" quantity of 0.5x_

Yes, it _is_ misleading, that's my point :)

Put it another way: you have two quantities in two envelopes, A and B. If you
choose B, what is the probability that the other envelope contains a 3rd
quantity, C? The probability of that is zero, because C was not present in
the initial two envelopes.

Still another way of putting it: I will place $10 in an envelope, and $20 in
another envelope, then ask you to choose one. What are the chances that the
other envelope contains $40? The chance of that is zero, because the only two
amounts present at the start of the trick were $10 and $20.

------
tmoertel
Arbitrarily label the two envelopes A and B. Assume that no envelope is empty.
Let «X» denote the amount of money in envelope X.

We are told that either «A» = 2«B» or «B» = 2«A». Because these propositions
are exhaustive and mutually exclusive, we know that

    
    
        P(«A» = 2«B») + P(«B» = 2«A») = 1.      (1)
    

Since we have no knowledge that would make either proposition more plausible
than the other, we also have (by symmetry) that

    
    
        P(«A» = 2«B») = P(«B» = 2«A»).          (2)
    

From (1) and (2) we can solve for the individual probabilities of the
propositions:

    
    
        P(«A» = 2«B») = P(«B» = 2«A») = 1/2.    (3)
    

Therefore, from the initial conditions, we have no reason to prefer either
envelope.

Now we are given the information that we have picked one of the envelopes at
random (let's say it's A). We are further given the information that A
contains $20, that is «A» = $20. How does this new knowledge affect the
probabilities?

It has no effect because we can't say anything more about either proposition
without also knowing «B» as well, and we don't know it. That is,

    
    
        P(«A» = 2«B» | «A» = $20) = P(«A» = 2«B»)
    

and

    
    
        P(«B» = 2«A» | «A» = $20) = P(«B» = 2«A»).
    

Therefore, our probability assignments from (3) remain unchanged, and we have
no reason to prefer one envelope to the other, let alone swap A for B.

------
egreif1
This is a really cool problem. I don't claim to understand it, but here's one
way of thinking about it. The paradox assumes that when we see the value of
money in the envelope we open, we get no information about whether that's the
smaller or larger amount of money. Now, in practice, is that possible?

Let X,Y be i.i.d. draws from some distribution with CDF F, and say we see Y=y
when we open our envelope. If the distribution satisfies the constraints of
the problem, then we have to have that, for all possible values of y: P(X > Y
| Y = y) = 1/2 = P(X < Y | Y = y)

In practice, this is impossible. The reason is that P(X>Y | Y = y) =
(1-F(y)). Since F is a CDF, F(y) tends to 1 as y goes to infinity, so
(1-F(y)) tends to 0 and cannot equal 1/2 for all y. That is, the probability
that the other envelope contains the larger or smaller value is not
independent of the value we observe in the envelope we open. And, more
importantly, it can't be: such a distribution doesn't exist.
You have to consider the sampling distribution to get a meaningful
calculation.
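To make this concrete, here's a tiny check with an arbitrarily chosen concrete distribution, Exponential(1): the chance that the other (independent) draw exceeds what we saw is 1 - F(y), which clearly depends on y rather than sitting at 1/2:

```python
import math

# For a concrete distribution the paradox's "always 50%" assumption fails:
# with Y ~ Exponential(1), P(X > Y | Y = y) = 1 - F(y) = e^(-y), which
# depends on y instead of being the constant 1/2 the paradox needs.
def p_other_larger(y):
    F_y = 1 - math.exp(-y)   # CDF of Exponential(1)
    return 1 - F_y

for y in (0.1, 0.7, 2.0, 5.0):
    print(y, p_other_larger(y))   # decreases from near 1 toward 0
```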

------
level09
This sounds like a special case of the secretary problem
[http://en.wikipedia.org/wiki/Secretary_problem](http://en.wikipedia.org/wiki/Secretary_problem),
where instead of envelopes you swap secretaries and additionally you have a
time constraint.

I find it very interesting how these statistical problems can be projected on
our own life (swapping jobs, finding a better partner etc ..)

------
pierrebai
The paradox is interesting to discover and discuss errors in reasoning, but we
should agree from the start that any proof that one should always switch (or
that any behaviour could increase the expected gain) is flawed.

This can be easily proven by:

1. Taking at face value that choosing an envelope is random.

2. Adding a second player that is given the other envelope.

In that scenario, any reasoning that leads one player to always switch would
apply to the other player too. Both players would swap their envelopes.
Clearly, the expected value for both cannot go up. Thus any reasoning that
mandates switching /has/ to be wrong.

(If you have issue with adding another player, instead assume a parallel
universe where the player chose the other envelope.)

The error in reasoning is usually to assume that the envelope contains X and
then to reason about X/2 and 2X. In reality, the envelopes always contain X
and 2X. When you actually use the parallel universe version, clearly one has
X and the other has 2X. Nobody has X/2 (nor 4X, nor any other expected value
your theory may come up with).

------
bluecalm
Many explanations here contain the same mistake: some reasoning -->> seeing
the money doesn't change anything and/or doesn't count as information.

If you arrive at this point you've made a mistake. Seeing the money is
valuable information. The trick is not in ignoring it but in making good use
of it, like this:

-use your best knowledge about the person who puts money in the envelopes, general human tendencies and how the world works to approximate the distribution of money in the envelopes

-pick according to your personal utility of money given that distribution

What the wrong reasoning in the original puzzle suggests is to assume a
uniform distribution, which is impossible but more importantly baseless. As
another poster shows, there are baseless distributions you could assume which
lead you nowhere as well.

Just don't let that detract from the main point: there is new information,
but making use of it is not that easy.

------
thatthatis
If it were possible to switch and gain on average, we could write a 10,000 or
so iteration Monte Carlo simulation of the switching algorithm to demonstrate
the gains.

The envelopes hold either x or 2x. One envelope has amount z, but the agent
doesn't know if z = x or z = 2x.

If it were true that not switching leads to an EV of 3x/2 but switching leads
to a higher EV, then a single switch each round should come back with an
average value greater than 3x/2.

In the Monty Hall problem, we can code up such an agent to demonstrate that
the naive intuition is wrong. However, in this case, regardless of how many
times we switch, we still end up with an average value of 3x/2.

Therefore there is an error in the math that says that switching has a higher
EV than staying.
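The simulation described above (fixed amounts, optional switch) can be sketched roughly as follows; this is a minimal Python sketch, not the linked gist, and the function names are mine:

```python
import random

def play(switch, x=1):
    # Envelopes hold x and 2x; pick one uniformly, optionally switch.
    envelopes = [x, 2 * x]
    pick = random.randrange(2)
    if switch:
        pick = 1 - pick
    return envelopes[pick]

def mean_payoff(switch, rounds=100_000):
    # Average payoff over many rounds of the game.
    return sum(play(switch) for _ in range(rounds)) / rounds

# With x = 1, both means hover around 1.5, i.e. 3x/2, switch or not.
print(mean_payoff(False), mean_payoff(True))
```

Because x is fixed, the sample mean converges quickly to 3x/2 either way, which is exactly the point being made here.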

~~~
jamii
Go ahead and run a few million simulations of the problem and graph the
running mean over time. You might be surprised :)

Be sure to use exact arithmetic (eg
[http://docs.python.org/2/library/fractions.html](http://docs.python.org/2/library/fractions.html))

~~~
thatthatis
[https://gist.github.com/tedtieken/6567112](https://gist.github.com/tedtieken/6567112)

Run the simulator and you'll see: the EV is 3x/2, regardless of switching
behavior.

Feel free to modify it if you think I've inaccurately conceptualized the
problem.

Also, please excuse some non pythonic names, I'm writing the code on an
iPhone.

~~~
jamii
Sorry, I should have been more specific :S

You have assumed a fixed amount in each envelope. The article leaves the
amount unspecified but implicitly assumes that there is no maximum amount that
could be in the envelope. The problem only becomes interesting for certain
distributions.

Try this code:

[https://gist.github.com/jamii/6567205](https://gist.github.com/jamii/6567205)

If you run it with the uniform distribution the running mean will eventually
converge to 0. If you comment that out and uncomment the exponential
distribution it is much more interesting :)

With the exponential distribution the expected value of switching does not
even exist.

EDIT Doh, the original gist was totally wrong. That'll teach me to argue on
the internet at 3am. The updated gist is correct.
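The heavy-tailed case can be sketched without the gist; a hypothetical Python simulation (function names mine) using the distribution jamii proposed earlier, where the smaller envelope holds $2^i with probability 2^-i:

```python
import random

def sample_gain():
    # Draw i geometrically: P(i) = 2^-i for i = 1, 2, 3, ...
    i = 1
    while random.random() < 0.5:
        i += 1
    small, large = 2 ** i, 2 ** (i + 1)
    # Pick one of the two envelopes uniformly; return the gain from swapping.
    return (large - small) if random.random() < 0.5 else (small - large)

# The running mean of the swap gain never settles down: the defining series
# is conditionally but not absolutely convergent, so the expectation is not
# well-defined and no law of large numbers applies. Occasional huge draws
# keep knocking the running mean away from zero.
total = 0
for n in range(1, 1_000_001):
    total += sample_gain()
    if n % 100_000 == 0:
        print(n, total / n)
```

The amounts are exact integers here, so there is no floating-point loss in the accumulation; only the printed running mean is a float.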

~~~
thatthatis
Fixed amount vs variable amount is irrelevant, variable amount just requires
higher n.

Either way, the gain from switching approaches zero as n approaches infinity.

~~~
jamii
It does make a difference but I messed up the code :S

If you try the updated gist you will find that the second distribution appears
to converge for a while but always jumps away again. I've run it now for
20540000 rounds and it's further away than when it started.

------
cookingrobot
Here's how I see it: the important unstated variable is how big a budget the
host of the game has. If you know that you're playing with a billionaire who
loves this party trick, and is probably willing to give away a million
dollars on this, then the $20 you found in your envelope is probably the
smaller value. If your envelope contains $800,000 it's probably the bigger
one.

The host has chosen a number randomly between 1 and N, and they're not
necessarily going to tell you what N is. But in the long run you'll find that
values closest to N/2 are more common than values further from N/2.
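This budget argument can be sketched in Python. The model is my assumption, not the commenter's exact setup: the host draws the smaller amount uniformly below a known budget, and the informed strategy keeps any amount exceeding the budget (which can only be the doubled envelope) and switches otherwise:

```python
import random

BUDGET = 100.0

def play(strategy):
    # Host draws the smaller amount uniformly from (0, BUDGET);
    # the two envelopes then hold x and 2x.
    x = random.uniform(0, BUDGET)
    z = random.choice([x, 2 * x])   # the envelope we opened
    other = 3 * x - z               # the one we didn't
    return other if strategy(z) else z

def never(z):
    return False

def switch_if_small(z):
    # An amount above BUDGET can only be the doubled envelope: keep it.
    # Below BUDGET, the observed amount is more likely the smaller one,
    # so switching has positive expected gain.
    return z <= BUDGET

n = 200_000
baseline = sum(play(never) for _ in range(n)) / n
informed = sum(play(switch_if_small) for _ in range(n)) / n
# The informed strategy's mean payoff is strictly higher than the baseline.
print(baseline, informed)
```

Under this model the baseline averages 1.5x the mean smaller amount (about 75 here), while the threshold strategy does noticeably better, which illustrates why seeing the amount is genuine information once you have a prior on the host's budget.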

~~~
russellsprouts
However, it says in the problem that you are offered the switch before you
open the envelope. So, you can't know if your original choice was $20 or
$800,000.

------
heydanreeves
Since you don't know what is in either envelope, surely both theoretically
contain both x and 2x (à la Schrödinger's cat). Only upon opening an envelope
will we know the amount; which one you take, or how many times you swap,
makes no difference.

Also, swapping, in my understanding, doesn't increase your probability. In
the first choice you have a 1 in 2 chance of getting the higher amount. In
the second choice (the ability to swap, which is fundamentally the same as
choosing between the two envelopes) you have a 1 in 2 chance of getting the
higher amount.

Correct me if I'm wrong.

------
russellsprouts
Correct me if I'm wrong, but here's my analysis.

There are two envelopes. One contains a value x, and the other has 2x. You
pick an envelope at random. Let's call the envelope you choose z. The variable
z is a random variable that has an equal chance of being x or 2x, so the
expected value of z is 1.5x.

If you switch, there are two possibilities. In one case, you picked x, so
switching gains you .5x over the expected value of 1.5x.

In the other case, you picked 2x, so switching loses you .5x from the expected
value of 1.5x. So, the expected gain from switching should be zero.
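The two cases above can be checked with a line of arithmetic; taking x = 1 for concreteness:

```python
x = 1.0
# Two equally likely cases: you hold x (switching gains x),
# or you hold 2x (switching loses x).
expected_gain = 0.5 * (2 * x - x) + 0.5 * (x - 2 * x)   # 0.0

# The expected holding is the same before and after switching:
expected_hold = 0.5 * x + 0.5 * (2 * x)                 # 1.5x
```

The key is that both cases are expressed in terms of the same x (the smaller amount), rather than conditioning on the amount in your own envelope, which is what introduces the spurious 2X-vs-X/2 asymmetry.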

------
asciimo
Just take both envelopes and run.

~~~
eugeneross
I agree. Plus, both have money. Win win.

------
CalvinCarmack
[https://gist.github.com/CalvinCarmack/6567168](https://gist.github.com/CalvinCarmack/6567168)

    
    
        $1500308 500308/1000000 0.500308 never switched
        $1500301 500301/1000000 0.500301 always switched

------
whalemaker
Interesting.

This reminds me of a bizarre version of the Monty Hall problem:
[http://en.wikipedia.org/wiki/Monty_Hall_problem](http://en.wikipedia.org/wiki/Monty_Hall_problem)

------
jheriko
the flaw here is assuming that the value being higher on average in the other
envelope is a sufficient criterion to make swapping the best choice. this is
not how probability works: if i have a million empty envelopes and one with a
billion dollars in it, the average value of each envelope being $1,000
doesn't mean anything in the face of the 1-in-a-million chance of actually
picking the right one.
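The envelope example above can be checked numerically; a hypothetical Python sketch (the figures come from the comment, the code is mine):

```python
import random

def draw():
    # A million envelopes; one holds $1,000,000,000, the rest are empty.
    return 1_000_000_000 if random.randrange(1_000_000) == 0 else 0

# The expected value per envelope is $1,000, but the distribution is so
# skewed that the typical outcome of any modest number of draws is $0.
trials = [draw() for _ in range(100)]
```

This is the gap between expected value and typical outcome that makes expected value a poor proxy for human preferences over one-shot gambles.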

~~~
harryh
Actually, that's exactly how expected value works.

~~~
jamii
I think the point he is getting at is that expected value (or even expected
utility) is not actually a good measure of human preferences.

Additionally, in this problem there are certain distributions over the amounts
in the envelope for which the expected value is not defined, making it an
interesting problem to study.

[http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow](http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow)
is a good book exploring mathematical models for decision making.

------
Mazz
Just read it... Here is what I came up with. I start work at 9:00 am; before
work I always go to my favorite coffee bar, called "Muazam's Coffee", a
10-minute walk from my workplace. So I arrive there at 8:30 am to have a
morning mocha with breakfast while reading the newspaper. One day a man
approached me right before the coffee bar's entrance and asked if he could
try a problem on me; he promised I would get enough cash for a coffee.

Well, why not I thought. He presented me Two Envelopes and said "One contains
twice as much as the other. You may pick one envelope and keep the money it
contains. You pick at random, but before you open the envelope, you are
offered the chance to take the other envelope instead."

Without giving it a single thought, I picked the one on the left and began to
unfold it. He looked at me and said "don't you want to swap?!". I had just
unfolded the envelope and pulled out $20. "Too late for that, I guess," I
said. I could tell by the look on his face that he was disappointed; he
unfolded the other envelope and pulled out $10. He left without saying a
word.

I went to the coffee bar and spent the money I'd just gotten. I took my
coffee and breakfast and sat down at one of the outside tables. I saw the
Envelope man again; this time he had found another guy. He had two new
envelopes and asked the guy the same question. But this guy took out a pen
and paper and started doing some math. It was already 8:47; I had to go to
work.

Next day, same routine. I go to Muazam's Coffee bar, and I see the Envelope
man and the mathematician guy (let's call him Joe) sitting at one of the
tables outside the bar. Joe is still trying to solve the problem. I was kind
of impressed by how much effort Joe had put into this; he already had four A4
pages full of notes that looked like formulas.

This went on for weeks, months, years.

Everyday, before work I would see them outside. The Envelope man having two
envelopes by his side and Joe trying to solve the problem.

28 years had passed...

I was going to Muazam's Coffee bar as usual when I noticed there was an
ambulance outside. I went to see who they had come for; to my surprise, it
was Joe. I asked the Envelope man what had happened, and he told me Joe had
had a heart attack.

Joe was admitted to a hospital; he was in bad condition. The Envelope man
visited him and said "Joe, I am proud of you. We have spent so much time
together and you still don't have the perfect formula to solve this problem.
Maybe this problem can't be solved with math..".

Joe looked him in the eye and said "No, I won't give up. I will get this right
and get the highest amount of cash". The envelope man said "You can have them
both" and he unfolded both of the envelopes. One containing $1 and the other
one $2. This is where Joe had another heart attack.

