
Seven Puzzles You Think You Must Not Have Heard Correctly (2006) [pdf] - jmount
https://www.math.dartmouth.edu/~pw/solutions.pdf
======
deathanatos
_Love in Kleptopia_ needs to be explained better. The problem can only be
solved if you can affix two padlocks onto a box, and I was presuming the lock
box had a single, normally shaped padlock eye, which would make such a thing
impossible.

I find this happens a lot with "thought" problems: I can't solve it (and can
often prove that) because the rules of the problem are inadequately explained.

~~~
Moodles
Reminds me of this: 1, 2, 3, 4, 5, 6, 7, 8, 9,... What comes next?

10 if it's the sequence of natural numbers

13 if it's the sequence of N such that N=2^n for natural numbers n, where N
does not contain a 0

153 if it's the sequence of N such that the sum of the digits of N each raised
to the power of the number of digits in N equals N.

~~~
pwagland
I fully agree with the premise that any sequence of numbers has an essentially
infinite choice of 'next number', depending on how you define your sequence.
The simplest way to prove that is, given a sequence of _n_ numbers, to fit a
degree-_n_ polynomial through them; since the sequence is incompletely
specified, you have an infinite number of solutions. Or just drop in a
Heaviside step function
([https://en.wikipedia.org/wiki/Heaviside_step_function](https://en.wikipedia.org/wiki/Heaviside_step_function)).
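
The polynomial trick can be demonstrated directly with Lagrange interpolation
(a sketch; the choice of 42 as the "next" term is arbitrary -- any value works):

```javascript
// Fit a polynomial through (1,1)...(9,9) plus an arbitrary chosen "next"
// point (10, 42). The degree-9 polynomial still reproduces the first nine
// terms exactly, because each Lagrange basis term vanishes at the other nodes.
function lagrange(points, x) {
  return points.reduce((sum, [xi, yi], i) => {
    let term = yi;
    points.forEach(([xj], j) => {
      if (j !== i) term *= (x - xj) / (xi - xj);
    });
    return sum + term;
  }, 0);
}

const pts = [1, 2, 3, 4, 5, 6, 7, 8, 9].map(n => [n, n]).concat([[10, 42]]);
const polySeq = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].map(x => Math.round(lagrange(pts, x)));
console.log(polySeq.join(', ')); // 1, 2, 3, 4, 5, 6, 7, 8, 9, 42
```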

However, I'm not sure that I understand your second example.

If N=2^n (assuming 2 to the power of n, not 2 xor n) then one would expect the
sequence to be 1, 2, 4, 8, 16, etc.

If N=2^n using the C xor notation, then we would expect a sequence 3, 0, 1, 6,
7, 4, etc
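
The xor reading is easy to check in a JS console, where `^` is bitwise xor just
as in C:

```javascript
// In JavaScript (as in C), ^ is bitwise xor, not exponentiation.
const xorSeq = [1, 2, 3, 4, 5, 6].map(n => 2 ^ n);
console.log(xorSeq.join(', ')); // 3, 0, 1, 6, 7, 4
```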

Your third example is pure evil, and I'm glad that I never had you as a maths
teacher ;-)

~~~
allenz
It's the sequence of n such that 2^n doesn't contain the digit 0. This
excludes 10 (1024), 11 (2048), and 12 (4096). See
[http://oeis.org/A007377](http://oeis.org/A007377)
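
A quick filter reproduces the sequence (a sketch; A007377 itself also starts
at n = 0, since 2^0 = 1 contains no zero):

```javascript
// n such that the decimal digits of 2^n contain no 0.
// 10 (1024), 11 (2048) and 12 (4096) are excluded; 13 (8192) is in.
const noZeroExponents = [];
for (let n = 1; n <= 15; n++) {
  if (!String(2 ** n).includes('0')) noZeroExponents.push(n);
}
console.log(noZeroExponents.join(', ')); // 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 15
```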

------
millstone
In the Dot-town suicides, say the stranger says "Alice has a blue dot." Alice
then kills herself. Why would the other residents die?

I think this satisfies the requirements: there's certainly some number of blue
dots for which the statement would be false, namely zero.

~~~
cortesoft
Yeah, I was thinking about if the stranger said something like, “not all the
dots are blue.” This is non-trivial by the definition given, and no one would
have to kill themselves at all.

~~~
uiri
"not all the dots are blue" - in the case where there are n people with n-1
blue dots and 1 red dot, the person with the red dot kills themself, and then
everyone else kills themselves because they know that the red dot person
killed themself because they did not see any red dots.

~~~
cortesoft
Right, but if there were two red dots the town is fine.

In a one red dot town, you could say “there is at least one blue dot”

~~~
anyfoo
Would they be fine?

So let's say there are two people with red dots, let's name them Ruth and
Rudy. Ruth now knows that "not all dots are blue" (which is equivalent to
"there is at least one red dot"). She sees Rudy with the red dot: Fine, here's
the person with the red dot.

But wait a minute, why is Rudy not killing himself? If Rudy is the only person
with a red dot, he should have seen only blue dots... however he knows there
has to be at least one red dot, so if he only saw blue dots, the red dot must
have been him, so he would have killed himself.

But he didn't, so that means he has seen at least one red dot, which isn't
him, and isn't anybody else Ruth has seen because she only saw blue dots
otherwise.

Oh no, it must be Ruth with the second red dot!

~~~
cortesoft
Ok now it is making sense. The same would be true if you went to three
dots.... the third person would think, “wait, why aren’t the two blue dotted
people killing themselves? There must be a third blue dot... wait, the third
blue dot must be me”

That makes sense. What about the other commenter, who said something like
“Alice has a blue dot.”

Wouldn’t that not lead to everyone’s death?

~~~
Deestan
In your three dots logic, this would have a red dot person kill themselves if
there are two blue dots: The two blue dotted people aren't killing themselves
because they are still thinking "wait, why isn't he killing himself" about the
other.

I feel like some timing procedures need to be defined.

~~~
cortesoft
I think the 'they meet every night' is supposed to be the 'timing procedure'

------
tomp
Another very counterintuitive (for me) problem:

 _how do you do better than break even in the following game: 'A' chooses two
distinct integers, writes them on slips of paper and holds one out in each
hand in a fist. You choose a hand and reveal a number. You must then guess
whether the other number is higher or lower than the revealed one, winning $1
if you guess right and losing $1 otherwise._

~~~
pyk
After writing this one out, this reminds me of the Monty Hall problem. In this
case my guess is that you use a prior -- assume the two unknown numbers are A
& B, and then assume a random integer yourself C.

From there, if A (the first revealed number) is less than C, then that narrows
the remaining cases giving you a 2/3 chance. If A is greater than C, that also
narrows the remaining cases and gives you a 2/3 chance as well.

On a number line, the cases are below.

If A > C then the six originally equally possible cases are narrowed to three
cases:

A-----B-----C (impossible)

A-----C-----B (impossible)

B-----A-----C (impossible)

B-----C-----A B < A

C-----A-----B B > A

C-----B-----A B < A

So you would guess B < A -- the first hand's number is higher with probability
2/3.

If A < C then the six originally equally possible cases are also narrowed to
three cases:

A-----B-----C B > A

A-----C-----B B > A

B-----A-----C B < A

B-----C-----A (impossible)

C-----A-----B (impossible)

C-----B-----A (impossible)

So you would guess B > A -- the first hand's number is lower with probability
2/3.

~~~
OscarCunningham
This works, but you don't get a probability of 2/3, because the cases aren't
equally likely. There isn't a probability distribution for C such that for any
A and B the probability of A<C<B is 1/3. We would have to have P(0<C<10)=1/3
and P(10<C<20)=1/3 and P(0<C<20)=1/3, which is impossible.

~~~
conorh
Why does this work with probability 2/3 then?

[https://jsfiddle.net/q4qbewp1/](https://jsfiddle.net/q4qbewp1/)

~~~
OscarCunningham
Because of the particular distributions that you happen to have chosen. Alice
doesn't have to pick uniformly at random. Since 0^p=0 and 1^p=1 we can
experiment with different distributions by using "Math.pow(Math.random(),p)"
in place of "Math.random()". For example:

    
    
      var prior = Math.pow(Math.random(),1);
      var hand1 = Math.pow(Math.random(),10);
      var hand2 = Math.pow(Math.random(),10);
      > Win probability: 0.5757182
    

and

    
    
      var prior = Math.pow(Math.random(),1);
      var hand1 = Math.pow(Math.random(),0.1);
      var hand2 = Math.pow(Math.random(),10);
      > Win probability: 0.9026432
    

But the win probability will always be >0.5 so long as the "prior" probability
distribution has a nonzero probability of being between Alice's two numbers.

~~~
jstanley
> the win probability will always be >0.5 so long as the "prior" probability
> distribution has a nonzero probability of being between Alice's two numbers.

Wow. This is the key piece of information that makes the problem interesting,
IMO. That's quite unintuitive.

If you know _anything at all_ about how your opponent chooses numbers, you win
in the long term.

~~~
kd5bjo
It also tells us what Alice's optimal strategy is: pick the first number at
random, and then select an adjacent integer as the second number. Thus,
there's no space between them that your prior can assign any probability to.

~~~
OscarCunningham
Well, your strategy has to have some way of breaking ties, for when Alice's
number is the same as yours. Let's say that you always say "higher" in that
case. Then you win whenever your number is between Alice's two numbers or
equal to the smaller of them. Equivalently you could just pick a random
half-integer.
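
The half-integer threshold strategy is easy to simulate (a sketch; drawing
Alice's numbers and the threshold uniformly from 0..99 is an arbitrary
assumption -- any prior that can land between her numbers gives an edge):

```javascript
// Alice picks two distinct integers; we pick a random half-integer
// threshold C and guess "higher" iff the revealed number is below C.
function playRound() {
  const a = Math.floor(Math.random() * 100);
  const b = Math.floor(Math.random() * 100);
  if (a === b) return playRound();            // redraw until distinct
  const c = Math.floor(Math.random() * 100) + 0.5;
  const revealed = Math.random() < 0.5 ? a : b;
  const hidden = revealed === a ? b : a;
  const guessHigher = revealed < c;
  return guessHigher === (hidden > revealed) ? 1 : 0;
}

let gameWins = 0;
const gameTrials = 200000;
for (let i = 0; i < gameTrials; i++) gameWins += playRound();
console.log((gameWins / gameTrials).toFixed(3)); // reliably above 0.5 (about 2/3 here)
```

The win probability is 1/2 plus half the chance that C lands strictly between
Alice's two numbers, which is why any overlapping prior beats break-even.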

------
jmblpati
I think there's an easier to visualize solution to the box problem.

Let A have dimensions (a,b,c) and let B have dimensions (x,y,z). Assume A fits
inside B.

We have (a+b+c)^2 = a^2 + b^2 + c^2 + 2ab + 2ac + 2bc. This is A's hypotenuse
(space diagonal) squared plus its surface area. The same holds for B.

Note that A's hypotenuse is at most that of B: the hypotenuse of A is its
longest segment, and it needs to fit inside B somehow. Further, note that the surface
area of A is less than that of B. To see this, consider the nesting of A
inside B and realize that both boxes' interiors are convex sets. Imagine
inflating A inside of B by taking the sets A_t consisting of all points within
B that are within distance t of a point in A. It is not hard to see that this
inflating operation can only increase the surface area of A, and since the
maximum surface area we can get is that of B we have that A has smaller
surface area than that of B. Thus,

(a+b+c)^2 = (A hypotenuse)^2 + (A surface area) <= (B hypotenuse)^2 + (B
surface area) = (x+y+z)^2. The claim follows.
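
The algebraic identity is easy to sanity-check numerically (a sketch with
arbitrary example dimensions; the geometric step -- that nesting implies
smaller diagonal and smaller surface area -- is the part that needs the
convexity argument above):

```javascript
// (a+b+c)^2 = (space diagonal)^2 + (surface area), for a box with sides a, b, c.
function diagSq([a, b, c]) { return a * a + b * b + c * c; }
function surfaceArea([a, b, c]) { return 2 * (a * b + a * c + b * c); }

const inner = [1, 2, 3];  // example inner box
const outer = [2, 3, 4];  // example outer box; inner fits axis-aligned

const lhs = (1 + 2 + 3) ** 2;
console.log(lhs === diagSq(inner) + surfaceArea(inner));  // true: 36 = 14 + 22
console.log(diagSq(inner) + surfaceArea(inner)
         <= diagSq(outer) + surfaceArea(outer));          // true: 36 <= 81
```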

------
quickthrower2
I loved this. Some comments. Not really spoilers I hope...

For "Unwanted Expansion" the answer is technically correct, but I am displeased
that it doesn't prove there won't be any infinite loops, whereas analyzing an
invariant in the tree would prove that.

For Boxes in Boxes it says "But, if we take ε to be huge", but how big is
huge, and what if it isn't huge? Seems like something in the proof is being
hand waved over.

The natives and suicides puzzles I enjoyed; they really made me go aah! The
irony about the suicides is that if the people were too dumb to apply the
logic, they'd survive.

~~~
Sniffnoy
"If we take ε to be huge" just means "let's analyze the growth rate as ε goes
to infinity". More formally, what's going on is that you have a function which
is polynomial in ε and which is always positive; therefore the leading
coefficient must be positive, as if it were negative, the polynomial would be
eventually negative.

The question "what if it isn't huge" makes no sense; we can pick ε, it's not
some external given. (Actually, as mentioned, we're not picking it to be one
specific value but rather letting it tend to infinity, but that's another
matter.)

~~~
quickthrower2
Thanks. Nice explanation. It's been a while (18 years) since I've read maths
proofs.

------
hyperpape
Problem 7 has an even more fiendish counterpart:
[https://en.wikipedia.org/wiki/The_Hardest_Logic_Puzzle_Ever](https://en.wikipedia.org/wiki/The_Hardest_Logic_Puzzle_Ever).

~~~
quickthrower2
To make it more fiendish, what if the gods / natives don't know about each
other's TRUE/FALSE/RANDOM status (but they are excellent logicians, so they
can intuit any information from answers to your previous questions).

------
Sniffnoy
The "boxes in boxes" problem actually turns out to be true in much greater
generality, using a different solution:
[https://math.stackexchange.com/questions/1909085/suppose-
a-b...](https://math.stackexchange.com/questions/1909085/suppose-a-box-a-is-
contained-in-a-box-b-is-the-total-volume-of-the-k-faces-of-a)

------
johnc1231
I always liked the prisoner box question, but I prefer the phrasing where the
prisoners are assigned a number and the boxes are numbered. I feel like the
"prisoners have to come up with a name to number mapping" step just gets in
the way of the interesting part.

~~~
CamperBob2
I don't understand one assertion that the author makes in the solution,
though:

    
    
        If it happens that the permutation has no cycles of 
        length greater than 50, this process will work every 
        time and the prisoners will be spared.
    

Obviously that's not true as written, because the first prisoner has odds of
50% no matter what function or algorithm they use to choose the boxes they
open. If they fail in their initial guesses, the game stops immediately and
everyone dies. What am I missing?

Furthermore, if you simplify the case to two prisoners and two boxes, where
each is allowed to open one box, the odds of "success" are clearly only 25%.
What happens as the number of prisoners and boxes grows that _improves_ the
odds? This isn't a classic Monty Hall variation where the participants have
additional options as the game progresses -- it's completely predetermined.

~~~
fydorm
If it has cycles of 50 or less only, then they're guaranteed to find their own
name (they start the cycle on the box corresponding to themselves, so that
cycle must contain them). Incidentally, the "first prisoner" might as well be
every prisoner, because they aren't allowed to observe each other,
communicate, or modify the room.

~~~
CamperBob2
_If it has cycles of 50 or less only, then they're guaranteed to find their
own name (they start the cycle on the box corresponding to themselves, so that
cycle must contain them)_

But there are 100 boxes, assigned at random. The prisoners can come up with a
mapping function that predetermines which boxes they will open based on their
names, but that function will have no relationship to the one (if any) that
was used by the warden.

There is no way to guarantee that the first prisoner finds his name, and that
seems to be true for all of the others. It must genuinely be a case where I
haven't understood the problem correctly.

~~~
function_seven
If no cycle is longer than 50 boxes (~30% chance of that being true), then by
starting with the box that matches your number, you have a 100% chance of
navigating to the box containing your number before your 50-box limit is
reached. It’s impossible to start in the wrong cycle, because that cycle
contains neither the pointer nor the value.

You have to find your pointer in a circular linked list. But it’s only singly-
linked, so you start just in front of it and work around the links the long
way. 30% of the time, all the lists are 50 elements or shorter, meaning
everyone is guaranteed to succeed.
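
The cycle-following strategy can be sketched directly (hypothetical code,
using 0-indexed prisoners and boxes; the small 4-cycle example and the
Monte Carlo estimate are my own illustrations):

```javascript
// boxes[i] = the number hidden in box i (a permutation of 0..n-1).
// Prisoner p opens box p first, then follows the slips, up to `limit` opens.
function prisonerFinds(boxes, p, limit) {
  let box = p;
  for (let opens = 0; opens < limit; opens++) {
    if (boxes[box] === p) return true; // found own number
    box = boxes[box];                  // follow the cycle
  }
  return false;
}

function allSucceed(boxes, limit) {
  return boxes.every((_, p) => prisonerFinds(boxes, p, limit));
}

// Deterministic check: a single 4-cycle fails with a limit of 2,
// succeeds with a limit of 4.
const fourCycle = [1, 2, 3, 0];
console.log(allSucceed(fourCycle, 2), allSucceed(fourCycle, 4)); // false true

// Monte Carlo estimate of the ~30% figure for 100 boxes, 50 opens.
function randomPermutation(n) {
  const a = Array.from({ length: n }, (_, i) => i);
  for (let i = n - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

let ok = 0;
const permTrials = 10000;
for (let t = 0; t < permTrials; t++) {
  if (allSucceed(randomPermutation(100), 50)) ok++;
}
console.log((ok / permTrials).toFixed(2)); // about 0.31, i.e. 1 - ln 2
```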

~~~
CamperBob2
But what if your name is in box #51, which you aren't allowed to open?

~~~
bewaretheirs
Then there's a cycle of length > 50 and everyone loses.

------
agf
Previous discussion:
[https://news.ycombinator.com/item?id=12380879](https://news.ycombinator.com/item?id=12380879)

Some additional fun ones in the comments there.

------
zawerf
Where can I find more of these?

Do you guys recommend this author's books such as
[https://www.amazon.com/Mathematical-Puzzles-Connoisseurs-
Pet...](https://www.amazon.com/Mathematical-Puzzles-Connoisseurs-Peter-
Winkler/dp/1568812019#reader_1568812019) ?

(It's hard to find the right search term for this that doesn't return a lot of
non-mathematical brainteasers or stuff aimed at kids)

~~~
ink_13
Winkler's book looks pretty good, yes.

You'd probably also enjoy the works of Martin Gardner, who wrote a
recreational mathematics column in Scientific American for 25 years, which has
been collected across many volumes.

------
augbog
Awesome post! I kind of want to find more of these types of things (solutions
to seemingly complex questions). If anyone has recommendations would love to
hear

~~~
jimmies
I read quite a few books of that kind when I was young. I didn't read them in
English, but "One Hundred Thousand Whys" - a maths one in Chinese - was a
great one, and this [1] in Vietnamese is a great one too. They are all old
books, and some have solutions we can now figure out using a computer, but in
general they are really cool.

My personal favorite that has an English translation available, although it's
not on maths, is "Physics for Entertainment" by Perelman -- you can get it on
Amazon right now. It was published 100+ years ago and the problems are still
relevant now. It discusses questions such as "When you jump out of a train,
should you jump forward or backward?" and "Who gets wetter, a person standing
in the rain or a person running in the rain?"

I guess basic maths and science hasn't changed that much, or maybe I was
reading them when I was young, so I might not have known any better.

1: [http://khoia0.com/PDF-Files/80-BAI-TOAN-THONG-
MINH_handuc_v2...](http://khoia0.com/PDF-Files/80-BAI-TOAN-THONG-
MINH_handuc_v2.pdf)

------
JackCh
The worst question of the bunch is undoubtedly the tennis question. A very
clever alien could answer the other six, but having never been exposed to the
rules of tennis would be hopeless against the Wimbledon question.

(In fact my knowledge of tennis is poor enough that I don't understand why the
question would be considered difficult for somebody who played tennis.)

------
mikenew
Spent quite a while on the Wimbledon problem before giving up and reading the
answer. But the answer is wrong, because they don’t play tiebreakers at
Wimbledon (or any of the grand slams besides the US Open). Pretty frustrating
TBH. Kind of felt the same about the padlocked boxes one too, although at
least that one is more just a little ambiguous vs outright wrong.

~~~
arachnids
They don't play tiebreakers in the fifth set. All other sets work normally.
The answer is correct because it's only talking about the first three sets.

~~~
3131s
Yeah, but it's so easy! Why is that considered a hard problem? Just because
most people are unfamiliar with tiebreakers in tennis?

------
emsy
Honestly I already lost interest after reading the solution to the first
problem, because there's no indication that the prisoners are allowed to alter
the boxes in advance. This is such a dumb "gotcha".

Edit: I re-read the solution and it doesn't require actual labeling (It's hard
to imagine, because I'm not a native English speaker).

~~~
XCSme
What are you talking about? They don't alter the boxes in any way, they just
say "hey, I'm number 1, you're number 2, etc...".

~~~
emsy
you're right, thanks for pointing it out.

------
learnstats2
The Dot Town Suicides appears to be incorrect as stated.

Suppose there are 100 residents and 2 blue dots. The stranger tells everyone
"there are not 98 blue dots."

This meets the question's definition of "non-trivial", but the residents can
relax: nobody has learned anything new. The proposed induction is broken, in
my opinion.

~~~
Hoagy
I thought this initially as well: if there are many of each colour, and the
only information is that there is at least one blue dot, how could this be new
information to trigger the suicides?

But this isn't correct. Let's say n = 6, a-c are blue, d-f are red, and the
information is that there is at least one blue dot. a knows there are at least
two, but thinks b may believe there is only one blue dot. Moreover, a believes
b may believe that c may believe that there are no blue dots.

So if a was red, the information would mean that b would commit suicide if c
did not initially commit suicide, but since neither has happened, a must have
a blue dot and therefore must commit suicide.

A convoluted process, to be sure, but does it make sense how the information
transmits by ruling out hypotheticals?
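
The induction can be written out as a little simulation. This is a sketch that
encodes the derived rule (a blue-dotted resident who sees b blue dots concludes
on night b+1) rather than re-deriving it, and it assumes the announcement is
"there is at least one blue dot":

```javascript
// dots: array of 'blue' / 'red' (must contain at least one blue dot).
// After the announcement, a blue-dotted resident who sees b blue dots
// reasons: "if I were red, those b blues would have acted on night b;
// they haven't, so I am blue" -- and acts on night b + 1.
function firstSuicideNight(dots) {
  let night = 0;
  while (true) {
    night++;
    const dying = [];
    dots.forEach((d, i) => {
      const seenBlues = dots.filter((e, j) => j !== i && e === 'blue').length;
      if (night === seenBlues + 1) dying.push(i);
    });
    if (dying.length > 0) return { night, dead: dying };
  }
}

// Hoagy's example: n = 6, a-c blue, d-f red.
const result = firstSuicideNight(['blue', 'blue', 'blue', 'red', 'red', 'red']);
console.log(result); // the three blues act together on night 3
```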

------
reificator
Likely posted because problem #1 was recently on the front page:

[https://news.ycombinator.com/item?id=16984815](https://news.ycombinator.com/item?id=16984815)

------
readams
The nice thing about this is that we can use Dot-town as a halting oracle,
which can then be used to solve any specific undecidable problem we'd want to
solve. Make the machine output a set of dots after running some unknown
computation, such as a search for twin primes. The residents will kill
themselves depending on the results.

~~~
gowld
How so?

~~~
tome
For a proposition P, make your statement of the form "there are N blue dots
iff P".

------
yay_cloud2
I would love to watch/hear someone go about solving these problems, just to
see their approach and thought patterns.

------
teddyh
“ _Communicating badly and then acting smug when you're misunderstood is not
cleverness._”

[https://www.xkcd.com/169/](https://www.xkcd.com/169/)

I’ve stopped trying to do puzzles. I’ve been misled by bad communication
(intentional or otherwise) too many times.

------
brownbat
As to the dots, I enjoyed this variation from xkcd, where I first encountered
it: [https://xkcd.com/blue_eyes.html](https://xkcd.com/blue_eyes.html)

It massively improved our play at Hanabi:
[https://boardgamegeek.com/boardgame/98778/hanabi](https://boardgamegeek.com/boardgame/98778/hanabi)

Probably would be a useful puzzle for improving your Spades or Bridge game, or
any game you play with a partner with limited information.

Also, it gets much worse than The Random Native. George Boolos has a variation
called "the hardest logic puzzle ever." It goes like this:

“Three gods A, B, and C are called, in some order, True, False, and Random.
True always speaks truly, False always speaks falsely, but whether Random
speaks truly or falsely is a completely random matter. Your task is to
determine the identities of A, B, and C by asking three yes-no questions; each
question must be put to exactly one god. The gods understand English, but will
answer all questions in their own language in which the words for ‘yes’ and
‘no’ are ‘da’ and ‘ja’, in some order. You do not know which word means
which.”

There's an ongoing effort out there to make this variation even harder:
[https://arxiv.org/abs/1206.1926](https://arxiv.org/abs/1206.1926)

------
chias
I found the "trick" solution to Unwanted Expansion to be unsatisfying: unless
I am mistaken, it assumes that all of the values must be positive integers,
which was not stated as being the case.

~~~
zem
No, it's about the structure of the expression expanding indefinitely. If that
happened, and all the values were positive, then the value of the expression
would also tend to infinity.
~~~
chias
Ah, thank you for clearing up my misconception! That makes a lot more sense.

