
How can a computer deal a poker hand? - johnaston
http://fermatslibrary.com/s/a-sample-of-brilliance
======
waqf
> _After solving a challenging problem, I solve it again from scratch,
> retracing only the insight of the earlier solution. I repeat this until the
> solution is as clear and direct as I can hope for. Then I look for a general
> rule for attacking similar problems, that would have led me to approach the
> given problem in the most efficient way the first time. Often, such a rule
> is of permanent value._

This has been one of my principles all my life, and I was just thinking
recently about the fact that I've never seen it written down anywhere. (Also,
about the fact that it really is a question of personal style, because I've
read through rough work/notebooks from very successful people who don't seem
to think that way at all.)

~~~
pavel_lishin
Isn't that a bit like "Plan to throw the first one away"?

[https://en.wikiquote.org/wiki/Fred_Brooks](https://en.wikiquote.org/wiki/Fred_Brooks)

~~~
tdb7893
It's crazy that a software engineering book from the '70s is still pretty
relevant

~~~
douche
True wisdom is timeless. Assuming we're still around for another two and a
half millennia, I imagine the _Art of War_ will still be read and studied.

------
wyldfire
> It is easy to generate a random sequence of integers in the range 1..N, so
> long as we don’t mind duplicates

Is it? I was surprised to learn of modulo bias [1]. Does the "Randint(1, J)"
implementation take it into account? It's a very easy mistake to make and can
have an impact on the uniformity of the shuffle.

If you haven't heard of it, it has to do with the relationship between the
size of your system's RAND_MAX range and J. If RAND_MAX + 1 is not a multiple
of J, the topmost values (from the last full multiple of J up through
RAND_MAX) form an incomplete cycle, so reducing mod J makes some outcomes
slightly more likely than others.

[1]
[https://en.wikipedia.org/wiki/Fisher%E2%80%93Yates_shuffle#M...](https://en.wikipedia.org/wiki/Fisher%E2%80%93Yates_shuffle#Modulo_bias)
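A small sketch of both the bias and the standard rejection-sampling fix. The
function names are mine, and RAND_MAX is set to a made-up tiny value of 7 so
the bias is visible by hand:

```python
import random
from collections import Counter

RAND_MAX = 7  # tiny on purpose, so the bias is easy to count by hand

def biased_randint(j):
    """Naive reduction: uniform draw from 0..RAND_MAX, then mod j. When
    RAND_MAX + 1 is not a multiple of j, low outcomes occur more often."""
    return random.randint(0, RAND_MAX) % j + 1

def unbiased_randint(j):
    """Rejection sampling: discard draws from the incomplete top cycle."""
    limit = (RAND_MAX + 1) - (RAND_MAX + 1) % j  # largest multiple of j
    while True:
        r = random.randint(0, RAND_MAX)
        if r < limit:
            return r % j + 1

# Exact tally for j = 3 over all 8 raw values: outcome 3 is underrepresented.
counts = Counter(r % 3 + 1 for r in range(RAND_MAX + 1))
# counts == Counter({1: 3, 2: 3, 3: 2})
```

With 8 raw values and j = 3, two outcomes get 3/8 of the mass and one gets
only 2/8; the rejection version accepts only the first 6 raw values, which
split evenly.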

~~~
Chinjut
Any random number generator library worth its salt should have a properly
implemented RandInt(1, J) function available, instead of just a RandInt(1,
RAND_MAX) primitive.

Behind the scenes, mind you, it's easy to implement such things from random
number generators of a fixed range. For example, it could be done as follows
(for convenience, I'll speak as though our fundamental randomness primitive
just produces random bits, though all sorts of things would work just as
well):

Internally to our random number generator library, we keep persistent track of
an interval [Low, High) within [0, 1), initialized to all of [0, 1). Any time
a RandInt(1, J) is required, we partition [0, 1) into J equal subintervals,
find the one that [Low, High) lies entirely within, and return the
corresponding number; we also rescale [Low, High) by the affine transformation
that takes that particular subinterval to all of [0, 1). If, while doing this,
we find that [Low, High) spans multiple subintervals, we first do the
following until it does not: generate a random bit and replace [Low, High)
with either its lower or upper half accordingly.

Essentially, we are using [Low, High) to keep track of an underspecified value
uniformly distributed throughout [0, 1), and then pulling out leading "digits"
in arbitrary bases of this value as required by the user, zooming our window
in accordingly after doing so. Random bits are used to further and further
specify this value, and thus no randomness ever goes to waste. At all times,
we will have made the minimum number of invocations of random bits necessary
to power the amount of random integers of whatever varying sizes asked for so
far.
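A minimal sketch of that scheme, assuming only a source of random bits. The
class and its names are mine, not any standard API; exact rationals keep the
interval arithmetic honest:

```python
import math
import random
from fractions import Fraction

class BitRecycler:
    """Track an interval [low, high) within [0, 1) and peel uniform integers
    off it, drawing fresh random bits only when the interval is too wide."""

    def __init__(self, rand_bit=lambda: random.getrandbits(1)):
        self.low = Fraction(0)
        self.high = Fraction(1)
        self.rand_bit = rand_bit

    def randint(self, j):
        """Return a uniform integer in 1..j."""
        while True:
            # Indices of the subintervals [k/j, (k+1)/j) touched by [low, high).
            k_lo = math.floor(self.low * j)
            k_hi = math.ceil(self.high * j) - 1
            if k_lo == k_hi:
                # Entirely inside one subinterval: report its index, and zoom
                # that subinterval back out to [0, 1) via t -> t*j - k.
                self.low = self.low * j - k_lo
                self.high = self.high * j - k_lo
                return k_lo + 1
            # Spans several subintervals: spend one random bit to halve it.
            mid = (self.low + self.high) / 2
            if self.rand_bit():
                self.low = mid
            else:
                self.high = mid
```

Amortized over many calls, each randint(1, J) costs about log2(J) bits, so no
randomness is wasted, at the cost of bignum bookkeeping that grows with use.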

~~~
bhickey
> pulling out leading "digits" in arbitrary bases of this value

Generally this bookkeeping isn't worth the complexity. Modern PRNGs are super
efficient: PCG is about 3x faster than Mersenne Twister.

~~~
Chinjut
Well, supposing one wanted "true" randomness and had access to a source of
such, but still cared to be efficient about using it.

~~~
bhickey
Sure, but in practice I think we'd be hard pressed to find someone with such
stringent requirements.

For all practical purposes, using your entropy stream to seed ChaCha is more
than good enough. Want something with a proof? Seed a Blum Blum Shub
generator.
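For reference, Blum Blum Shub itself is only a few lines: square modulo
M = p·q and emit the low bit each step. A toy sketch of my own, with
parameters far too small for real use (a real deployment needs large secret
primes p, q ≡ 3 mod 4 and a seed coprime to M):

```python
def bbs_bits(seed, n, p=499, q=547):
    """Toy Blum Blum Shub: x_{i+1} = x_i^2 mod M, outputting each x_i's low
    bit. p and q are primes congruent to 3 mod 4; toy-sized here."""
    m = p * q
    x = seed * seed % m  # x_0 = seed^2 mod M
    bits = []
    for _ in range(n):
        x = x * x % m
        bits.append(x & 1)
    return bits
```

The security argument reduces distinguishing the output from random to
factoring M, which is why the primes must be large and secret in practice.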

------
cyberferret
This looks very familiar. I think I read part of it back in the '80s when I
was first learning programming. I think it was in reference to how to truly
'shuffle' a deck of cards using a computer algorithm, with lots of discussion
of techniques for getting close to a truly random shuffle of an ordered deck.

~~~
rabz
It's Bentley's _Programming pearls_ column from the _Communications of the
ACM_, which was collected into a series of books, such as _Programming Pearls_
and _More Programming Pearls_, where you probably saw it.

------
jbritton
Is this a mathematically sound shuffle? It removes a random element from a
sorted deck and then adds the element to a new deck. Python code below:

    import random

    def shuffle():
        random.seed()
        sorted_deck = list(range(52))  # [0..51]; list() so del works in Python 3
        shuffled_deck = []
        slots_avail = 52
        while slots_avail:
            idx = random.randint(0, slots_avail - 1)
            shuffled_deck.append(sorted_deck[idx])
            del sorted_deck[idx]
            slots_avail -= 1
        return shuffled_deck
~~~
conistonwater
Try your algorithm with a smaller n first, instead of n=52, such as n=2 or
n=3, and slots_avail=n.

For n=3 it generates the permutation {1,2,3} with probability that is 11%
higher than the correct probability.

~~~
6nf
Would you mind sharing your proof of this? I don't think you are correct. It
seems to me that OP's algorithm is correct and will yield all permutations
with equal probability.

All he's doing is picking a random card from the sorted deck and moving it to
the top of the un-sorted deck. The sorted deck then becomes one card smaller.

For N = 2, in position 1 you choose card 1 with 50% and card 2 with 50%
probability, and card 2 is just the remaining card. Thus

    p(12) = 50%
    p(21) = 50%

For N = 3, in position 1 you choose card 1 with p=1/3, card 2 with p=1/3, and
card 3 with p=1/3:

    p(1xx) = 33% = 1/3
    p(2xx) = 33%
    p(3xx) = 33%

Then using the remaining 2 cards you just do the N=2 case from before, and so
you have:

    p(123) = 50% x 33% = 1/6
    p(132) = 50% x 33%
    etc
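For small n this can also be checked exhaustively: enumerate every sequence of
index choices the removal shuffle can make and tally exact probabilities. A
quick sketch of my own:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

def perm_probs(n):
    """Exact output distribution of the removal shuffle: enumerate every
    sequence of index choices and tally each resulting permutation."""
    probs = Counter()
    # Each full run makes n choices; choice i is uniform over the n - i
    # cards remaining, so every run has probability 1/n!.
    p = Fraction(1)
    for i in range(n):
        p /= n - i
    for choices in product(*(range(n - i) for i in range(n))):
        deck = list(range(1, n + 1))
        out = tuple(deck.pop(idx) for idx in choices)
        probs[out] += p
    return probs

# For n = 3: all 6 permutations appear, each with probability exactly 1/6.
```

Using Fraction keeps the tally exact, so uniformity is verified with no
floating-point slack.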

~~~
conistonwater
You're right, I think I misread the algorithm.

------
emmelaich
Great article from a classic series that every programmer should read.

Here's an odd thing -- when I copy'n'pasted the awk implementation, the pasted
copy had errors reminiscent of OCR. For example, the (j became Cj, and ARGV[2]
became ARGV121.

What's going on here? Is the website detecting the copy and making the browser
get something else? Is it Chrome -- or something else doing some guessing?

~~~
vilhelm_s
I think the document is scanned from a paper copy. The pixels shown on the
screen are from the scanned image, but the text for copy-and-paste was created
by OCR when the document was scanned and prepared. You often see this with PDF
files; I guess the software used to create them has OCR functionality built
in.

If you use "inspect element" in your web browser you can see how it is
displayed. Each page is a <div> which contains first a big image (with the
scanned document page), and then a <span> for each word, which are positioned
using absolute positioning to be placed over the corresponding word in the
image. The <span>s each have "color: transparent", so you can't see them but
you can still select them.

------
frenchie4111
Been reading on Fermat's Library for a while now. Love the product!

------
jdmoreira
Does anyone know if this paper is part of the book Programming Pearls also by
John Bentley?

~~~
Someone
It's in _" More Programming Pearls"_ from the same author (who, by the way, is
called Jon Bentley, not John Bentley)

~~~
jdmoreira
Thanks!

