We made an algorithm for Monte Carlo-related random number generation. It makes slightly worse random numbers, but fast, and in a way that is useful in some situations. I am still very excited about it!
I am a layman in this field and haven't read the paper (it's behind a paywall), but I'm not sure I see how using a method that requires random numbers to generate random numbers makes sense?
Their work answers the question "how do I take what's coming out of /dev/random and turn it into a discrete probability distribution?"
A "random" process is one in which every outcome is independent of every other outcome. Drawing "random numbers" doesn't always mean drawing them uniformly at random, where every number in some range, like 0 to 1, is equally likely. You can draw random numbers according to a probability distribution, which is a mathematical expression describing the probability of drawing one number or another.
One very popular distribution is called the "normal distribution" or the "Gaussian distribution" (named for Carl Friedrich Gauss). Normally-distributed random numbers tend to follow a "bell curve" pattern, with values closer to the peak of the bell curve being more probable than numbers further away.
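As a concrete illustration (my own sketch, not anything from the paper): one classic way to turn uniform random numbers into normally-distributed ones is the Box-Muller transform.

```python
import math
import random

def box_muller(u1: float, u2: float) -> float:
    """Turn two uniform draws from (0, 1) into one standard-normal draw
    via the Box-Muller transform."""
    return math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

# Draw many samples; they should cluster around the peak of the bell (0).
# `random.random() or 1e-12` guards against log(0) on an exact-zero draw.
samples = [box_muller(random.random() or 1e-12, random.random())
           for _ in range(100_000)]
mean = sum(samples) / len(samples)   # should be close to 0
```

This is the continuous-distribution analogue of the discrete problem the paper addresses: both start from a uniform source and reshape it.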
That's a continuous distribution. You can also have a discrete distribution. Some examples of physical processes that can be modeled with discrete probability distributions:
- Coin toss: do I get a Heads?
- A sequence of coin tosses: how many Heads do I get?
- A die roll: do I get a 5?
- A sequence of die rolls: do I get at least one 5? Is the sum of the die rolls at most 30?
- A poker hand: will I get a flush on my next draw?
- Traffic accidents at an intersection: will there be more than 2 accidents in the next 5 days?
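To make one of these concrete, here's a quick simulation sketch (my own illustration) estimating the die-roll question above, "do I get at least one 5?":

```python
import random

def prob_at_least_one_five(n_rolls: int = 4, trials: int = 200_000) -> float:
    """Estimate P(at least one 5 in n_rolls fair-die rolls) by simulation."""
    hits = sum(
        any(random.randint(1, 6) == 5 for _ in range(n_rolls))
        for _ in range(trials)
    )
    return hits / trials

# The exact answer for 4 rolls is 1 - (5/6)**4, about 0.5177,
# so the estimate should land near that.
```

Each question in the list has this same shape: a discrete random variable, and a probability we want to compute or sample from.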
These are all "stochastic" questions, meaning that there is uncertainty in their answers. So we use probability, and therefore randomness, usually non-uniform, to model these scenarios.
Sometimes you need to generate new random numbers from these distributions as part of your analysis, or for generating predictions. In those cases, you need to take a stream of uniform random numbers, like the one /dev/random provides, and turn it into a sequence of numbers that follows one of these distributions.
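One standard way to do that conversion is inversion sampling: build the cumulative distribution, then look up where each uniform draw falls in it. A sketch (my own illustration, using the two-dice sum as the target distribution):

```python
import bisect
import itertools
import random

def make_sampler(values, probs):
    """Build a sampler that maps uniform draws in [0, 1) onto a
    discrete distribution via its cumulative-probability table."""
    cdf = list(itertools.accumulate(probs))
    def sample():
        u = random.random()  # the uniform source, e.g. /dev/random
        i = bisect.bisect_right(cdf, u)
        return values[min(i, len(values) - 1)]  # clamp against float round-off
    return sample

# Sum of two fair dice: values 2..12 with probabilities 1/36, 2/36, ..., 1/36.
values = list(range(2, 13))
probs = [(6 - abs(s - 7)) / 36 for s in range(2, 13)]
roll_two_dice = make_sampler(values, probs)
```

The binary search makes each draw O(log n) in the number of outcomes; the method the thread is discussing is about doing better than that.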
If you roll two dice you get 7 most often, 6 and 8 slightly less, 5 and 9 even less, etc.
Say you only have 1 die and want to generate the random number distribution that 2 dice give. Easy, just roll twice, and add them.
But what if you only have the output from 2 dice, and want to pretend to roll 1 die? You need a more general way of converting one random distribution to another.
It's not a method to generate random numbers, but to re-distribute random numbers according to an arbitrary distribution. In general, RNGs produce purely uniform distributions.
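A classic technique for exactly this re-distribution problem is Walker's alias method, which pays an O(n) preprocessing cost so that each subsequent draw costs O(1). A sketch of it (my own, hedged, illustration; I'm not claiming this is the specific method in the linked paper):

```python
import random

def build_alias(probs):
    """Preprocess a discrete distribution into alias-method tables.
    Each outcome i keeps probability prob[i] of itself and diverts
    the remainder to alias[i]."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] += scaled[s] - 1.0          # l donated 1 - scaled[s] mass
        (small if scaled[l] < 1.0 else large).append(l)
    for leftover in small + large:            # numerical leftovers keep all mass
        prob[leftover] = 1.0
    return prob, alias

def alias_sample(prob, alias):
    """Draw one outcome: pick a column uniformly, then flip a biased coin."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]
```

Note that each draw consumes exactly two uniform random numbers regardless of how many outcomes the distribution has, which is what makes it attractive for Monte Carlo work.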
Please see our paper, "An Efficient Method for Generating Discrete Random Variables with General Distributions": http://dl.acm.org/citation.cfm?id=2935745&CFID=782777315&CFT...
or just email one of the authors to get the PDF.