
Joys of Noise - dnetesn
http://nautil.us/issue/38/noise/joys-of-noise-rp
======
yummyfajitas
In general, most of the benefits of randomness are simply the benefits of
avoiding a particular low volume failure mode.

Stochastic resonance, for example, is a terrible example of the benefits of
randomness. You could do a LOT better with non-random perturbations.
Stochastic resonance means random perturbations + the law of large numbers let
you recover a less quantized version of a signal. But if you use controlled
(non-random) perturbations, you can get the same result in O(N) samples rather
than O(N^2).
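That claim can be sketched with a toy 1-bit quantizer (the function names and setup are mine, purely for illustration): averaging many randomly dithered readings recovers a sub-step value with error shrinking like 1/sqrt(N), while an evenly spaced deterministic dither ramp gets the same accuracy with error shrinking like 1/N.

```python
import random

def quantize(x):
    """1-bit quantizer: 1 if input >= 0, else 0. Alone, it cannot
    distinguish any two positive inputs."""
    return 1 if x >= 0 else 0

def estimate_with_random_dither(signal, n):
    """Add uniform noise in [-1, 1] before the quantizer and average.
    E[quantize(signal + noise)] = (signal + 1) / 2, so inverting that
    gives an unbiased estimate with O(1/sqrt(n)) error."""
    hits = sum(quantize(signal + random.uniform(-1, 1)) for _ in range(n))
    return 2 * hits / n - 1

def estimate_with_fixed_dither(signal, n):
    """Same idea with an evenly spaced, deterministic dither ramp;
    the error is bounded by the ramp spacing, i.e. O(1/n)."""
    offsets = [-1 + (2 * k + 1) / n for k in range(n)]
    hits = sum(quantize(signal + d) for d in offsets)
    return 2 * hits / n - 1

random.seed(0)
true_value = 0.3  # sits inside one quantizer step, invisible without dither
print(estimate_with_random_dither(true_value, 10000))  # roughly 0.3
print(estimate_with_fixed_dither(true_value, 10000))   # 0.3 to ~1/n
```

To halve the error, the random version needs 4x the samples; the deterministic ramp needs only 2x, which is the O(N) vs O(N^2) gap above.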

Or similarly, consider "computer simulations". You'll nearly always get better
results if you use either fixed quadratures (in low dimensions) or quasi-Monte
Carlo (in high dimensions). Some months back I drastically sped up our 1-d
simulations by switching from Monte Carlo to Simpson's rule.
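A minimal illustration of that gap, with a toy integrand rather than the commenter's actual simulation: plain Monte Carlo error shrinks like 1/sqrt(n), while composite Simpson's rule on a smooth 1-d integrand shrinks like 1/n^4.

```python
import math
import random

def mc_integrate(f, a, b, n, seed=0):
    """Plain Monte Carlo: average f at random points; error ~ O(1/sqrt(n))."""
    rng = random.Random(seed)
    return (b - a) * sum(f(rng.uniform(a, b)) for _ in range(n)) / n

def simpson(f, a, b, n):
    """Composite Simpson's rule (n must be even); error ~ O(1/n^4)
    for smooth integrands."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

# Integral of sin over [0, pi] is exactly 2, so the error is easy to read off.
print(abs(mc_integrate(math.sin, 0, math.pi, 1000) - 2))  # around 1e-2
print(abs(simpson(math.sin, 0, math.pi, 1000) - 2))       # near machine epsilon
```

With the same 1000 evaluations, the deterministic rule is many orders of magnitude more accurate; the catch is that fixed quadratures degrade quickly as dimension grows, which is why the comment reserves them for low dimensions.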

I can think of very few circumstances where randomized algorithms are anything
other than a crude hack to get around forming a real understanding of the
problem.

~~~
amluto
> I can think of very few circumstances where randomized algorithms are
> anything other than a crude hack to get around forming a real understanding
> of the problem.

Randomization has a huge benefit where the input is adversarial. Think about
crypto (primality testing, proofs verifiable with randomized algorithms) and
even O(1) data structures like hash tables that can be made more resistant to
malicious input by randomizing them.
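The hash-table point can be sketched as a toy chained table whose bucket mapping is perturbed by a per-instance random salt, the same idea behind CPython's randomized string hashing (SipHash keyed by PYTHONHASHSEED). This is illustrative, not production code:

```python
import os

class SaltedHashTable:
    """Toy chained hash table with a secret per-table random salt, so an
    attacker who knows the hash function cannot precompute a key set that
    all lands in one bucket (degrading lookups to O(n))."""

    def __init__(self, nbuckets=64):
        self.salt = int.from_bytes(os.urandom(8), "big")  # secret, per instance
        self.buckets = [[] for _ in range(nbuckets)]

    def _index(self, key):
        # Mixing the salt into the hash randomizes the bucket mapping.
        return hash((self.salt, key)) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key, default=None):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        return default
```

Without the salt, anyone who can read the source can mount a hash-flooding attack; with it, the collision pattern is unpredictable per process, which is exactly the adversarial use of randomness described above.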

~~~
yummyfajitas
From what I can tell, this is the _sole_ benefit - in principle - of a random
number generator. If you assume an adversary who can predict a deterministic
world better than you can, then you can remove most of that adversary's
advantage by making the world unpredictable.

But that's a very far cry from what this article is pushing. Randomness in
this case is not making any system work better, as the article claims early on
(and incorrectly hints at in some of its examples). Randomness is just a blind
woman shutting off her lights in order to hinder a sighted man trying to harm
her.

~~~
amluto
This is getting rather off-topic from the article, but I think you're making a
stronger and more interesting claim than you realize. If randomness doesn't
help algorithms in general, then BPP (the class of languages decidable in
polynomial time by randomized algorithms with bounded error, more or less) is
in P or some other deterministic class, depending on exactly how you define
randomness not helping. There's all kinds of interesting research here.

As a concrete example, primality testing has been known to be very efficiently
solvable with a randomized algorithm for a long time, but it was only recently
that a deterministic polynomial-time algorithm was found. That algorithm is
considerably more complicated than the randomized algorithms.
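The long-known randomized algorithm is Miller-Rabin (Miller 1976, Rabin 1980); the deterministic polynomial-time one is AKS (2002), which is far more involved and slower in practice. A standard Miller-Rabin sketch:

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test. A composite n passes any
    single round with probability at most 1/4, so 40 independent rounds
    bound the error by 4**-40."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

print(is_probable_prime(2**61 - 1))  # Mersenne prime -> True
print(is_probable_prime(2**61 + 1))  # divisible by 3 -> False
```

A prime never fails a round, so the only possible error is declaring a composite "prime", and that probability is driven below any practical threshold by adding rounds.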

See
[http://www.math.ias.edu/~avi/BOOKS/rand.pdf](http://www.math.ias.edu/~avi/BOOKS/rand.pdf)
for some more detail. I think there's a lot more research on the topic that I
haven't managed to dig up in three minutes of searching.

Edit: another use of randomness: expander graphs are quite useful in a number
of contexts. There are easy random constructions that provably produce
expanders with high probability, but I'm not sure whether anyone has found a
deterministic construction that produces an expander.

------
kazinator
> _There are times when adding noise is the only way to pick up a weak
> periodic signal._

This just seems like a way of randomizing the sampling to overcome the
resolution of the instrument, and the aliasing it implies.

------
xlxlxlx
Two related concepts that came to mind are fuzz testing and dither.

[https://en.wikipedia.org/wiki/Fuzz_testing](https://en.wikipedia.org/wiki/Fuzz_testing)

[https://en.wikipedia.org/wiki/Dither](https://en.wikipedia.org/wiki/Dither)
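The fuzz-testing idea fits in a few lines: throw random inputs at the code under test and compare against a trusted oracle. Here the target is a deliberately buggy insertion sort checked against Python's sorted(); the names and the planted bug are mine, for illustration only.

```python
import random

def buggy_sort(xs):
    """Hand-rolled insertion sort with a planted off-by-one bug
    (the loop condition should be j > 0, so index 0 never updates)."""
    xs = list(xs)
    for i in range(1, len(xs)):
        j = i
        while j > 1 and xs[j - 1] > xs[j]:  # bug: should be j > 0
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return xs

def fuzz(rounds=1000, seed=0):
    """Generate random inputs and compare buggy_sort against the
    sorted() oracle; return the first failing input, or None."""
    rng = random.Random(seed)
    for _ in range(rounds):
        case = [rng.randrange(-10, 10) for _ in range(rng.randrange(0, 8))]
        if buggy_sort(case) != sorted(case):
            return case
    return None

print(fuzz())  # prints the first randomly generated list buggy_sort gets wrong
```

Real fuzzers (AFL, libFuzzer) add coverage feedback and input minimization, but the core loop is this: random inputs explore failure modes that hand-written test cases miss.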

The discovery of the usefulness of dithering is interesting too:

 _"One of the earliest [applications] of dither came in World War II.
Airplane bombers used mechanical computers to perform navigation and bomb
trajectory calculations. Curiously, these computers (boxes filled with
hundreds of gears and cogs) performed more accurately when flying on board the
aircraft, and less well on ground. Engineers realized that the vibration from
the aircraft reduced the error from sticky moving parts. Instead of moving in
short jerks, they moved more continuously. Small vibrating motors were built
into the computers, and their vibration was called dither..."_

