
Don't forget randomness is still just a hypothesis (2006) - tosh
https://www.nature.com/articles/439392d
======
Xcelerate
When I first started learning QM in undergrad, I was skeptical of the idea of
randomness. Many years later after taking QFT in grad school... I'm still
skeptical. Non-locality doesn't bother me one bit, but personally speaking,
there's something deeply unsettling about the idea of "true" randomness.

First, I think it's important to define what randomness even is, for which
I'll use the most universal definition, i.e., Kolmogorov randomness. A string
of data is Kolmogorov random if, for a given universal Turing machine, there
is no program shorter than the string that produces the string (yes, you can
arbitrarily choose _which_ universal Turing machine, but the invariance
theorem makes this fact inconsequential for the most part).

So if we repeatedly set up and measure a quantum system that's not in an
eigenstate and then apply the probability integral transform to the individual
measurement values, we should expect to find a sequence of values drawn from a
uniform distribution, and this sequence should not be compressible by any
computer program.
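A computable sketch of that test (Kolmogorov complexity itself is
uncomputable, so a general-purpose compressor only ever gives an upper bound,
and a classical RNG stands in for the quantum device here; for a qubit in an
equal superposition the outcomes are fair bits, so the probability integral
transform step is trivial):

    import os
    import zlib

    # Stand-in for repeated measurements of a qubit in an equal superposition:
    # each outcome is a fair bit, so the bit string itself is the "uniform"
    # sequence. os.urandom is a classical proxy, not a quantum source.
    n = 100_000
    structured = b"01" * (n // 2)      # obviously compressible
    measurements = os.urandom(n)       # proxy for quantum outcomes

    def k_upper_bound(s: bytes) -> int:
        # Compressed size upper-bounds the Kolmogorov complexity (up to an
        # additive constant); it can reveal structure but never certify
        # randomness.
        return len(zlib.compress(s, 9))

    print(k_upper_bound(structured))    # tiny: structure found
    print(k_upper_bound(measurements))  # ~n: no structure found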

This is where it gets interesting though, because it may very well be the case
that this sequence of measurement values is incompressible only because we
lack external information, i.e., we are looking at the Kolmogorov complexity
of the string from our perspective as experimenters, but from the perspective
of a hypothetical observer outside the universe, the conditional Kolmogorov
complexity (conditioned on some missing information) could indeed be less than
the length of the string.

So where could this missing information be stored? My guess is that it's at
the boundary of experimenter/experiment (not referring to a spacetime-local
boundary here), since you can't represent the overall quantum state of
experimenter + experiment as a separable state. That is, the information
necessary to perfectly predict the result of a measurement on a quantum system
is inaccessible to us precisely because we — the experimenters — are part of
the system itself.

In this way, quantum randomness would be truly random _from our perspective_
in the sense that the future is to some degree fundamentally unpredictable by
humans, but just because it's genuinely random to us doesn't imply the
universe is indeterministic.

I wonder if there's some way you could design an experiment that distinguishes
true indeterminism from the merely unpredictable...

~~~
Retric
> Kolmogorov random if, for a given universal Turing machine, there is no
> program shorter than the string that produces the string.

That’s not a definition of unbiased randomness. A true unbiased random number
could be all 0’s. Nothing about an unbiased random number demonstrates it’s
random; otherwise, whatever made that distinction would be a bias in its
generation.

Kolmogorov complexity is its own thing, and sequences that seem very complex
can have extremely low complexity, such as long sequences of hashes of hashes.
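A quick sketch of that last point: a chain of hashes defeats a generic
compressor (the test above finds no structure), even though the entire
sequence has a tiny description, namely the generating program itself.

    import hashlib
    import zlib

    # Build a long byte string by hashing a hash of a hash of ... a seed.
    # A compressor finds no redundancy, yet this short program *is* a
    # compressed description of the whole string.
    state = b"seed"
    chunks = []
    for _ in range(10_000):
        state = hashlib.sha256(state).digest()
        chunks.append(state)
    data = b"".join(chunks)  # 320,000 bytes

    print(len(data), len(zlib.compress(data, 9)))  # compressed ~ original size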

~~~
Xcelerate
I'm not sure what unbiased randomness is. I haven't heard that phrase before.
For Kolmogorov randomness, I was using Wikipedia's description of it
([https://en.wikipedia.org/wiki/Kolmogorov_complexity#Kolmogor...](https://en.wikipedia.org/wiki/Kolmogorov_complexity#Kolmogorov_randomness)),
although there are more technical descriptions available.

~~~
Retric
Crypto cares a lot about unbiased randomness. X bits of entropy is kind of a
measurement of this.

Anyway, I suggest you reread the end of that paragraph:

“A counting argument is used to show that, for any universal computer, there
is at least one algorithmically random string of each length. _Whether any
particular string is random, however, depends on the specific universal
computer that is chosen._ ”

Kolmogorov complexity really refers to the fact that you can’t have lossless
compression of arbitrary bit strings: you can’t encode every possible N+1-bit
string using an N-bit string. The computer chosen can make an arbitrary 1:1
mapping for any input, though. So it’s got nothing to do with randomness in
the context of coin flipping, as the mapping is predefined.

Just remember: you’re choosing the computer, and at that point any input can
be mapped to any output. But after that point, limits show up.
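The counting argument from the quoted paragraph is a one-line pigeonhole
check:

    # There are 2**n strings of length n, but only 2**n - 1 binary programs
    # of length < n, so at least one n-bit string has no shorter description.
    for n in range(1, 20):
        programs_shorter_than_n = sum(2**k for k in range(n))  # = 2**n - 1
        assert programs_shorter_than_n < 2**n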

~~~
xornox
If all the "randomness" of the universe arises from one extremely long bit
string, created once from a source of true randomness, that bit string could
consist of an unimaginable number of zeros and still be random. For example,
the Lotto numbers 1, 2, 3, 4, 5, 6, 7 may be completely random.

Kolmogorov complexity is informative only if we have a big sample of random
numbers, and we do not know whether we have such a sample in this universe.

~~~
Retric
Kolmogorov complexity is only meaningful for a specific architecture.

Without access to the architecture of the machine the universe runs on you
can’t tell if the initial random string would be one bit or nigh infinite
bits.

------
ameliaquining
> Neither Heisenberg's uncertainty principle nor Bell's inequality exclude the
> possibility, however small, that the Universe, including all observers
> inhabiting it, is in principle computable by a completely deterministic
> computer program...

Either I'm confused (definitely possible) or this is sort of implicitly
equivocating between two different senses of "determinism". There are
experiments we can perform that appear to demonstrate quantum randomness.
Though it may sound superficially plausible that any particular such random
outcome is actually the deterministic output of a hidden pseudorandom number
generator, that hypothesis _is_ ruled out by Bell's theorem.

What Bell's theorem _can't_ rule out is the hypothesis that not only any
individual quantum measurement, but the sum total of everything that happens
in the universe including the experimenters' choices of what actions to take
during the experiment, is all part of a single deterministic causal path for
the whole universe, that just so happens to play out in such a way that we
never see anything that visibly contradicts Bell's theorem. This can't really
be empirically falsified, but there are various philosophy-of-science reasons
to be a priori skeptical of it (depending on which philosophers of science you
ask, of course).

~~~
chr1
Bell's theorem rules out only local hidden variables. Quantum mechanics itself
works around the issue not through indeterminism, but because the wave
function collapses everywhere at once, in a non-local way.

But perhaps we do not even need to abandon locality, if we modify it a bit.
There is no good reason to believe that space at short distances should be
similar to Euclidean space. One interesting hypothesis is that space is more
like a graph, and that entangled particles, in addition to the normal long
path through the graph, are also connected directly, which allows a
measurement on one of them to change the state of the other.

~~~
cameldrv
I'm not a theoretical physicist, but I've heard in informal conversations with
some that one idea being explored more now is that perhaps space itself is
just a statistical emergent property of entanglement.

~~~
chr1
Leonard Susskind has several interesting lectures about it, which can be found
by searching for ER=EPR, but I hope the final theory will explain more of the
strange behaviors of quantum mechanics, something like the theory outlined in
[https://blog.stephenwolfram.com/2015/12/what-is-spacetime-re...](https://blog.stephenwolfram.com/2015/12/what-is-spacetime-really/)

------
6gvONxR4sf7o
Even more so, randomness as people usually use it exists independently of
"true physical randomness." If I flip a typical coin in a typical manner, then
regardless of whether quantum mechanics is truly random, a sufficiently
detailed measurement fed into a sufficiently powerful computer can predict it
essentially perfectly. It's a long way of saying a coin flip is close enough
to deterministic in practice.

But there's something going on there that I can't predict, and we need a
language to talk about it. That language is the same whether "true physical
randomness" exists or not. Calling a coin flip "50-50" is just as valid in a
deterministic universe as it is in a random one. Probability is a language
more than a theory.

Too many people get hung up on "true" randomness, when that's probably not
relevant to the situation they're describing.

~~~
mlevental
This isn't true. Given a chaotic system with enough degrees of freedom (even a
deterministic one), there is no computer powerful enough.

~~~
6gvONxR4sf7o
No. Chaotic systems mean that if you integrate the time horizon out, there'll
be a time after which your measurement precision isn't precise enough, and
you'll get an unpredictable bifurcation. For a typical you-or-me coin flip,
the necessary precision for a hypothetically powerful computer to predict with
say 99.99% accuracy is probably well above the quantum-weirdness level of
accuracy. And that 99.99% accuracy (or even 90% accuracy) is different from
the 50% you get without the measurement and the computer, which is what
necessitates a 50% theory, or a 50% language, of coin flipping.
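To put a number on the chaos point: in a chaotic map the prediction horizon
grows only logarithmically as the measurement improves, so the question is
just whether the needed epsilon is physically measurable. A sketch on the
logistic map (a standard chaotic toy, not a model of a coin):

    # Two trajectories of the chaotic logistic map differing by a measurement
    # error eps; count the steps until they visibly disagree.
    def steps_until_divergence(eps: float, threshold: float = 0.01) -> int:
        x, y = 0.4, 0.4 + eps
        for step in range(10_000):
            if abs(x - y) > threshold:
                return step
            x = 3.9999 * x * (1 - x)
            y = 3.9999 * y * (1 - y)
        return -1

    for eps in (1e-4, 1e-8, 1e-12):
        print(eps, steps_until_divergence(eps))
    # Every 10,000x improvement in precision buys only a fixed number of
    # extra steps: the horizon scales like log(1/eps).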

~~~
mlevental
> No. Chaotic systems mean that if you integrate the time horizon out,
> there'll be a time after which your measurement precision isn't precise
> enough, and you'll get an unpredictable bifurcation.

How is that substantively different from what I said?

> 99.99% accuracy is probably well above the quantum-weirdness level of
> accuracy.

What is the quantum-weirdness scale, and what is well above it? hbar is 34
zeros out.

~~~
6gvONxR4sf7o
You said it's untrue that you can predict a typical coin flip because it's a
chaotic system. I laid out the limitations of chaotic systems and argued they
don't apply to a typical coin flip. Are you arguing about predicting
"essentially perfectly"? I used that to mean predicting with a couple of
nines of accuracy.

By quantum weirdness scale, I mean that the accuracy (epsilon) of a necessary
measurement includes accurate position and momentum measurements. If epsilon
is too small, we might not be able to hypothetically measure both to the
necessary precision. I'm guessing that the necessary epsilon for essentially
perfect prediction is large enough that you can hypothetically measure both to
within that precision.

------
MarkMc
Headline reminds me of this Dilbert cartoon:
[https://dilbert.com/strip/2001-10-25](https://dilbert.com/strip/2001-10-25)

 _A troll tour guide says, "Over here we have our random number generator."
The troll places its hands on a slab of rock and relays the message of "nine
nine nine nine nine nine." Dilbert asks, "Are you sure that's random?" The
troll responds, "That's the problem with randomness. You can never be sure."_

~~~
Yajirobe
Maybe the message was part of the digits of pi
([https://en.wikipedia.org/wiki/Six_nines_in_pi](https://en.wikipedia.org/wiki/Six_nines_in_pi))?

------
dr_dshiv
What sort of actions are truly random? I can understand why a coin flip can be
viewed as deterministic, because if a precise robot had all the situational
knowledge, it could flip heads every time.

But radioactive decay is supposed to be random. As in, every atom is the same,
but some, randomly, decay. That never made sense to me.

~~~
GregoryPerry
I sincerely doubt that anything is truly random, there has to be some type of
cosmic drummer behind the scenes biasing certain events. Case in point is the
conflict between molecular Darwinism and the numbers associated with the human
genome. Approximately one billion nucleotide bases, with four different
nucleotide bases possible in each location, == a 4^1,000,000,000 combinatorial
explosion, a number waaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaay larger than the
10^120 universal complexity limit, the 10^80 stable elementary particles in
the universe, and the generally accepted age of the universe (~10^40). The
monkeys-typing-Hamlet thing is computationally intractable.

~~~
gus_massa
The main idea of "molecular Darwinism" is that the initial life form had a
very short DNA [1]. As the species evolved, the short DNA evolved and got
longer [2].

* For example, some genes are repeated; a bad copy may duplicate a gene and make the DNA longer. Viruses may cause some duplications too.

* Some genes are almost repeated, and each copy has a slightly different function, so each one has a variant that is better for its function. The idea is that a copying error made two copies of the original gene, and then each copy slowly evolved differently.

* Some parts of the DNA are repetitions of the same short pattern many, many times. IIRC these appear near the centers and the ends of the chromosomes, and are useful for structural reasons, not to encode information. The DNA can extend the ends easily because they are just repetitions of the pattern.

* Some parts are just junk DNA that is not useful, but there is no mechanism to detect that it is junk and eliminate it, so it is copied from individual to individual, and from species to species, with random changes. (Some of this junk may turn out to be useful.)

So the idea is that the initial length was not 1000000000, but that the length
increased with time.

Your calculation does not model the theory of "molecular Darwinism". Your
calculation is about the probability that, if a "human" miraculously appeared
out of thin air with a completely random genome, it would get the correct one
[3].

[1] Or perhaps RNA, or perhaps a few independent strands of RNA that
cooperate. The initial steps are far from settled.

[2] It's not strictly increasing; the length may increase and decrease many
times.

[3] Each person has a different genome, so there is not one perfect genome.
The correct calculation is not 1/4^1,000,000,000 but
some-number/4^1,000,000,000. It's difficult to calculate the number of
different genomes that are good enough to be human, but it's much, much
smaller than 4^1,000,000,000. So let's ignore this part.

~~~
GregoryPerry
Again, irrespective of how much genome information was there initially and
what it eventually became, you are still talking about a final optimization
problem of size 4^1,000,000,000. Even one tenth of the human genome is an
unfathomably large space to randomly iterate through, given the generally
accepted statistics cited above. The math behind stochastic molecular
Darwinism doesn't work out at all.

~~~
fourthark
I don't see where you are getting the idea that humans had to be pulled out of
a hat of all possible genetic sequences.

They, like, evolved, right? As the GP says, there was a short sequence that
worked, a little got built on, a little more...

There was never any time that any creature was generated by random choice.

~~~
GregoryPerry
Got math?

~~~
GregoryPerry
What in the world are you talking about?

This thread of discussion is about the computationally intractable nature of
4^1,000,000,000.

Got math? Maybe post a proof?

~~~
fourthark
Tell me why evolution would require all of those combinations to be tried?

Edit: Microsoft Windows 10 is 9 GB. It would be impossible to try all
256^9,000,000,000 possible 9 GB programs. Yet Windows exists, and most of us
believe it's contained in those 9 GB.

~~~
GregoryPerry
So per your logic the Windows 10 operating system was created by random
iteration of x86 opcodes over a lengthy period of time? Huh?

~~~
fourthark
Exactly the opposite. Just because there are so many possibilities doesn't
mean that all of them have to be tried or make sense.

You wouldn't code that way and nature doesn't either.
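A toy sketch of that difference, in the spirit of Dawkins' "weasel" program (a
rhetorical illustration of cumulative selection, not a model of real biology):
random search over the 27^28 strings below would take ~10^40 tries, while
keep-what-works selection finishes in a few hundred generations.

    import random
    import string

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = string.ascii_uppercase + " "

    def score(s: str) -> int:
        # Number of positions that already match the target.
        return sum(a == b for a, b in zip(s, TARGET))

    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        # Each generation: 100 mutated children; keep the fittest survivor.
        children = [
            "".join(c if random.random() > 0.05 else random.choice(ALPHABET)
                    for c in parent)
            for _ in range(100)
        ]
        parent = max(children + [parent], key=score)
        generation += 1

    print(f"reached target in {generation} generations")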

------
hagreet
When I shuffle a deck of cards and put it down on the table, the top card
doesn't change anymore. Yet anybody will consider that top card to be random.

In practice, randomness is about lack of knowledge, not about the underlying
processes themselves.

(Edit: Sorry, but this clickbait title, redefining randomness as something
other than what everybody understands it to be, annoyed me.)

~~~
Chris2048
I'd argue probability is about that (given the information available at a
time), i.e., everything has probability 1 or 0 given perfect information (and
perfect computation/inference) about a deterministic universe, but some other
number (by some metric) given partial information (and perfect
computation/inference).

Randomness is perhaps a description of correlation? Which obviously relates to
probability and the prediction of uncertain outcomes, but is maybe more
general.

------
gus_massa
Note that with this definition everything is a hypothesis. Gravity is just a
hypothesis. If you drop a rock tomorrow, the hypothesis is that it will fall
with an acceleration of 9.8 m/s^2 = 32.2 ft/s^2. [With the usual
approximations: the air density and viscosity are small, it is a rock and not
a helium balloon disguised as a rock, the wind is not too high, ...]

We have not proven that if you repeat the experiment tomorrow you will get the
same result; it is only a hypothesis.

The randomness of QM is not proven, but for now we don't have a better
alternative for predicting the results of the experiments. Just like gravity,
which has not been proven, but for which we likewise have no better
alternative for predicting the results of the experiments.

~~~
heavenlyblue
By that logic, the absence of the rabbit hole with the Cheshire Cat in it
isn't proven either.

~~~
gus_massa
Take a look at:
[https://en.wikipedia.org/wiki/Falsifiability](https://en.wikipedia.org/wiki/Falsifiability)

------
age_bronze
The equations of quantum mechanics aren't random; only their interpretation
is, in the form of the Born rule.

Unitary evolution means there is neither information loss nor gain, and if
there were anything random you would at the very least expect to see
information gain (as new bits of information are created from the "random"
result of an observation).

Randomness in quantum mechanics isn't even a hypothesis; it's an
interpretation.

~~~
mrpara
Important nitpick: the equations that govern _dynamics_ in quantum mechanics
aren't random, and evolution is unitary. However, the process of "measurement"
is described by an (obviously non-unitary) projection operator onto one state;
the so-called "collapse". If you, for example, attempt to answer the very real
physical question "given two particles with some total joint state Psi, one of
which is measured and found to be in state Phi, what state is the other
particle in?", you have to use such an operator. There isn't anything
interpretive about this, as such experiments have been done again and again.
It's a standard part of the mathematical framework.

Now, whether the underlying physics is truly random, or whether it's
deterministic and the projection only represents a sort of Bayesian update of
prior information (a la MWI), that is indeed a matter of interpretation. And
completely unfalsifiable by definition, and therefore not even really a
question for physicists. It's philosophy at best.

~~~
lorepieri
Cosmology was once considered philosophy and totally untestable. Just
saying...

------
goldenkey
A couple of things are at the heart of the matter here: hypercomputation (the
halting problem) and infinite memory/storage.

Ascribe either of these to nature, and nature can be deterministic while the
probability of our ever discerning its RNG's operations is still 0.

This is a good video explaining the different intricacies of how "God's dice"
might be constructed:

[https://youtu.be/iBHUayT7t6w](https://youtu.be/iBHUayT7t6w)

Many phenomena thought to have been random due to their quantum nature have
been found to be based on their initial conditions instead. See spontaneous
emission photon phase for example:
[https://iopscience.iop.org/article/10.1088/1367-2630/9/11/41...](https://iopscience.iop.org/article/10.1088/1367-2630/9/11/413/meta)

------
ChrisSD
It's worth mentioning that most cryptographic algorithms are designed to be
strong using pseudorandom algorithms, so real randomness isn't a requirement
(although obviously some unpredictable starting point is required to get the
ball rolling).

Well unless you use a one time pad but nobody does (hopefully).
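A sketch of that "unpredictable seed, deterministic afterwards" pattern (a toy
hash-counter generator for illustration only; real code should use a vetted
primitive such as Python's `secrets` module or the OS CSPRNG directly):

    import hashlib
    import os

    # Toy deterministic random bit generator: one unpredictable seed from the
    # OS, then a fully deterministic SHA-256-in-counter-mode stream.
    class ToyDRBG:
        def __init__(self) -> None:
            self.seed = os.urandom(32)  # the only "real" randomness needed
            self.counter = 0

        def next_block(self) -> bytes:
            data = self.seed + self.counter.to_bytes(8, "big")
            self.counter += 1
            return hashlib.sha256(data).digest()

    rng = ToyDRBG()
    print(rng.next_block().hex())
    print(rng.next_block().hex())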

~~~
A2017U1
> Well unless you use a one time pad but nobody does

I was under the impression that intelligence agencies and militaries use OTPs
regularly and that the keys are carried in diplomatic pouches around the
world.

------
Throw_Away_5725
> Neither Heisenberg's uncertainty principle nor Bell's inequality exclude the
> possibility, however small, that the Universe, including all observers
> inhabiting it, is in principle computable by a completely deterministic
> computer program, as first suggested by computer pioneer Konrad Zuse in 1967
> (Elektron. Datenverarb. 8, 336–344; 1967).

Leibniz already proposed essentially the same thing centuries earlier. And
there have probably been people who said the same even earlier.

(Also, mentioning Konrad Zuse made me directly suspect that the author is
German. He's Swiss, but close enough.)

------
oldandtired
In terms of the physical universe, all of our models, theories, and laws are
based on incomplete knowledge and observation. That being said, our models and
theories provide a means of investigating the universe. Where we need to be
careful is in coming to the idea that we "will" be able to "know" what is
happening. We do not have "infinite" knowledge (there is always more to
learn), and so any models and theories we come up with can (and will) be
superseded by later data that we collect.

"Randomness" in the small can and does appear to be non-random in the large -
we make predictions as to what we will see over large numbers of events when
we are unable to determine if any single event will fulfill that prediction.
Radioactive decay is a good exampe of this. Two-slit refraction patterns are
also another example. Much, if not all of our technology, depends on this,
whether this be semiconductor design or manufacturing any material products
such as steels or concrete.

What does happen is that we have more and more interesting research areas in
which we can investigate the underlying principles that govern our universe.
But we must not make the mistake of assuming we will "know" what those
principles are. We can and do develop workable and useful models and theories
to help us get a handle on understanding this universe we live in. We live on
one small planet in an isolated region of our galaxy in an extraordinary and
immense universe. We do not have the ability to explore that universe in any
detailed way except by proxy observations. So, instead of getting caught up in
being "sure", let us have fun exploring everywhere we can and continue to
gather data, discuss what this data means, and develop workable theories and
models that we can use.

As a disciple of the living God who created all that we see and do not see, I
consider that the universe has a set of specific rules and laws by which it
operates, and that we can and should try to understand what those laws are.
For me it is an act of worship to investigate and understand the what and the
how.

For those who are of other belief systems, whether Hindu, Buddhist, Muslim,
atheist, etc., there is just as much incentive to study the universe around us
and understand what it is and how it works. There may be additional questions
raised from each viewpoint that are not of concern to any of the other
viewpoints, like "why".

But what it all boils down to is that we live in a wonderful and extremely
interesting universe, and there is much to learn about it and fun to be had
while learning about it.

------
GregoryPerry
Little if any empirical research has been done into the quality of entropy
associated with repeatedly observing a quantum state. This is pretty easily
accomplished using an RTOS (RT_PREEMPT or Xenomai) with GPIO sampling,
followed by n-dimensional phase space analysis of that dataset to determine
whether any patterns emerge. There are plenty of tools from the field of chaos
theory and nonlinear dynamical systems analysis with which to prove or
disprove the fundamental nature of quantum randomness.
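The proposed phase-space analysis is easy to sketch (a Takens-style delay
embedding; `samples` below is a hypothetical placeholder for the GPIO timing
data):

    import numpy as np

    def delay_embed(samples: np.ndarray, dim: int, lag: int) -> np.ndarray:
        # Map a scalar series to delay vectors (x[t], x[t+lag], ...,
        # x[t+(dim-1)*lag]); a deterministic source traces out low-dimensional
        # structure, while true noise fills the space uniformly.
        n = len(samples) - (dim - 1) * lag
        return np.column_stack(
            [samples[i * lag : i * lag + n] for i in range(dim)]
        )

    # Placeholder for sampled decay timings; a real test would use the
    # hardware measurements instead.
    samples = np.random.default_rng(0).random(10_000)
    points = delay_embed(samples, dim=3, lag=1)
    print(points.shape)  # (9998, 3): ready for correlation-dimension tests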

~~~
jackfraser
You could probably run something on D-Wave Leap and get actual quantum noise
or something, if you like.

------
klodolph
This is one of those questions that ultimately comes down to the exact
definitions you use for everything, because the commonly-used interpretations
won’t cut it. Same bucket as questions like “does free will exist?” (define
“free will”)

I started writing a longer version of this comment but I think that a core
part of the question is whether “randomness” is an epistemological
convenience, a statement about “order” or “rules”, or something else.

------
yters
What I don't understand about quantum physics is why everything doesn't just
become a statistical smudge after a couple of iterations. Why is reality so
coherent?

My takeaway is that this is the reason some physicists claim "everything is
information": there is some underlying form that gives statistical quantum
physics a consistent pattern instead of letting it devolve into randomness.

~~~
lorepieri
Information theory defines information starting from probabilities, so there
is not much escaping from that.

~~~
yters
I think the physicists mean something like mutual information, i.e., some
underlying reality describes the global probability distribution better than
the individual probabilities themselves. It is this underlying reality that is
"really real", and the quantum fluctuations reveal it the way a flickering TV
screen shows a globally consistent picture.

------
AlexDragusin
Back in 2009, Alex Dragusin hypothesized: "In a finite space there must be a
finite number of events, which are strictly related (in a cause-effect chain)
to the finite constraints and are therefore not random."

In the absence of high enough computational resolution, one would perceive
this as randomness. This is also related to the quest to determine whether we
live in a simulation.

~~~
lorepieri
"In a finite space there must be a finite number of events" seems to have
counterexamples: take events which occur on vanishing small space, then you
can fit an infinite number of them in finite space. Think about converging
infinite series.

~~~
goldenkey
I think the author means discrete when they say finite. In that case, they
mean doubly special relativity:
[https://en.m.wikipedia.org/wiki/Doubly_special_relativity](https://en.m.wikipedia.org/wiki/Doubly_special_relativity)

------
mattfrommars
I'm going to try my luck here. In the not-so-distant past, I came across an
article whose content was something along the lines of a "theory of
randomness" or "history of randomness". As for what the website looked like,
it appeared to be written in basic HTML/CSS.

If anyone knows which site I might be alluding to, please post it here.

------
ncmncm
A pseudo-random noise source may be just as good as the real thing for
producing satisfyingly even-handed results.

But it turns out to be a lot harder to design a really good emulator of
randomness than one might guess. Certain Monte Carlo simulations using the
Mersenne Twister turned out to be oddly but unmistakably biased.

~~~
jackfraser
Can you explain to a layman what's odd about it?

~~~
ncmncm
Mersenne Twister was maybe the first in a class of random number generators
that have lots of state -- an order of magnitude more than previous designs.
Each time a number is pulled from it, some of the state is stirred, and the
next number comes mostly from other bits, and stirs others. They have to be
fast, so you can't touch too many bits per number extracted; taking about the
same time for each number is nice, too.

So, one measure of generators is how many numbers you pull before you get the
same sequence again. MT's cycle is very long, so in practice you never see a
repeat, even if you see the same number many times. (In many simpler
generators, seeing 3 then 8 means that the next time you see 3, the next
number will be 8. A great deal of simulation was done with such generators.)
The numbers from an MT satisfy many different measures of apparent randomness.

Monte Carlo investigations consume very many numbers. They might use the
numbers in a more or less periodic way, so that any match-up between cycles in
the problem and cycles in the generator can skew the results. The main MT
cycle is very long, so any skewed results probably point to lesser cycles as
the bits stirred are later encountered again. But it's hard to imagine a way
to detect such cycles deliberately from the bits you get out. Encountering a
process that finds them accidentally is amazing.
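A sketch of the small-state failure mode described above (a 16-bit linear
congruential generator whose entire state is its latest output, so a repeated
value always repeats its successor; MT's kilobytes of state are what prevent
this):

    # Minimal LCG: the output IS the whole state, so "3 then 8" repeats forever.
    def lcg(x: int) -> int:
        return (25173 * x + 13849) % 2**16

    x, successor = 42, {}
    for _ in range(100_000):
        prev, x = x, lcg(x)
        if prev in successor:
            assert successor[prev] == x  # same value => same next value, always
        successor[prev] = x
    print(len(successor), "distinct values before the sequence cycles")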

~~~
jackfraser
Fascinating, thanks. Not sure I understood it all, but I appreciate the reply.

------
hajderr
In relation to religion this is interesting too: the Big Bang that happened
"randomly" could also be explained as a series of causes and effects. On
another note, a miracle according to scripture could basically be a series of
causes and effects fast-forwarded.

------
voxl
It's comical that the article suggests we always try to falsify randomness,
when the "simple" deterministic explanations (superdeterminism) are at this
point unfalsifiable.

