What is randomness? (2013) (etceterology.com)
41 points by tapan_k 342 days ago | 57 comments



E.T. Jaynes had a very interesting take on randomness. From the beginning of his book (http://bayes.wustl.edu/etj/prob/book.pdf):

> [George Pólya] dissected our intuitive "common sense" into a set of elementary qualitative desiderata and showed that mathematicians had been using them all along to guide the early stages of discovery, which necessarily precede the finding of a rigorous proof. The results were much like those of James Bernoulli’s "Art of Conjecture" (1713), developed analytically by Laplace in the late 18th century; but Pólya thought the resemblance to be only qualitative.

> However, Pólya demonstrated this qualitative agreement in such complete, exhaustive detail as to suggest that there must be more to it. Fortunately, the consistency theorems of R. T. Cox were enough to clinch matters; when one added Pólya’s qualitative conditions to them the result was a proof that, if degrees of plausibility are represented by real numbers, then there is a uniquely determined set of quantitative rules for conducting inference. That is, any other rules whose results conflict with them will necessarily violate an elementary—and nearly inescapable—desideratum of rationality or consistency. But the final result was just the standard rules of probability theory, given already by Bernoulli and Laplace; so why all the fuss? The important new feature was that these rules were now seen as uniquely valid principles of logic in general, making no reference to "chance" or "random variables"; so their range of application is vastly greater than had been supposed in the conventional probability theory that was developed in the early twentieth century. As a result, the imaginary distinction between "probability theory" and "statistical inference" disappears, and the field achieves not only logical unity and simplicity, but far greater technical power and flexibility in applications.


"but less important than the process of natural selection, which is not random at all."

This is wrong. A random process (gene mutation) operating within a randomly changing environment is still a 'net random process'.

The author tricked himself.

The purely physical/materialist perspective does have this paradox: we're just purely random bags of particles, and there actually cannot be any such thing as 'intelligence' or 'life' or 'love' or 'language' - just the appearance of it.

If you throw a bag of a trillion^trillion^trillion particles into a purely self contained environment (i.e. the Universe), and let it stir for a while - whatever is going on - at least from a materialist perspective - is random.

Scientists realize this, and there's a new field of thought called 'emergence' which at least tries to grapple with it 'one step' away from materialism, and they toy with the idea that properties of complex entities may 'emerge' independently of their simpler constituent parts.


It's very clear from the context - which you've omitted - that they meant the selection part rather than the mutation part:

"random mutation is an important element in evolution, but less important than the process of natural selection, which is not random at all"

So their point is that random mutation occurs but is then filtered non-randomly, like a high-pass filter on noise. It's still a bit of an overstatement to say "not random at all" since there is obviously some (bad) luck in "fit" individuals dying or "unfit" individuals surviving. In the long-run though, you can see their point.

They go on to make a ridiculous claim that because you are here, nothing random happened to your ancestors. I would say it's just one possible history, and perhaps everyone makes mistakes with randomness.


It seems you're saying that the 'fitness algorithm' filters randomness and therefore the result is 'less random.' And while I can see your point, I'm of the mind that filtered noise is just less noise. But it's still noise.


If I give you a stream of decimal digits, and you filter out everything except for 9s, would you call the resulting stream as random as the input stream?

How about if you filter every digit that is not exactly one more than the last unfiltered digit?
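
A quick sketch of both filters, just to make the point concrete (the digit stream and the reading of "unfiltered" as "last digit kept" are my own assumptions):

    import random

    # Made-up illustration: a uniformly random digit stream and the two filters.
    stream = [random.randrange(10) for _ in range(10000)]

    # Filter 1: keep only the 9s. The survivors are the constant sequence
    # 9, 9, 9, ... -- perfectly predictable, so not random at all.
    only_nines = [d for d in stream if d == 9]

    # Filter 2: keep a digit only if it is exactly one more than the last
    # digit that was kept. The output is a single short ascending run that
    # stalls once it reaches 9 -- again nothing like the input noise.
    chained = [stream[0]]
    for d in stream[1:]:
        if d == chained[-1] + 1:
            chained.append(d)

    print(set(only_nines))   # {9}
    print(chained)           # one short counting run, then nothing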


> If you throw a bag of a trillion^trillion^trillion particles into a purely self contained environment (i.e. the Universe), and let it stir for a while - whatever is going on - at least from a materialist perspective - is random.

This seems to lead to a meaningless/useless definition paradox. 'Randomness' comes to mean everything and nothing. This is one of the things that led me away from an interest in the materialist/physicalist position. Like you said elsewhere, it seems hard to swallow the idea that all that we can achieve in defining 'randomness' is a negative definition. Personally, I find it problematic that materialism has no positive definition of what 'randomness' is and does not seem capable of offering a meaningful/useful definition.


Sort of by definition, you can't fully describe a random process. (A little fiddling actually turns that into a pretty good definition of what a random process is, but that gets really technical, really quick.)

I doubt you'll find any better usage of randomness in other fields, and in many ways, "random" is used to censor things our theories can't represent, and wrap them in stochastic approximators. This is useful, because we can then use the stochastic models as a bound on the possible outcomes caused by things we can't model (for various reasons). This lets us calculate useful predictions involving things we don't or can't know.

I don't think randomness is a negative definition, just that our primary usage of the concept is boxing up unmodelable things.


Are you saying you think dualism has some persuasive explanatory power?


I'm sorry, but if you "threw a huge number of particles into a self-contained environment and let them stir for a while," you should have a purely deterministic, non-random, and - if you could get the data and do the math - predictable situation. Maybe I'm missing something, or we're talking about different things?

Nietzsche's "eternal return" posited just this type of situation: a limited (though huge) amount of stuff and a limitless amount of time. With those two, you'd relive your life an endless number of times, presumably.

As an aside, I personally find it a heartwarming way of thinking about this idea to fictionalize it and imagine that "I" might live a future life as every other person and even animal. Maybe by thinking like this I will be more likely to treat them as ends in themselves, and not as a means to an end, that's to say, closer to Kant's "categorical imperative". That would be nice.


"you should have a purely deterministic, non-random, and if you could get the data and do the math, a predictable situation."

From a Newtonian/Classical perspective - yes.

But we're at quantum physics now - so no.

Every single particle interaction is basically 'inherently random'.

Also - what you are saying would apply to 'us' as well, that we are all deterministic ... but again, because we live in a quantum world (as far as we know) - no, it's random.

Purely random is where materialism stands now.

I should add that materialism is just one metaphysical presupposition.


"you should have a purely deterministic, non-random, and if you could get the data and do the math, a predictable situation."

"From a Newtonian/Classical perspective - yes."

Well, maybe not! It's possible that indeterminism may be found in classical mechanics; the classic example is Norton's Dome, discussed in [1] with another reference in [2]. If you read [1], you'll see that there isn't yet any general agreement as to why its paradox is invalid. That's a potentially fruitful avenue for research, methinks.

[1] https://en.wikipedia.org/wiki/Norton%27s_dome

[2] http://hsm.stackexchange.com/questions/2678/history-of-the-s...


Very interesting!


What I don't understand is how much indeterminacy the quantum allows into our Classical-seeming world. At what levels?

I completely agree that the classical determinism I mentioned also applies to us. One of my questions is whether and then how and then to what degree the randomness from quantum events can allow free will and then free action.

So, I wonder how much quantum randomness affects things. For example, atomic clocks face indeterminacy but are nonetheless the most accurate clocks we have, by far. Of course, once you add things like eternities, this can surely get very different... But 13 billion years is very far from an eternity...

Also I do understand that materialism is only one of many metaphysical suppositions. But it's the one that we can seem to do the most with. Hard to do much with metaphysical idealism, for example...

Anyway I'm no scientist or mathematician or philosopher so I'm out of my depth here. I'd be interested in your thoughts though. Thanks ~~


I'm neither a philosopher nor a physicist, but I'm pretty certain that our Universe is not deterministic, and the future cannot be predicted even with a computer with unlimited power.

But I see your point about how some macro-level things seem to be that way.


Emergence is the principle that people are more than the sum of their parts: they're the product.


Emergence is the idea that properties may arise in complex ensembles which are best described at the level of the collection. These properties are not best described by the behavior of the simpler constituents.


Randomness is contextual.

Two popular contexts for which this works are (a) I am attempting to predict an event and cannot do so deterministically using information I and resources R, or (b) I am attempting to characterize a sequence of events in an efficient way. Randomness is thus always determined against a resource constraint.

For instance, the digits of pi are random in the context where you aren't aware that the digits are arising from a deterministic process and don't have the resources to discover this.

Another interesting example concerns the randomness of a PRNG. Given knowledge of the algorithm and the hidden state it is obvious that this is deterministically predictable, but eliminating the hidden state information destroys the predictability of the PRNG events.
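
A minimal sketch of that point, using a toy Lehmer-style generator (the constants and seed are chosen only for illustration, not taken from any particular library):

    # Toy LCG: with the algorithm and the hidden state, every output is
    # deterministic; hide the state and the outputs look unpredictable.
    M, A, C = 2**31 - 1, 48271, 0   # illustrative constants

    def lcg(state):
        while True:
            state = (A * state + C) % M
            yield state

    gen = lcg(state=12345)          # 12345 is the hidden seed/state
    outputs = [next(gen) for _ in range(5)]

    # An observer who knows A, C, M and any single output can reproduce
    # the rest of the sequence exactly:
    replay = lcg(state=outputs[0])
    assert [next(replay) for _ in range(4)] == outputs[1:]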

For the sequence of events modality, efficiency is important since we've lost the context of discovery/prediction. For instance, any finite sequence can obviously be perfectly described by itself, but we're often interested in compression. One can imagine an optimization problem where we want to achieve a representation of the sequence which has the simplest model and the lowest cumulative "error". Error and model can be defined in many ways, but oftentimes models have a "random" nature to them. It's an efficient shorthand in situations where the deterministic explanation for a sequence is difficult or impossible to describe.


I miss the obligatory mention of D.E. Knuth - The Art of Computer Programming:

http://www.informit.com/articles/article.aspx?p=2221790

I remember a friend and I used to play with this chapter while in college, complaining about how bad the random number generator in Pascal was because it could not pass a cube test (get 3 random values as x,y,z points and plot them in a cube - after 1000-2000 iterations the cube showed a lot of empty places - some PRNGs even showed checkerboard patterns).
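
For the curious, a rough sketch of that cube test against a notoriously bad generator (IBM's RANDU); the plotting is left out here, but the structure shows up numerically:

    # Sketch of the "cube test": take consecutive outputs three at a time
    # as (x, y, z) points. Bad LCGs such as the infamous RANDU collapse
    # onto a handful of planes instead of filling the cube.
    def randu(seed):
        while True:
            seed = (65539 * seed) % 2**31
            yield seed / 2**31          # scale to [0, 1)

    gen = randu(1)
    points = [(next(gen), next(gen), next(gen)) for _ in range(2000)]

    # RANDU satisfies x[k+2] = 6*x[k+1] - 9*x[k] (mod 1), so the points
    # lie on at most 15 planes -- the "empty places" the parent saw.
    for x, y, z in points[:5]:
        print((z - 6*y + 9*x) % 1)      # always 0.0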


"Even Dr. Steven Novella, host of the Skeptics Guide to the Universe podcast, has said on the air that the digits of pi are random. They are not. First of all, they are 100% predictable by calculating pi."

Does this mean that randomness is just our inability to "predict"? If we could see the future just as clearly as we see the present and the past, would there be randomness?


I find it's often more useful to replace the "random" property of an object with the "incapable of predicting" property of the observer/consumer/receiver/attacker/whatever.

This is basically the approach taken by information theory: one of Shannon's many insights is that the content of a message doesn't matter for communicating it; only how well the receiver can predict it. If there are two possible messages, it doesn't matter if they're "yes"/"no" or "invade Portugal"/"bring me a sandwich", the information being communicated is the same (1 bit).

In this way, something is "random" if it's maximally unpredictable, i.e. if it can't be compressed. If we pick a Universal Turing Machine as the receiver, we get the characterisation of randomness given by Kolmogorov complexity.
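
One way to make that concrete, with an off-the-shelf compressor standing in for the (uncomputable) shortest program; the data below is made up for illustration:

    import os, zlib

    # Crude proxy: a general-purpose compressor instead of Kolmogorov's
    # shortest program. Structured data shrinks; random bytes essentially don't.
    patterned = b"invade Portugal " * 1000
    random_bytes = os.urandom(len(patterned))

    print(len(zlib.compress(patterned)) / len(patterned))        # small ratio
    print(len(zlib.compress(random_bytes)) / len(random_bytes))  # close to 1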


From https://en.wikipedia.org/wiki/Kolmogorov_complexity#Kolmogor... :

Kolmogorov randomness defines a string (usually of bits) as being random if and only if it is shorter than any computer program that can produce that string. To make this precise, a universal computer (or universal Turing machine) must be specified, so that "program" means a program for this universal machine. A random string in this sense is "incompressible" in that it is impossible to "compress" the string into a program whose length is shorter than the length of the string itself. A counting argument is used to show that, for any universal computer, there is at least one algorithmically random string of each length. Whether any particular string is random, however, depends on the specific universal computer that is chosen.


Yes, one way to grapple with the concept of "random" is to define it as "unpredictable", but that does imply that the randomness of an event is contingent on the observer.

According to Information Theory, the measure of the information content of a message is its unpredictability.

This seems to imply that randomness is information. (This is also supported by Kolmogorov, randomness is less compressible, therefore must contain more information.)

An interesting question is, where does this information "come from"?

And yes, if the omniscient God can predict everything He is an observer to Whom nothing is random.

Now, is He trapped in a static Universe, or can He effectively "stir the pot", and bring "new" information/randomness into existence? But at this point the discussion moves from mathematics to philosophy and right into theology, so ... um... yeah...

Numbers, eh? What a trip.

-----

I just realized this comment is basically just rehashing what chriswarbo said... :-/


Is God powerful enough to create information that even He could not predict? I think these paradoxes actually highlight that there's something wrong with the idea of an omniscient agent, at least as we conceptualize it.


Let's put quantum mechanics aside and suppose we live in a universe that is 100% deterministic. In that universe, we still can't predict chaotic and overly complex things very well, like rolling a die or shuffling a deck of cards, even though in principle it would be possible if we had incredible computer models and incredible knowledge about the state of the universe beforehand. That's the kind of randomness that we're talking about when we talk about predictability. It turns out most kinds of randomness most everyone deals with are that kind of randomness. Even in our universe with all its quantum weirdness, we could in principle perfectly predict most die rolls with good enough measurements and computers. It turned out that's what we meant by randomness all along. It isn't tied to physics or philosophy. Most of the things we call random aren't truly "random" in the strictly physical sense, so we realized that we ought to have a theory that is agnostic to physics and philosophy. It turns out that the predictability version of randomness does a great job at both, so we don't have to talk about that other stuff when we apply it.


Randomness is a statistical quality and isn't necessarily tied to physical phenomena. It just depends on what you apply the statistics to.


I agree with all of this, but the question of true randomness is, of course, of the greatest importance.


I had similar thoughts but was too embarrassed to ask as I am not a mathematician.

Basically, is randomness a human idea/concept or does it really exist? This is sort of along the lines of whether or not math really exists (mathematical structures) or is just a human way of modeling the universe (Mario Livio and a few others have written books on this).

I don't know the answer but it seems to have a habit of getting philosophical rather quickly.


It gets a lot easier if you restrict yourself to a definition of random that puts some restrictions on the entity that needs to predict the events. If you for example say that you can only spend a polynomial amount of time looking at the first k bits to predict the (k+1)-th bit, then you have a rigorous definition where you can prove that such randomness exists (at least under standard complexity-theoretic assumptions).
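
A toy version of that game, if it helps: one cheap predictor, one obviously flawed source, and one (hopefully) good one. The real definition quantifies over all polynomial-time predictors; the sources and names here are my own.

    import os, random

    def majority_predictor(bits):
        # Cheap predictor: guess whichever bit has been more common so far.
        return int(sum(bits) * 2 > len(bits))

    def score(source_bits, predictor):
        hits = 0
        for k in range(1, len(source_bits)):
            hits += predictor(source_bits[:k]) == source_bits[k]
        return hits / (len(source_bits) - 1)

    biased  = [1 if random.random() < 0.7 else 0 for _ in range(5000)]
    uniform = [b & 1 for b in os.urandom(5000)]

    print(score(biased, majority_predictor))   # ~0.7, well above chance
    print(score(uniform, majority_predictor))  # ~0.5, no advantage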



The definition of randomness is that which is statistically unpredictable. There would still be randomness if we could remember the future because we don't have perfect information about the past, and the universe is inherently random at the quantum level.


Not necessarily. Perhaps we live in a many-worlds universe, with no collapse.


Consciousness depends on decoherence (collapsing into one possible universe), which happens randomly (there's no way to predict which states you observe).


Our subjective experience feels like being in one Everett branch, sure. But if you really were in a superposition, what would you expect to experience differently?


I don't know if subjective experience is possible while self-coherent. You need to entangle with only one of your states to have any experience.


I don't understand your claim.

Interacting with something in a superposition creates entanglement. In practice most of you is in contact with most of the rest of you, so any superposition quickly spreads to your whole body and more generally the wider universe. At that point we can call it "decoherence" but there's no collapse. Just your wavefunction becomes cleanly separable into distinct pieces. But the subjective experience of being in either component would feel, well, normal.


Yes, it's normal so long as you are in one component or the other. I don't think that you can experience a superposition of being coherent among both components—of existing in both worlds. I think that we're always unconscious in a Schrodinger state.


Some people like to point out a difference between random and uncertain.

- Random: it's 1/6 for each result, but I don't know which it will be.

- Uncertain: I don't even know what the distribution is, or what the variables are.

Random can also be broken down into "there's no way with any information you can say what the outcome will be" and "you don't have the information to the precision you need, so to that extent it's unpredictable". So for instance there are nonlinear dynamical systems where teeny tiny variations cause the outcomes to vary a lot. They're actually deterministic, but they feel random. With quantum stuff like decay, you actually can't know when exactly it happens, but you can say something about the distribution and how the distribution is affected by various laws. For instance, there's a famous muon decay experiment where special relativity changes the observed decay rate.
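
A tiny example of the first kind (deterministic but unpredictable in practice) is the logistic map, a textbook chaotic system; the starting values below are arbitrary:

    # Deterministic chaos: the logistic map x -> r*x*(1-x). Two starting
    # points that differ by 1e-12 agree at first, then diverge completely.
    r = 4.0
    a, b = 0.3, 0.3 + 1e-12
    for step in range(60):
        a, b = r * a * (1 - a), r * b * (1 - b)
        if step % 10 == 9:
            print(step + 1, round(a, 6), round(b, 6), round(abs(a - b), 6))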


I think by uncertain you are referring to Knightian uncertainty.


Your "random" is also called probability.


And even bloggers get randomness wrong!

"Most of us have an intuitive sense that random things are evenly distributed, which is true in the very very long run, but not true at all on the scales we generally experience things."

This is not true, actually. A random variable can have a uniform distribution, but it doesn't have to. It might have a normal distribution, or even some kind of skewed distribution.
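
For instance (the particular distributions below are arbitrary, just to show the contrast):

    import random, statistics

    # Three perfectly legitimate "random" variables with very different shapes.
    uniform = [random.uniform(0, 1)    for _ in range(100000)]
    normal  = [random.gauss(0.5, 0.1)  for _ in range(100000)]
    skewed  = [random.expovariate(2.0) for _ in range(100000)]  # lopsided

    for name, xs in [("uniform", uniform), ("normal", normal), ("skewed", skewed)]:
        print(name, round(statistics.mean(xs), 3), round(statistics.median(xs), 3))
    # Mean and median agree for the symmetric ones and split apart for the
    # skewed one -- all three are random, none is obliged to look "even".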


"Random" is a fancy way of saying "the gods" did it, which in turn means we have no idea why things went down that way.

In earlier times if you rolled dice and won, you would think "the gods favor me today". Nowadays, you think "I am lucky today". In almost all cases where an analogous situation can be found historically, randomness (sometimes combined with extremely long periods of time) replaced God as an explanation.

Also, there are already people who worship RNGesus in the extremely rudimentary virtual realities we can construct.


Strange that the article mentions nothing about Kolmogorov randomness[1].

[1] https://en.wikipedia.org/wiki/Algorithmically_random_sequenc...


I'm wary of the definition of random being 'unable to predict' - because that means randomness is entirely contextual, and not objective.

As for 'the digits of pi' - well, they're purely random in some contexts, certainly not in others.

Though it may be difficult to produce - surely, there exists a theoretical random number generator that produces numbers which cannot be predicted in any context, and is therefore 'truly random'.

I believe some quantum interactions are, as far as our current understanding, 'random' in this regard, no?


> I'm wary of the definition of random being 'unable to predict' - because that means randomness is entirely contextual, and not objective.

Tell me more, what do you mean by this? If the context changes the outcome, doesn't that imply some predictive power?

> Though it may difficult to produce - surely, there exists a theoretical random number generator that produces numbers which cannot be predicted in any context

Attach a Geiger counter to a computer, and you have a non-theoretical true random number generator. Is that what you meant, or are you talking about software RNGs?

https://en.wikipedia.org/wiki/Hardware_random_number_generat...

One of the challenges with physical inputs is ensuring the RNG output is uniform over time, e.g., equal chances of numbers between 0 and 0.5 as between 0.5 and 1.
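
One standard fix for that sort of bias is von Neumann's debiasing trick; a sketch below, with a made-up biased bit source standing in for the physical input (not any particular Geiger-counter driver):

    import random

    def biased_source(p=0.8):
        # Stand-in for a physical source whose raw bits are not uniform.
        while True:
            yield int(random.random() < p)

    def von_neumann(bits):
        # Read bits in pairs: 01 -> 0, 10 -> 1, discard 00 and 11.
        # Output is unbiased as long as successive raw bits are independent.
        while True:
            a, b = next(bits), next(bits)
            if a != b:
                yield a

    clean = von_neumann(biased_source())
    sample = [next(clean) for _ in range(10000)]
    print(sum(sample) / len(sample))   # close to 0.5 despite the 0.8 bias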


What I am saying is that the negative definition of 'unable to predict' depends on the contextual information of an observer.

By that definition, nothing is truly random - it just depends on the ability and knowledge of an observer in a given context.

You say 'a Geiger counter is random'. Well - what if you had a scanner, and a powerful computer, and knew the arrangement and composition of the material, and could 'predict', on some level, when those atoms would produce radiation (I know this is probably impossible) - but suppose you could. Then it would not be 'random'.

I wonder if we can consider 'randomness' as an inherent quality of a system ...

Though I admit 'something that is not predictable' is a nice way to communicate it; it may even help regular people understand it better.


Ah, you're talking about the philosophical concept of whether random even exists.

As far as we (humanity) know, true randomness is still a thing. Quantum effects have no known predictors. From the article I linked: "Quantum mechanics predicts that certain physical phenomena, such as the nuclear decay of atoms, are fundamentally random and cannot, in principle, be predicted"

So quantum science holds that true randomness does exist. Until someone demonstrates otherwise, or I start working on the problem, I'm good accepting that for now. The tests to demonstrate these ideas are fascinating: https://en.wikipedia.org/wiki/Bell_test_experiments

In the meantime, there's the very real concept of whether something is practically predictable, and a Geiger counter is currently not practically predictable. (And worth noting, it falls into the category of currently not theoretically predictable too.)

Personally, I feel like the definition "unpredictable" is quite good, especially for regular people. Going deeper than that requires all kinds of baggage and explanation. But the point the author made is very good -- people tend to start expecting things when "randomness" is involved, and that expectation is a problem, because randomness is unpredictable.


Agreed. I tend to cringe at essentialism, which can be applied to any concept. At the same time, I cringe at its opposite. If the idea of an unknown entity is not entertained, held in suspension as it were, in one's mind, I think that theory suffers. As you point out, "practically predictable" is where I would stand. Or "random enough". When I was little, in France, it was explained to me that the metre was defined according to the length of a platinum alloy standard sitting in an institute in Paris... Bottom line: randomness is a philosophical concept; predictability is a scientific concept, asymptotic to 0, another needed absolute...


> By that definition, nothing is truly random

First of all, that statement doesn't really make much sense. By that definition, everything that you are unable to predict is random (to you), so, assuming that there are things that you are unable to predict, there are things that, from your perspective, are truly random, by that definition(!).

Unless, of course, by "truly random" you mean "random as per the definition that I prefer", in which case all you are really saying is that that definition differs from your preferred definition, which might be true, but isn't really much of an argument.

> I wonder if we can consider 'randomness' as an inherent quality of a system ...

No, that would be a useless concept. Randomness as an inherent quality of a system would be equivalent to the claim that we will never be able to predict the respective attribute of the system, i.e. it's a claim about the impossibility to ever know something. It's a perfectly valid concept, in that there might be things that we will never know, but it's utterly useless because we will never know that we won't ever know them, i.e., it's undecidable which elements are in the set and which are outside, unless we figure out how to predict them, in which case they don't fall under either definition anymore. The only thing we can say is that we don't know how to predict something right now.

The closest you can get to "objective randomness" is something like Bell's inequality--but even that is falsifiable in principle.


> what if you had a scanner, and a powerful computer, and knew the arrangement and composition of the material, and could 'predict' on some level, when those atoms would produce radiation (I know this is probably impossible)

Quantum theory specifies that this is impossible not just in practice but in principle. Radioactive decay on an individual-atom level is literally impossible to predict, and not just because we don't have sensitive enough measurements. "Arrangement and composition" don't have their standard meaning at atomic scale; particles are probability fields, not BBs.


It depends on whether you want randomness to be a mathematical or a physical concept. If you want randomness to be a physical concept, then it depends on interpretations of quantum mechanics and physics in general. If you let it be a mathematical concept independent from the physical universe, then it includes the physical version, and so, so much more. It doesn't mean that randomness is entirely contextual, but that it can be applied to those events too. It's a more general definition.


You can make it somewhat objective, e.g. a "Martin-Löf random" real number is a number where no algorithm can predict the next digit from previous digits better than chance. A specific real number can have that property, in fact most of them do. (Before you ask, pi is not random in that sense, because there's an algorithm that predicts digits of pi with 100% accuracy.)
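
To spell out the pi example: the "predictor" is just any program that computes pi. The snippet below leans on the third-party mpmath package purely for convenience; any arbitrary-precision pi routine would do.

    from mpmath import mp   # assumes mpmath is installed

    mp.dps = 110
    PI = mp.nstr(mp.pi, 105)          # "3.14159..." to ~105 digits

    def predict_next_digit(k):
        # "Predict" the k-th digit after the decimal point. It is right
        # every time, so pi's digits are as compressible -- and as
        # non-random in the Martin-Löf sense -- as a sequence can be.
        return PI[1 + k]              # PI[0] = '3', PI[1] = '.', PI[2] = '1'

    print([predict_next_digit(k) for k in range(1, 11)])   # 1 4 1 5 9 2 6 5 3 5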


Wikipedia states:

> Randomness is the lack of pattern or predictability in events.

That may be a better phrasing.


I understand the negative definition concept, and the practicality of it ... I wonder if there is a better way to describe it.


There are two kinds of random that would fit the proposed definition: true random and cryptographically secure pseudorandom.


true random is random?

Are you sure? :P


It depends on our ability to deduce anything about its underlying process.


To be honest I found this article a bit of a let down.

I'm no mathematician or physicist, but I've thought about randomness a bit, though I don't have a satisfying understanding of it.

First, we probably (haha) need to at least distinguish between epistemic randomness -- an outcome is unpredictable because we the observers have imperfect knowledge -- and real randomness, where an outcome is simply unknowable, period, and therefore undetermined and separate from the laws of classical physics. (That's where the quantum part comes in.)

A roll of the dice is epistemically random -- we don't know how it will land because we can't get and calculate accurate data quickly enough. Presumably a god could. Apollo, the god of prophecy and of Delphi, seems fitting here. Don't go playing dice with him. He sees the dice in mid-air and knows on what atoms they'll land, what the wind speed is, the exact weight and spin; he saw the thrust of your fingers as they tossed the dice, and he can call out the result before they land and fall still. Indeed, if the universe is deterministic, he could have known since the beginning of time what you'd roll today. Lord Apollo would simply consult the great chain of cause and effect. Likewise, he'd know the outcome of all of your pseudorandom generators. The article seemed to deal only with epistemic randomness, right?

Then there's the "real" randomness that is also called ontic (as in ontological), or quantum randomness. According to quantum mechanics, if I've understood it correctly, unstable atoms decay at _truly_ random intervals, intervals that have no cause whatsoever. This is perhaps to be disputed, or will be disputed, but hey, I'm happy that there's at least a chance for some indeterminacy in this universe. Even Apollo presumably couldn't guess quantum randomness. And so, maybe he couldn't guess your thoughts, because without some indeterminacy I don't see how there can be such a thing as free will.

So the big question to me is, where and to what degree, if at all, does real randomness exist? If it doesn't exist, then presumably there is also no free will, although some "compatibilists" believe that determinism and free will are, well, compatible. From the little I've understood of their arguments (made by truly brilliant people like David Hume and many others), there's usually some sneaking in of a bit of indeterminacy, a bit of ontic randomness, when we (and probably they also) have a hard time noticing it.

So anyway, that's the little I understand of randomness, but it's deeply interesting. I wish the article had been also. What I wish I could understand better would be how much some quantum randomness could affect our larger world, and if it could affect our neurology enough to grant us free will. Hmm... And now, back to work... ;)



