> [George Pólya] dissected our intuitive "common sense" into a set of elementary qualitative desiderata and showed that mathematicians had been using them all along to guide the early stages of discovery, which necessarily precede the finding of a rigorous proof. The results were much like those of James Bernoulli’s "Art of Conjecture" (1713), developed analytically by Laplace in the late 18th century; but Pólya thought the resemblance to be only qualitative.
> However, Pólya demonstrated this qualitative agreement in such complete, exhaustive detail as to suggest that there must be more to it. Fortunately, the consistency theorems of R. T. Cox were enough to clinch matters; when one added Pólya’s qualitative conditions to them the result was a proof that, if degrees of plausibility are represented by real numbers, then there is a uniquely determined set of quantitative rules for conducting inference. That is, any other rules whose results conflict with them will necessarily violate an elementary—and nearly inescapable—desideratum of rationality or consistency. But the final result was just the standard rules of probability theory, given already by Bernoulli and Laplace; so why all the fuss? The important new feature was that these rules were now seen as uniquely valid principles of logic in general, making no reference to "chance" or "random variables"; so their range of application is vastly greater than had been supposed in the conventional probability theory that was developed in the early twentieth century. As a result, the imaginary distinction between "probability theory" and "statistical inference" disappears, and the field achieves not only logical unity and simplicity, but far greater technical power and flexibility in applications.
This is wrong. A random process (gene mutation) operating within a randomly changing environment is still a 'net random process'.
The author tricked himself.
The purely physical/materialist perspective does have this paradox: we're just purely random bags of particles, and there actually cannot be any such thing as 'intelligence' or 'life' or 'love' or 'language' - just the appearance of it.
If you throw a bag of a trillion^trillion^trillion particles into a purely self-contained environment (i.e. the Universe) and let it stir for a while, then whatever is going on - at least from a materialist perspective - is random.
Scientists realize this, and there's a new field of thought called 'emergence' which at least tries to grapple with it 'one step' away from materialism; they toy with the idea that properties of complex entities may 'emerge' independently of their simpler constituent parts.
"random mutation is an important element in evolution, but less important than the process of natural selection, which is not random at all"
So their point is that random mutation occurs but is then filtered non-randomly, like a high-pass filter on noise. It's still a bit of an overstatement to say "not random at all" since there is obviously some (bad) luck in "fit" individuals dying or "unfit" individuals surviving. In the long-run though, you can see their point.
They go on to make a ridiculous claim that because you are here, nothing random happened to your ancestors. I would say it's just one possible history; perhaps everyone makes mistakes with randomness.
How about if you filter every digit that is not exactly one more than the last unfiltered digit?
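That filter can be made concrete. A minimal sketch, using the first 50 decimal digits of pi hard-coded as a string (the helper name `successor_filter` is mine, not from the thread):

```python
# First 50 decimal digits of pi, hard-coded for illustration.
PI_DIGITS = "31415926535897932384626433832795028841971693993751"

def successor_filter(digits: str) -> list[int]:
    """Keep a digit only if it is exactly one more (mod 10)
    than the last digit we kept."""
    kept = [int(digits[0])]
    for ch in digits[1:]:
        d = int(ch)
        if d == (kept[-1] + 1) % 10:
            kept.append(d)
    return kept

print(successor_filter(PI_DIGITS))  # -> [3, 4, 5, 6, 7, 8, 9, 0, 1]
```

The "random" digits of pi, passed through a non-random filter, yield a perfectly predictable counting sequence - which is the point: the filter, not the source, carries the structure.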
This seems to lead to a meaningless/useless definition paradox: 'randomness' comes to mean everything and nothing. It's one of the things that led me away from an interest in the materialist/physicalist position. Like you said elsewhere, it's hard to swallow the idea that the best we can achieve in defining 'randomness' is a negative definition; I find it problematic that materialism seems incapable of offering a positive, meaningful/useful one.
I doubt you'll find any better usage of randomness in other fields; in many ways, "random" is used to censor things our theories can't represent and wrap them in stochastic approximators. This is useful, because we can then use the stochastic models as a bound on the possible outcomes caused by things we can't model (for various reasons). This lets us calculate useful predictions involving things we don't or can't know.
I don't think randomness is a negative definition, just that our primary usage of the concept is boxing up unmodelable things.
Nietzsche's "eternal return" posited just this type of situation: a limited (though huge) amount of stuff and a limitless amount of time. With those two, you'd presumably relive your life an endless number of times.
As an aside, I personally find a heartwarming way of thinking about this idea: fictionalize it and imagine that "I" might live a future life as every other person, and even animal. Maybe by thinking like this I will be more likely to treat them as ends in themselves and not as means to an end - that is to say, closer to Kant's "categorical imperative". That would be nice.
From a Newtonian/Classical perspective - yes.
But we're at quantum physics now - so no.
Every single particle interaction is basically 'inherently random'.
Also - what you are saying would apply to 'us' as well: that we are all deterministic ... but again, because we live in a quantum world (as far as we know) - no, it's random.
Purely random is where materialism stands now.
I should add that materialism is just one metaphysical presupposition.
Well, maybe not! It's possible that indeterminism may be found in classical mechanics; the classic example is Norton's Dome. If you read the discussions around it, you'll see that there isn't yet any general agreement as to why its paradox is invalid. That's a potentially fruitful avenue for research, methinks.
I completely agree that the classical determinism I mentioned also applies to us. One of my questions is whether and then how and then to what degree the randomness from quantum events can allow free will and then free action.
So, I wonder how much quantum randomness affects things. For example, atomic clocks face quantum indeterminacy but are nonetheless the most accurate clocks we have, by far. Of course, once you add things like eternities, this can surely get very different... but 13 billion years is very far from an eternity...
Also I do understand that materialism is only one of many metaphysical suppositions. But it's the one that we can seem to do the most with. Hard to do much with metaphysical idealism, for example...
Anyway I'm no scientist or mathematician or philosopher so I'm out of my depth here. I'd be interested in your thoughts though. Thanks ~~
But I see your point about how some macro-level things seem to be that way.
Two popular contexts for which this works are (a) I am attempting to predict an event and cannot do so deterministically using information I and resources R, or (b) I am attempting to characterize a sequence of events in an efficient way. Randomness is thus always determined against a resource constraint.
For instance, the digits of pi are random in the context where you aren't aware that the digits are arising from a deterministic process and don't have the resources to discover this.
Another interesting example concerns the randomness of a PRNG. Given knowledge of the algorithm and the hidden state it is obvious that this is deterministically predictable, but eliminating the hidden state information destroys the predictability of the PRNG events.
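A minimal sketch of that point, using a linear congruential generator (the constants are the well-known Numerical Recipes parameters; the function name `lcg_step` is mine): anyone holding the 32-bit state predicts every future output exactly, while an observer who sees only the stream has no such shortcut.

```python
def lcg_step(state: int) -> int:
    """One step of a 32-bit linear congruential generator
    (Numerical Recipes constants)."""
    return (1664525 * state + 1013904223) % 2**32

# With the hidden state in hand, the entire future stream is determined:
state = 42          # the "hidden" state, chosen arbitrarily here
stream = []
for _ in range(5):
    state = lcg_step(state)
    stream.append(state)
print(stream)
```

Publish the stream but withhold the state, and the outputs are "random" to the receiver in exactly the resource-bounded sense described above.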
For the sequence of events modality, efficiency is important since we've lost the context of discovery/prediction. For instance, any finite sequence can obviously be perfectly described by itself, but we're often interested in compression. One can imagine an optimization problem where we want to achieve a representation of the sequence which has the simplest model and the lowest cumulative "error". Error and model can be defined in many ways, but oftentimes models have a "random" nature to them. It's an efficient shorthand in situations where the deterministic explanation for a sequence is difficult or impossible to describe.
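The compression framing can be sketched directly, with `zlib` standing in as a crude model-fitter (this is only a proxy for true incompressibility, not a computation of Kolmogorov complexity):

```python
import os
import zlib

# Compressibility as a rough randomness proxy: structured data admits a
# short "model", while noise does not.
structured = b"ab" * 500    # 1000 bytes with an obvious pattern
noise = os.urandom(1000)    # 1000 bytes from the OS entropy pool

print(len(zlib.compress(structured)))  # tiny: the model is just "repeat 'ab'"
print(len(zlib.compress(noise)))       # ~1000+: no shorter description found
```

The structured sequence compresses to a handful of bytes; the noise typically comes out slightly *larger* than it went in, since the compressor finds no model and must pay container overhead.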
I remember a friend and I used to play with this chapter while in college, complaining about how bad the random number generator in Pascal was because it could not pass a cube test (get 3 random values as x,y,z points and plot them in a cube - after 1000-2000 iterations the cube showed a lot of empty places; some PRNGs even showed checkerboard patterns).
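Which generator a given Pascal compiler shipped varied, but the canonical example of exactly this plotting failure is IBM's RANDU, whose consecutive triples fall on just 15 planes inside the unit cube. A sketch:

```python
def randu(seed: int, n: int) -> list[int]:
    """RANDU, the infamous IBM LCG: x -> 65539 * x mod 2^31.
    Consecutive (x, y, z) triples fall on only 15 planes, so a 3-D
    scatter plot of them shows obvious structure (the 'cube test')."""
    out, x = [], seed
    for _ in range(n):
        x = (65539 * x) % 2**31
        out.append(x)
    return out

xs = randu(1, 1000)   # seed must be odd
# The hidden linear relation behind the visible planes:
# x[i+2] == (6*x[i+1] - 9*x[i]) mod 2^31 for every i.
```

The relation follows from 65539 = 2^16 + 3: squaring it mod 2^31 collapses to 6·65539 − 9, so every third value is a fixed linear combination of the previous two - precisely the structure the cube test exposes.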
Does this mean that randomness is just our inability to "predict"? If we could see the future just as clearly as we see the present and the past, would there be randomness?
This is basically the approach taken by information theory: one of Shannon's many insights is that the content of a message doesn't matter for communicating it; only how well the receiver can predict it. If there are two possible messages, it doesn't matter if they're "yes"/"no" or "invade Portugal"/"bring me a sandwich", the information being communicated is the same (1 bit).
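Shannon's point can be made concrete: the entropy of a source depends only on the probabilities, not on what the messages say. A minimal sketch (the helper name `entropy_bits` is mine):

```python
import math

def entropy_bits(probs: list[float]) -> float:
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit, whether the two messages are
                                 # "yes"/"no" or "invade Portugal"/"bring me a sandwich"
print(entropy_bits([1.0]))       # 0.0 bits: a certain message carries no information
```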
In this way, something is "random" if it's maximally unpredictable, i.e. if it can't be compressed. If we pick a Universal Turing Machine as the receiver, we get the characterisation of randomness given by Kolmogorov complexity.
Kolmogorov randomness defines a string (usually of bits) as being random if and only if it is shorter than any computer program that can produce that string. To make this precise, a universal computer (or universal Turing machine) must be specified, so that "program" means a program for this universal machine. A random string in this sense is "incompressible" in that it is impossible to "compress" the string into a program whose length is shorter than the length of the string itself. A counting argument is used to show that, for any universal computer, there is at least one algorithmically random string of each length. Whether any particular string is random, however, depends on the specific universal computer that is chosen.
According to Information Theory, the measure of the information content of a message is its unpredictability.
This seems to imply that randomness is information. (This is also supported by Kolmogorov, randomness is less compressible, therefore must contain more information.)
An interesting question is, where does this information "come from"?
And yes, if the omniscient God can predict everything He is an observer to Whom nothing is random.
Now, is He trapped in a static Universe, or can He effectively "stir the pot", and bring "new" information/randomness into existence? But at this point the discussion moves from mathematics to philosophy and right into theology, so ... um... yeah...
Numbers, eh? What a trip.
I just realized this comment is basically just rehashing what chriswarbo said... :-/
Basically: is randomness a human idea/concept, or does it really exist? This is sort of along the lines of whether Math (mathematical structures) really exists or is just a human way of modeling the universe (Mario Livio and a few others have written books on this).
I don't know the answer but it seems to have a habit of getting philosophical rather quickly.
Interacting with something in a superposition creates entanglement. In practice most of you is in contact with most of the rest of you, so any superposition quickly spreads to your whole body and more generally the wider universe. At that point we can call it "decoherence" but there's no collapse. Just your wavefunction becomes cleanly separable into distinct pieces. But the subjective experience of being in either component would feel, well, normal.
- Random: it's 1/6 for each result, but I don't know which it will be.
- Uncertain: I don't even know what the distribution is, or what the variables are.
Random can also be broken down into "there's no way, with any information, you can say what the outcome will be" and "you don't have the information to the precision you need, so to that extent it's unpredictable". So for instance there are nonlinear dynamical systems where teeny tiny variations cause the outcomes to vary a lot. They're actually deterministic, but they feel random. With quantum stuff like decay, you actually can't know exactly when it happens, but you can say something about the distribution and how the distribution is affected by various laws. For instance, there's a famous muon decay experiment where special relativity changes the decay rate.
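That first kind of "feels random" can be sketched with the logistic map, a standard chaotic system (the starting point and perturbation size here are illustrative):

```python
def logistic(x: float, steps: int, r: float = 4.0) -> float:
    """Iterate the logistic map x -> r*x*(1-x), which is fully
    deterministic but chaotic at r = 4."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.3, 50)
b = logistic(0.3 + 1e-12, 50)   # a teeny tiny perturbation of the start
print(a, b)
# After 50 steps the two trajectories bear no resemblance, yet rerunning
# either from the same start reproduces it exactly: deterministic, but
# unpredictable without perfect knowledge of the initial condition.
```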
Most of us have an intuitive sense that random things are evenly distributed, which is true in the very very long run, but not true at all on the scales we generally experience things.
This is not true, actually. A random variable can have a uniform (even) distribution, but it doesn't have to. It might have a normal distribution, or even some kind of skewed distribution.
In earlier times if you rolled dice and won, you would think "the gods favor me today". Nowadays, you think "I am lucky today". In almost all cases where an analogous situation can be found historically, randomness (sometimes combined with extremely long periods of time) replaced God as an explanation.
Also, already there are people who worship RNGesus in the extremely rudimentary virtual realities we can construct.
As for 'the digits of pi' - well, they're purely random in some contexts, certainly not in others.
Though it may be difficult to produce - surely, there exists a theoretical random number generator that produces numbers which cannot be predicted in any context, and is therefore 'truly random'.
I believe some quantum interactions are, as far as our current understanding, 'random' in this regard, no?
Tell me more, what do you mean by this? If the context changes the outcome, doesn't that imply some predictive power?
> Though it may be difficult to produce - surely, there exists a theoretical random number generator that produces numbers which cannot be predicted in any context
Attach a Geiger counter to a computer, and you have a non-theoretical true random number generator. Is that what you meant, or are you talking about software RNGs?
One of the challenges with physical inputs is ensuring the RNG output is uniform over time, e.g., equal chances of numbers between 0 and 0.5 as between 0.5 and 1.
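One standard fix for bias (assuming the physical source yields independent but possibly biased bits) is von Neumann's extractor; a sketch:

```python
def von_neumann_extract(bits: list[int]) -> list[int]:
    """De-bias an independent bit stream: read bits in pairs,
    emit 0 for (0, 1), emit 1 for (1, 0), discard (0, 0) and (1, 1).
    The two kept pairs are equally likely regardless of the bias."""
    out = []
    for b1, b2 in zip(bits[0::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

print(von_neumann_extract([0, 1, 1, 0, 0, 0, 1, 1]))  # -> [0, 1]
```

The cost is throughput: the more biased the source, the more pairs get discarded, but the surviving bits are uniform as long as the input bits really are independent.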
By that definition, nothing is truly random - it just depends on the ability and knowledge of an observer in a given context.
You say 'a Geiger counter is random'. Well - what if you had a scanner, and a powerful computer, and knew the arrangement and composition of the material, and could 'predict', on some level, when those atoms would produce radiation (I know this is probably impossible) - but suppose you could. Then it would not be 'random'.
I wonder if we can consider 'randomness' as an inherent quality of a system ...
Though I admit 'something that is not predictable' is a nice way to communicate it, maybe even help regular people understand it better.
As far as we (humanity) know, true randomness is still a thing. Quantum effects have no known predictors. From the article I linked: "Quantum mechanics predicts that certain physical phenomena, such as the nuclear decay of atoms, are fundamentally random and cannot, in principle, be predicted"
So Quantum science believes that true random does exist. Until someone demonstrates otherwise, or I start working on the problem, I'm good accepting that for now. The tests to demonstrate these ideas are fascinating: https://en.wikipedia.org/wiki/Bell_test_experiments
In the meantime, there's the very real concept of whether something is practically predictable, and a Geiger counter is currently not practically predictable. (And worth noting, it lands in the category of currently not theoretically predictable too.)
Personally, I feel like the definition "unpredictable" is quite good, especially for regular people. Going deeper than that requires all kinds of baggage and explanation. But the point the author made is very good -- people tend to start expecting things when "randomness" is involved, and that expectation is a problem, because randomness is unpredictable.
First of all, that statement doesn't really make much sense. By that definition, everything that you are unable to predict is random (to you), so, assuming that there are things that you are unable to predict, there are things that, from your perspective, are truly random, by that definition(!).
Unless, of course, by "truly random" you mean "random as per the definition that I prefer", in which case all you are really saying is that that definition differs from your preferred definition, which might be true, but isn't really much of an argument.
> I wonder if we can consider 'randomness' as an inherent quality of a system ...
No, that would be a useless concept. Randomness as an inherent quality of a system would be equivalent to the claim that we will never be able to predict the respective attribute of the system, i.e., a claim about the impossibility of ever knowing something. It's a perfectly valid concept, in that there might be things we will never know. But it's utterly useless, because we can never know that we won't ever know them: it's undecidable which elements are in the set and which are outside, unless we figure out how to predict them, in which case they no longer fall under either definition. The only thing we can say is that we don't know how to predict something right now.
The closest you can get to "objective randomness" is something like Bell's inequality - but even that is falsifiable in principle.
Quantum theory specifies that this is impossible not just in practice but in principle. Radioactive decay on an individual-atom level is literally impossible to predict, and not just because we don't have sensitive enough measurements. "Arrangement and composition" don't have their standard meaning at atomic scale; particles are probability fields, not BBs.
>Randomness is the lack of pattern or predictability in events.
It may be a better phrasing.
Are you sure? :P
I'm no mathematician or physicist, but I've thought about randomness a bit, though I don't have a satisfying understanding of it.
First, we probably (haha) need to at least distinguish between epistemic randomness -- an outcome is unpredictable because we the observers have imperfect knowledge -- and real randomness, where an outcome is simply unknowable, period, and therefore undetermined and separate from the laws of classical physics. (That's where the quantum part comes in.)
A roll of the dice is epistemically random -- we don't know how it will land because we can't get and calculate accurate data quickly enough. Presumably a god could. Apollo, the god of prophecy and of Delphi, seems fitting here. Don't go playing dice with him. He sees the dice in mid-air and knows on what atoms they'll land, what the windspeed is, the exact weight and spin, he saw the thrust of your fingers as they tossed the dice, and he can call out the result before they land and fall still. Indeed, if the universe is deterministic, he could have known since the beginning of time what you'd roll today. Lord Apollo would simply consult the great chain of cause and effect. Likewise, he'd know the outcome of all of your pseudorandom generators. The article seemed to deal only with epistemic randomness, right?
Then there's the "real" randomness that is also called ontic (as in ontological), or quantum randomness. According to quantum mechanics, if I've understood it correctly, unstable atoms decay at _truly_ random intervals, intervals that have no cause whatsoever. This is perhaps to be disputed, or will be disputed, but hey, I'm happy that there's at least a chance for some indeterminacy in this universe. Even Apollo presumably couldn't guess quantum randomness. And so, maybe he couldn't guess your thoughts, because without some indeterminacy I don't see how there can be such a thing as free will.
So the big question to me is, where and to what degree, if at all, does real randomness exist? If it doesn't exist, then presumably there is also no free will, although some "compatibilists" believe that determinism and free will are, well, compatible. From the little I've understood of their arguments (made by truly brilliant people like David Hume and many others), there's usually some sneaking in of a bit of indeterminacy, a bit of ontic randomness, when we (and probably they also) have a hard time noticing it.
So anyway, that's the little I understand of randomness, but it's deeply interesting. I wish the article had been also. What I wish I could understand better is how much quantum randomness could affect our larger world, and whether it could affect our neurology enough to grant us free will. Hmm... And now, back to work... ;)