
Where Quantum Probability Comes From - eaguyhn
https://www.quantamagazine.org/where-quantum-probability-comes-from-20190909/
======
n4r9
Carroll is a big advocate of the Many Worlds Interpretation, so it's nice to
see some other interpretations getting a decent hearing in this article as
well as a decent treatment of some academic concerns about MWI. That's a
testament to his humility as a researcher.

I must admit I struggle with the MWI Born rule derivations based on rational
credence. I don't see why proving that one ought to assign credence in such
and such a way is sufficient to prove that that's the way nature _is_. It
feels too much like deriving an "is" from an "ought", although in a slightly
different way than what Hume objected to!

~~~
crdrost
Yeah, I mean, at this scale you’re not going to prove how nature _is_ from any
of this. What MWI is saying is that if you admit a really difficult view of
the cosmos, then you can regard the Born rule as part of Schrödinger
evolution—but it is kind of only possible because you punted the details into
that difficult view of the cosmos. (And then the MWI-supporters come in full
force and argue that actually the view is not so “difficult” after all.)

Backing up, the problem is that you have these different approaches—depending
on how you count them you could maybe group them broadly and say there are
3-4, or count as many as a dozen or two—that are all mathematically identical. They
all predict the Born rule, but suggest different ways that Nature really
fundamentally would act to produce that rule.

Since they are mathematically identical, it is _provable_ that there is no way
to choose between them. As a result, the one which stays furthest out of
people’s way while enabling experimental results has the best “genes” for its
own reproduction in the publication of papers. And that has largely been the
Copenhagen interpretation: you, the experimentalist, have a soul, and when that
soul measures the world, the nice unitary evolution of the world comes crashing
down with probabilities given by the Born rule. Contrast this with pilot-wave
theories where you have to work out a whole separate equation that doesn’t
_do_ anything which makes any further observational impact.

The basic issue that we are facing is that while the notion of souls seems
laughable for fashionable sciencey people, it also seems in some distressing
way _inevitable_. Take Many Worlds, for instance: you admit the reality of
every single possibility of the entire universe as a much broader multiverse.
An equation, the Schrödinger equation, essentially works over, say, a Planck
time to create a vector field on top of this, saying “This instant is followed
by that instant is followed by that instant.”

In the middle of that, what uniquely qualifies _my_ experience _here and now_
as I know such experience must exist? I do not perceive a multiverse; I
perceive a changing universe. And MWI says “well actually there are a million
yous, frozen in time, all perceiving changing universes. Your experience of
motion through time is actually kind of a lie.” This is not a problem unique
to QM; it happened much earlier with special relativity, where we discovered
that you are actually a rope of worldlines thrusting through a static four-
dimensional Lorentzian manifold, every part of that rope being presumably in a
separate conscious state, perceiving itself as moving through time but we “on
the outside” can see that there is no unique present defined such that it can
wash over all of the ropes simultaneously. MWI just happens to facilitate the
same basic “unrolling of time” because it has already unrolled all of
possibility-space. And to fix it you need something—I’m calling it a soul, you
can get fancy—which “zips along the worldline” and contains my conscious
experience, or you need to argue that my experience is an illusion, or you
need some “universe-soul” to act like a coherent “present moment” for all of
us, or the like. It all kind of sucks.

There is a nice perspective sitting in the middle of this due to Andreas O.
Tell [1], and it is pleasantly agnostic while still doing something like what
MWI is doing to try and derive the Born rule from normal wavefunction
evolution. In brief, he says: “use the state-matrix formalism for QM, and take
a completely agnostic view of what the cosmos is and how it behaves. Still if
you have a local information-processing system which is embedded in the cosmos
and changing, then it receives information and must update its model of the
universe. Its model of the universe must necessarily come down to a list of
wavefunctions with a list of weightings, but there is a freedom-of-perspective
which allows you to choose the wavefunction with the highest weight as “the”
one that you think the system is in. New data just forces these weights to
cross in size, causing the Born rule when you try to determine whether those
weights will cross over and the system will be in the new state.”

In some sense, then, we can live in a very Copenhageny world where we are
changing data-processors uniquely present in some space and time, and still
use Schrödinger evolution to derive the Born rule just as the many-worlds
interpretation does; but rather than committing to its plethora of different
universes, we might be able to remain non-committal about what is in the rest
of the universe beyond what I see of it.

[1] [https://arxiv.org/abs/1205.0293](https://arxiv.org/abs/1205.0293)

~~~
n4r9
I have to be brutally honest here and say that I'm not inclined to spend much
time understanding in detail a 16-page paper from someone who has no
recognisable affiliation and a single arxiv submission that doesn't seem to
have garnered much attention. Having skimmed it and what you've written, I'm
quite confused about how the ontology differs from Many Worlds or what it
means for "weights" to "cross in size". I think the author concedes some of
the difficulty in positing a "dominant reality" when they say

> Undetectable to the observer, different alternate realities can fight for
> becoming the dominant one, at least over a short period of time. This effect
> appears to be highly unsettling and not really greatly preferable to the
> world-splitting in the Everett interpretation

~~~
crdrost
I mean I suppose it’s up to you whether you “recognize” the Universität
Konstanz, but it’s a rather large place.

Tell himself is indeed not working in the field anymore. As I understand it,
this preprint was submitted to at least one journal but was not accepted
before the grant ran out, and he could no longer keep pushing for publication;
instead he went into acoustic signal processing with a friend, and together
they started a company called SoundTheory. Something about maximizing the
information “punch” of music to your brain, to make it sound better or reduce
background noise or something.

The paper is still interesting on its own merits though. I mean, it’s
interesting to me; of course your mileage may vary.

------
nabla9
This is related to his research. His latest podcast goes into it more deeply
(there is a transcript):
[https://www.preposterousuniverse.com/podcast/2019/09/09/63-s...](https://www.preposterousuniverse.com/podcast/2019/09/09/63-solo-finding-gravity-within-quantum-mechanics/)

The goal is to derive emergent spacetime and gravity from quantum mechanics.

Some features of their theory:

  * Finite-dimensional Hilbert space. Quantum field theory gets the boot.

  * Spacetime is entangled degrees of freedom, in a way that lets semi-classical spacetime geometry emerge. Things are local because they are entangled, not the other way around.

  * Spacetime expands because initially unentangled degrees of freedom become entangled with the rest of the universe.

~~~
v77
His book just came out this week as well.

------
danbruc
_In contrast with frequentism, in Bayesianism it makes perfect sense to attach
probabilities to one-shot events, such as who will win the next election, or
even past events that we’re unsure about._

Does frequentism really require actually performing the experiment? Or is
imagining doing the experiment good enough? I would say

    
    
      »Candidate X will win the next election with a probability of Y percent.«
    

is just a shorthand for

    
    
      »The following sets of states and possible evolutions of those states are
      compatible with my knowledge about the world and in Y percent of the cases
      candidate X wins the next election.«
    

which seems not too different from a coin flip, where the different outcomes
are also due to imperfect knowledge of the initial state. The difference is that
it is easy to sample the set of initial states for a coin flip by just
repeatedly flipping a coin from slightly different initial states due to human
imperfections in doing this task. Sampling the initial states of an election
in the same way is obviously not possible, and admittedly I have no real clue
how people arrive at a meaningful number in practice. A similar example seems
to be the probability of rain at some place some time in the future, in which
case it is possible to sample the set of initial states by running a weather
model repeatedly.
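The "sampling the set of initial states" reading can be sketched as a toy simulation. Everything here (the flip model, the spin rates, the noise levels) is invented for illustration, not a physical model of a coin:

```python
import random

def coin_outcome(omega, t):
    """Toy deterministic flip: a coin spinning at omega half-turns per
    second for flight time t lands heads iff it completes an even
    number of half-turns.  (Invented model, for illustration only.)"""
    return "heads" if int(omega * t) % 2 == 0 else "tails"

def frequentist_probability(trials=100_000, seed=0):
    """Sample the set of initial states an imperfect human flipper can
    produce, and report the long-run frequency of heads."""
    rng = random.Random(seed)
    heads = 0
    for _ in range(trials):
        omega = rng.gauss(40.0, 4.0)  # spin rate varies flip to flip
        t = rng.gauss(0.5, 0.05)      # so does the flight time
        if coin_outcome(omega, t) == "heads":
            heads += 1
    return heads / trials

print(frequentist_probability())  # close to 0.5
```

On this view the election case differs only in that we cannot actually draw from the ensemble; the weather case sits in between, since running a weather model repeatedly from perturbed initial states does essentially this.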

~~~
6gvONxR4sf7o
I interpret it as a reference system. To a frequentist, a probabilistic
statement is "this coin flip is 50% heads in reference to this set of coin
flips." Your elements _of the statement_ are the event, the probability, and
the reference set of events.

------
EGreg
Aren’t space and time considered continuous?

What about the Planck distance then? What’s that all about?

It seems to me that on a microscopic level and small time scales, a small
change in input will lead to a small change in output.

This is certainly true in classical mechanics, but what about quantum
mechanics? Are the quanta the result of a continuous process? Can a subatomic
particle wind up on Mars, exceeding the speed of light, with a certain
probability?

HERE is what bothers me: the instability of certain physical problems (a small
change in input leads to a large change in output, like where a pencil will
fall if stood on its tip). How can this happen if the composition of
continuous functions is continuous?

In mathematics we have abstractions such as real numbers and infinite
sequences of functions that can converge to discontinuous and even really
weird functions in the limit.

But in the real world it seems that we have some sort of minimum, like the
Planck distance, or simple measurement error, that precludes us from reversing
a process after a certain point. Maybe THAT is where unstable problems on the
macro scale come from?
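The pencil puzzle can be made quantitative with the standard linearization of an inverted pendulum, where a small tilt grows exponentially. A minimal sketch (the growth rate and threshold are made-up numbers):

```python
import math

def fall_time(theta0, rate=10.0, threshold=1.0):
    """Linearized inverted pendulum: an initial tilt theta0 grows like
    theta0 * exp(rate * t).  Return the time at which the tilt reaches
    `threshold`, i.e. the pencil has visibly fallen."""
    return math.log(threshold / abs(theta0)) / rate

# Even absurdly tiny tilts decide quickly, because the fall time
# grows only logarithmically as the initial tilt shrinks.
for theta0 in (1e-3, 1e-6, 1e-12, 1e-100):
    print(f"tilt {theta0:8.0e}  falls at t = {fall_time(theta0):6.2f}")
```

The dynamics stay perfectly continuous: no finite tilt produces a discontinuity, but the time to decide which way the pencil falls diverges as the tilt goes to zero, so any finite measurement error makes the outcome unpredictable without a discontinuity ever appearing. That is essentially the observation in Lamport's Buridan paper.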

Pilot Wave Theory seems to say that everything is deterministic and the
uncertainty in Quantum Mechanics comes from us being unable to observe the
process that leads to the result. But PWT requires us to abolish the idea of
locality, which to me is a special case of continuity.

Anyway, can someone please explain this to me as it regards quantum mechanics?
Leslie Lamport’s paper caused a big watershed moment for me and I’m still
reeling from it:

[https://lamport.azurewebsites.net/pubs/buridan.pdf](https://lamport.azurewebsites.net/pubs/buridan.pdf)

------
macawfish
I'm not an expert, but one of the most exciting realizations I've had over the
last few years is just how close quantum theory is to various "ordinary" kinds
of probability theory, including Kolmogorov's classical theory. Now
probability theory is not so boring for me.

Quantum physics has inspired so much work in other fields! Check out this
guy's work for examples:
[https://scholar.google.com/citations?user=wdhkzPMAAAAJ&hl=en](https://scholar.google.com/citations?user=wdhkzPMAAAAJ&hl=en)

I don't agree with Khrennikov's interpretation of quantum mechanics (he's a
realist whereas I tend to appreciate the more "mystical" feeling
interpretations of quantum mechanics), but he and others' work on the
connections between quantum physics and classical probability theory, as well
as on non-physics applications of quantum-theoretic tools, is crazy
thought-provoking.

------
canjobear
If you'd like an intuitive introduction to the actual technical details of
quantum probability: [https://www.math3ma.com/blog/a-first-look-at-quantum-probabi...](https://www.math3ma.com/blog/a-first-look-at-quantum-probability-part-1)

------
grumpy8
I love this website so much (quantamagazine.org). The design is great and the
articles are amazing.

------
laser
There's something that has deeply irked me for many years about these MWI
probability constructions, and that is the largely glossed-over fact that
there's sort of a non-local numerical awareness and computation within the
wave function necessary to construct the number of branches in proper ratio
required to maintain self-consistency in the MWI universe. Additionally, this
number of branches is incomprehensibly larger than simply splitting the
universe once for every quantized event, and results in unwieldy levels of
duplication of identical branches.

The reason for this is that if we take the most improbable outcome of a given
wave function and say “This highly improbable branch occurs once”, we are
immediately contradicted, as the second-most improbable outcome almost
certainly stands in a non-integer ratio to the first. So we give the wave
function numerical factoring / self-resolving capabilities, and instead the
most and second-most improbable branches each occur the number of times
necessary to remain whole integers in the correct relative ratio. But that
only resolves two possible outcomes of the wave function, and the third-most
improbable outcome is almost certainly not in an integer ratio to the first or
the second, so we must repeat the step, again multiplying the branch counts of
the first two to maintain a consistent integer ratio across all three types of
branches. As you can see, as you follow this up through all the possible
branch outcomes, expressing their probabilities as whole-integer counts of
quantum outcomes essentially requires a massive computation of finding common
denominators all the way up. Further, even the most improbable outcome will
require an incomprehensible number of duplicate branches, and the most
probable outcomes will have an even more innumerable count of duplicate
branches still.
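The "common factors all the way up" bookkeeping can be made precise: if every outcome probability happens to be an exact rational, the minimal whole-number branch counts come from the least common multiple of the denominators. A sketch with made-up probabilities (genuinely irrational Born probabilities, the generic case, admit no finite count at all):

```python
from fractions import Fraction
from math import lcm

def minimal_branch_counts(probs):
    """Given exact rational outcome probabilities summing to 1, return
    the smallest whole-number branch counts whose ratios reproduce
    them, plus the total number of branches needed (the LCM of all
    the denominators)."""
    fracs = [Fraction(p) for p in probs]
    assert sum(fracs) == 1
    total = lcm(*(f.denominator for f in fracs))
    return [int(f * total) for f in fracs], total

# Three hypothetical outcomes: each new denominator can multiply
# the total branch count.
counts, total = minimal_branch_counts(["1/3", "1/7", "11/21"])
print(counts, total)  # [7, 3, 11] branches out of 21
```

Adding a fourth outcome with denominator 1009 would multiply the total by 1009, and so on: the count grows with the LCM of every denominator encountered, which is the blow-up the comment describes.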

The only way I can see to escape this madness with MWI seems to be to give up on
the notion of truly separate branches, and instead treat these “many worlds”
as a stream of overlapping world-ish-nesses in which discrete outcomes don’t
actually even exist, but then you have seeming contradictions in observable
discreteness and it’s not clear it’s truly even MWI anymore.

Disclosure: I’m not a physicist, and it's quite plausible that I don’t know
what I’m talking about.

~~~
madhadron
> The only way I can see to escape this madness with MWI seems to be give up
> on the notion of truly separate branches, and instead treat these “many
> worlds” as a stream of overlapping world-ish-nesses in which discrete
> outcomes don’t actually even exist

That's what it is. A measurement is coupling a quantum event to a
statistically irreversible process. The total wave function that results has
two major lobes. There's no split on measurement. That's why it's appealing:
it makes no reference to classical mechanics in the formulation.

------
Simon_says
In a shocking turn of events Quanta Magazine features an author who’s heard of
the Many Worlds Interpretation.

~~~
mercer
If I were to write an edgy/snarky comment bot for HN, I imagine its comment
history would resemble yours to a tee.

That's not particularly a criticism nor a compliment, btw.

~~~
Simon_says
Nice. You’re a better programmer than I, though.

~~~
mercer
What makes you think that's the case?

~~~
Simon_says
It follows from your being able to program snark as well as I think it. I’m
sure it’s possible, but it seems like a hard problem to me.

~~~
keiru
I wonder if one could earn a living as a professional shitposter for AI
training purposes.

