
Physicists want to rebuild quantum theory from scratch - cgmg
https://www.wired.com/story/physicists-want-to-rebuild-quantum-theory-from-scratch/
======
noobermin
All I ask is the headline not be "Physicists want..." but rather "Some
physicists want..." A couple of people I've never heard of don't speak for all
physicists; in fact, most don't.

>[...] just shrug and say we have to live with the fact that quantum mechanics
is weird.

In fact, I don't think it's weird, I think it's reality.

~~~
xelxebar
> In fact, I don't think it's weird, I think it's reality.

At the risk of sounding snarky, I think it's useful to point out that we call
it the Standard Model and not the Standard Reality.

Model ≠ Reality, yadda yadda.

We have an immensely useful model of fundamental physics as long as the
energies are low enough. However, I feel that we lose a lot by letting that
utility stop us from asking epistemic questions of the model itself.

Historically, math has gained a lot by taking such questions seriously,
especially in the areas of logic and computation.

~~~
platz
> Model ≠ Reality, yadda yadda.

I think I'm a fan of what is described as quantum Bayesianism in the article:
that quantum mechanics is basically a probability theory, and this probability
theory is required to express our _expectations_ from measurements (quantum
mechanics encodes our beliefs about what will be observed given some knowledge
about a system), and that this is fundamentally a "retreat" because we don't
know how to describe the underlying reality, even though calculating the
probabilities may be very precise.

But I am having trouble distinguishing quantum Bayesianism from the standard
"model vs reality" fare, as the motivation seems very similar.

~~~
noobermin
The thing I say is "what underlying reality?" If you can't measure it, how can
you make a theory out of it or research it (or publish papers on it unless
you're a string theorist)? If something is not measurable, then how can you
claim it exists or not?

Whenever people ask me these "deep questions" after I tell them I'm a
physicist, I tell them that. Science is all about the reality you can measure,
not about meaning and what lies "deeper".

~~~
platz
I take it you are a fan of the Copenhagen interpretation of quantum mechanics.
It's the right answer for making calculations, but how do you come up with a
new theory? You have to tell a story or envision an underlying reason to
justify it. New theories don't come out of nowhere.

This is also like saying the universe in the time of Newton consisted of only
Newton's laws, which is untrue. If the universe _is_ science, then you have to
say the universe is changing as science changes, unless the universe really is
just made of published journal papers.

------
Koshkin
Problem is, all these redefinitions of quantum mechanics via "quantum logic"
or "quantum probability theory", while looking more fundamental and formal
than the usual theory based on Schrödinger's equation and Born's rule, do not
make it any more intuitive and do not allow anyone to say, "aha! now I get
it!". This is because these new foundations themselves are just as unintuitive
as quantum mechanics...

[Edit] On the other hand, I find the famous derivation of Schrödinger's
equation by Feynman illuminating:
http://fermatslibrary.com/s/feynmans-derivation-of-the-schrodinger-equation

------
siglesias
It will be interesting to see if theoretical physics and computer science
converge at all over the next century. Intuitively there’s much to suggest
that our understanding of computational limits on things (not to mention the
idea that the world itself is a simulation of some kind) has analogs in
fundamental physical limits (speed of light, quantum noise).

~~~
f00_
I would be interested if anyone knows about the connection between information
theoretic entropy and physical entropy.

I think thermodynamics and statistical physics are really interesting. In
Parallel Distributed Processing they discuss an extension of the entropy
metaphor to learning and neural networks:

https://mitpress.mit.edu/books/parallel-distributed-processing

~~~
grondilu
> I would be interested if anyone knows about the connection between
> information theoretic entropy and physical entropy.

I'm pretty sure they're the same, with minor differences in notation and
units, unless of course I misunderstand what you mean by "physical entropy".
Entropy in physics only got a precise definition once physicists understood
that the concept really comes from information theory.

~~~
f00_
erm, I think Boltzmann figured out entropy for thermo like 100 years before
Shannon's theory of communication

could be totally wrong though

entropy, complexity, and evolution are words that come to mind

~~~
vitus
Yes, Shannon explicitly stated in A Mathematical Theory of Communication that
the general form was the same as that in statistical mechanics:

"The form of H will be recognized as that of entropy as defined in certain
formulations of statistical mechanics where p_i is the probability of a system
being in cell i of its phase space. H is then, for example, the H in
Boltzmann’s famous H theorem."

where the general form is H = -K Σ p_i log(p_i).

IIRC, this form is the only kind with the following two properties:

- chain rule -- H(X,Y) = H(X) + H(Y|X)

- maximized for the uniform distribution.
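
Both properties are easy to check numerically. A quick sketch in Python (my
own toy code, entropy in bits, i.e. K = 1 with log base 2):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Uniform distribution over 4 outcomes: maximal entropy, log2(4) = 2 bits.
print(entropy([0.25] * 4))  # 2.0

# Any skewed distribution over the same 4 outcomes has less entropy.
print(entropy([0.7, 0.1, 0.1, 0.1]) < 2.0)  # True

# Chain rule H(X,Y) = H(X) + H(Y|X), checked for independent X and Y,
# where H(Y|X) reduces to H(Y):
px = [0.5, 0.5]
py = [0.25, 0.75]
pxy = [a * b for a in px for b in py]  # joint distribution
print(abs(entropy(pxy) - (entropy(px) + entropy(py))) < 1e-12)  # True
```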

~~~
f00_
Could you explain KL divergence and/or cross entropy to me?

from wikipedia: "In mathematical statistics, the Kullback–Leibler divergence
is a measure of how one probability distribution diverges from a second
expected probability distribution"

Kullback–Leibler divergence is used a lot in the reinforcement learning
setting.

"In information theory, the cross entropy between two probability
distributions p and q over the same underlying set of events measures the
average number of bits needed to identify an event drawn from the set"

Cross entropy is used as a cost function in neural networks rather than least
squares:
http://neuralnetworksanddeeplearning.com/chap3.html#the_cross-entropy_cost_function
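
Not a full explanation, but the two quantities are tied together by one
identity: cross entropy H(p, q) = H(p) + D_KL(p || q), i.e. KL divergence is
the extra bits you pay for encoding data from p with a code built for q. A toy
numeric check in Python (distributions made up for illustration):

```python
import math

def entropy(p):
    # H(p): optimal bits per symbol for a source with distribution p
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # Average bits per symbol when events come from p but the code assumes q
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # Extra bits paid for modeling p with q; zero exactly when p == q
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]  # "true" distribution
q = [1/3, 1/3, 1/3]    # model distribution

# H(p, q) and H(p) + D_KL(p || q) give the same number:
print(cross_entropy(p, q))               # log2(3), about 1.585 bits
print(entropy(p) + kl_divergence(p, q))  # same value

# KL divergence vanishes when the model matches the source:
print(kl_divergence(p, p))  # 0.0
```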

------
girzel
I found this article's explanation of the basic state of physics to be one of
the best I've read. I know squat about the field, and so can't say if it was
accurate or not, but the layman's explanation of the core problems of quantum
theory, the probability and whatnot, was easy to grasp, and made sense.

------
erik_landerholm
Information theory will hopefully give us deeper insight into the why. If we
derive QM from information theory it might come at it from another angle and
provide different insights. I had read about people trying this in the past,
but not for a while...maybe it proved fruitless or uninteresting.

------
rocqua
I'm surprised time and time again by the removal of causal ordering from
quantum mechanics. It seems to be throwing away so much.

------
kodfodrasz
So this is a project to create the "Maxwell equations of Quantum Physics"?

------
taw55
What if quantum mechanics is the causal basis for spacetime? Then to the
observer it might appear random, because spacetime is a prerequisite for
observation. And since spacetime is also a prerequisite for causation, this
must imply time moves sideways in the quantum domain. Clearly!

I clearly had one too many drinks. My head hurts.

