> But fortunately, I knew computer science, which most physicists do not know, with the Church-Turing thesis, which roughly states that anything physical can be simulated by a computer.
But that is not what the Church-Turing thesis says. A Turing machine cannot mimic a truly random physical process, pretty much by definition.
> This means that observers can be simulated by a computer.
Except that nobody really understands what kinds of physical interactions qualify as measurements -- or even if that is the right framework to use at all.
It might not completely solve the measurement problem, in that the purported derivations of the Born rule are hotly contested, but neither does any other interpretation - it gets further than most. I don't see anyone else except pilot wave theory (which violates locality) deriving the Born rule either.
"measurement is not special, and observers are just quantum systems like any other" seems like it ought to be the default assumption to make, in the absence of any evidence to the contrary, and honestly I think the reputation of MWI as a bit crazy comes about for purely historical reasons. Namely, Bohr didn't like it and he held a lot of sway.
Physicists are human, and parallel universes is pretty strange, so most people reject the idea without thinking about it in much detail.
Most physicists' opinions shouldn't count for much anyway, because the measurement problem simply doesn't come up in our day to day work, so most haven't thought about it much.
Furthermore, plenty of physicists don't actually grasp the fundamentals of their own field - they specialise a lot and can use specialised theories to get what they need done without understanding in detail where they came from.
So I really think the decoherence people, as cranky as their website looks and as oddly as they write, ought to be the ones whose opinions count on the matter.
Edit: seriously, check out their website, straight outta the nineties:
I once met a professor at a quantum chemistry conference who argued with me that I could not converge on the exact eigenvalues of a helium atom (assuming a simplified Hamiltonian with a few Coulombic terms). He stated the oft-repeated mantra that "the Schrödinger equation can't be solved for any element other than hydrogen", so I wrote a program that brute-force diagonalized the Hamiltonian to demonstrate otherwise. He was confusing the notion of a closed-form solution (an analytical expression that gives the solution in terms of specified elementary functions) with that of a numerically exact solution (one that converges on the exact solution but can't be expressed in terms of specified elementary functions).
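The closed-form vs. numerically-exact distinction is easy to demonstrate on a toy problem. This is not the helium calculation, just the same brute-force idea applied to a 1D harmonic oscillator, whose exact levels are 0.5, 1.5, 2.5, … in units of ħω:

```python
import numpy as np

# Finite-difference Hamiltonian for the 1D harmonic oscillator,
# H = -(1/2) d^2/dx^2 + (1/2) x^2, with hbar = m = omega = 1.
# A toy stand-in for the brute-force diagonalization described above.
n, L = 1000, 20.0                      # grid points, box size
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

# Kinetic energy via the standard three-point second-derivative stencil.
off = np.full(n - 1, 1.0)
T = (-0.5 / dx**2) * (np.diag(off, -1) - 2.0 * np.eye(n) + np.diag(off, 1))
V = np.diag(0.5 * x**2)                # potential energy on the grid

E = np.linalg.eigvalsh(T + V)          # brute-force diagonalization
print(E[:3])                           # close to the exact 0.5, 1.5, 2.5
```

No closed-form expression appears anywhere; refining the grid just converges the eigenvalues toward the exact spectrum.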
Another professor who was teaching a course on statistical mechanics once said that the single particle wave function is more fundamental than the multi-particle wave function. Nevermind the fact that his research involved density functional theory, which famously fails on those cases where the molecular wave function can't be well approximated using a product of single particle wave functions.
Depends on what you mean by "multi-particle wave function". The way it is usually understood (I think), it includes all possible tensor products of single-particle wave functions. Then it should be possible, shouldn't it?
But I'm possibly just misunderstanding what you're saying.
This. Sometimes physics does feel like a religion. The fundamentals (including basic assumptions and "proofs" that are complete bs in the mathematical sense) are getting reiterated time and again till people actually start believing in them. From what I've experienced myself it basically works like this: In the beginning, you come across some claim or fundamental assumption you can't follow and make a mental note to follow up on it and sit down to fully understand it. Then you get sidetracked and next time you come across said claim, you're like "Damn, I wanted to look that up!" and here you are, making a mental note again… Now somewhere between the fifth and tenth time this happens, you've forgotten about your mental note completely and have just accepted the claim as fact.
It's basically the brain tricking you into thinking the claim is correct, the more often you read about it. It's like habitual behavior you don't actually process consciously anymore.
Physicists aren't dumb, but they just haven't looked into it more because it's surprisingly not relevant to the majority of work. So they're just repeating the confusion of past generations as if it's fact.
But, particularly in my field, we're getting seriously good at manipulating more and more exotic quantum states. I'm working on things involving 'weak' measurement, where you really need to know what counts as a measurement or not, and how much - you can do a partial measurement and only partly (appear to) collapse a wavefunction. These types of experiments are becoming more and more common, and it's happened a few times now that I've heard people say "hang on, what counts as a measurement in this context?" and then have to read a decoherence textbook in order to do their atomic physics.
With these sorts of things as well as quantum computers, decoherence and the measurement problem are intruding into a larger proportion of people's work, which means repeating the mantra of the previous generation isn't going to cut it anymore; people are going to actually read decoherence textbooks and decide for themselves.
Seems to me that there are a few phenomena that appear to be more or less random. Could they be perturbations caused by activity in dimensions we cannot perceive?
But if they exist, they probably have important (but unknown) effects. And your proposal that they explain randomness is a good example of that. But to me it doesn't feel right.
Quantum randomness somehow doesn't seem to be of that kind. Perturbations from unknown dimensions might well look like randomness, much like an RNG in a computer. But that is still just masquerading as a classical random process -- it would not show the weird correlations found in QM.
Like something poking through or pulling on a fabric from the other side, if such a property has, say, wavelike motion in a 4th spatial dimension, it may produce regular effects that manifest as fundamental features of our 3D space, or seem random to us if a bunch of things are bumping and deterministically interacting with each other in extra dimensions. Has this ever been considered in explaining things like  and ? Might have something to do with "dark" matter/energy too; i.e. stuff is there, just occupying other dimensions.
Decoherence is great at explaining the loss of interference. It may even address the "preferred basis" problem (though there seems to be good reason to believe that it might not). But ultimately what I want to know is why I experience just one outcome, and decoherence doesn't really help here.
MWI partially resolves that ("you don't just experience one; there are many of you!"), but not in a way that satisfies me.
It's not clear that a truly random process exists. There exist deterministic interpretations of QM, for instance. Certainly unpredictable processes exist, but that's an entirely different classification (whether a Turing machine halts is also unpredictable, but still deterministic).
Further, Turing machines can generate pseudo-random outputs that pass all known randomness tests.
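For example, a few lines of fully deterministic code already produce output that looks random while being exactly reproducible from its seed. Here is xorshift64, a classic toy PRNG (not cryptographic; purely illustrative):

```python
# A deterministic pseudo-random generator (xorshift64): the output "looks"
# random, yet the same seed always reproduces the identical stream.
def xorshift64(seed):
    state = seed
    mask = 0xFFFFFFFFFFFFFFFF          # keep the state to 64 bits
    while True:
        state ^= (state << 13) & mask
        state ^= state >> 7
        state ^= (state << 17) & mask
        yield state

a = xorshift64(42)
b = xorshift64(42)
run_a = [next(a) for _ in range(5)]
run_b = [next(b) for _ in range(5)]
print(run_a)
print(run_a == run_b)   # True: unpredictable-looking, but fully deterministic
```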
"One common answer is that, in a measurement, the spin (or whatever else is measured) is put in an interaction with a macroscopic environment that jitters in an unpredictable way. For example, the environment might be the shower of photons in a beam of light that is used to observe the system, as unpredictable in practice as a shower of raindrops. Such an environment causes the superposition of different states in the wave function to break down, leading to an unpredictable result of the measurement. (This is called decoherence.) It is as if a noisy background somehow unpredictably left only one of the notes of a chord audible. But this begs the question. If the deterministic Schrödinger equation governs the changes through time not only of the spin but also of the measuring apparatus and the physicist using it, then the results of measurement should not in principle be unpredictable. So we still have to ask, how do probabilities get into quantum mechanics?"
Some good references here: http://physics.stackexchange.com/questions/295527/decoherenc...
It helps explain the loss of interference, but it does not resolve the question of why and how we see one particular outcome.
It's kind of funny how the problem keeps getting pushed to higher levels of "meta":
If you consider the experimenter and his system, measurements of (non-eigenstate) quantum systems appear indeterministic to him. However, the state of [experimenter + system] is governed by an entirely deterministic equation that follows a reversible, unitary path through time. Great! But the problem is that you then have another experimenter who measures that composite system, and the outcomes he sees likewise appear indeterministic. So now you consider the system of [experimenter 2 + [experimenter 1 + system]], and we've got infinite regress — a.k.a. the measurement problem.
At the risk of sending things off on a huge tangent, it's interesting to see physicists recognizing that an infinite regress is, at least sometimes, unsatisfactory (even though there is of course nothing incoherent per se about the concept of an infinite sequence). Physicists usually tend to give short shrift to metaphysical arguments that rule out certain states of affairs on the grounds that they would involve an infinite regress of a problematic kind. But the logic you're using to argue against decoherence as a solution to the measurement problem is very similar to e.g. Aristotle/Aquinas's argument that the causal hierarchy must have a terminus. I'm not saying that in an attempt to start an argument about God. It's just interesting to see similar reasoning used in such different domains. (And of course in neither domain is it entirely clear that the reasoning works.)
Proving self-consistency is a decent achievement, and certainly nontrivial. But it is only weak evidence in favour of a position.
I don't understand what you mean by this.
This sort of thing only makes sense in the context of many-worlds QM, and it is amusing how many professed non-many-worlders say such things.
We often describe quantum systems using an entirely deterministic (Schroedinger) equation. But we don't know in what sense that equation describes the physical state of the experimenter + system, or in what sense it is "just" a probability model.
If you choose to include the whole thing as physical reality, then you are left with all the terms in the equation -- and thus all of Everett's multiple worlds. Fine, that is a logically coherent position. But it's not the only one.
I'm not saying I do or don't believe any of this (if anything, I'm interpretation-agnostic at the moment). I'm just pointing out that there is a contradiction in having one postulate demand unitary state evolution (the Schrödinger equation) for some ill-defined "system" while another postulate says that unitarity is broken at the system/environment boundary. While there's been plenty of attempts to work around this (e.g. https://arxiv.org/abs/quant-ph/0101012), I wouldn't say that anyone has formulated a consistent set of axioms that definitively resolves the issue.
> We often describe quantum systems using an entirely deterministic (Schroedinger) equation. But we don't know in what sense that equation describes the physical state of the experimenter + system, or in what sense it is "just" a probability model.
Agreed. It's certainly a useful model but it leaves out all kinds of interesting phenomena that we observe in practice (namely, relativistic and radiative effects). Curiously though, if you turn to QFT for a better probabilistic model, Haag's theorem (https://en.wikipedia.org/wiki/Haag%27s_theorem) implies that a universal Hilbert space representation cannot describe both free and interacting fields (a problem that the non-relativistic Schrödinger equation doesn't have!)
Is this scenario fundamentally different from the quantum system? Is this scenario also a "problem"?
Bell's theorem rules out theories of local hidden variables. It says nothing about non-local hidden variables, or even something more mundane like determinism (sometimes referred to as "superdeterminism").
In any case, I'm not sure Bell's theorem has much impact on my question: why is one of these things (transfer of information from a coin toss) not a problem, conceptually; whilst the other (entanglement of quantum systems) is a problem, conceptually?
(Personally, I don't find either particularly troubling; just curious to know what the philosophical distinction is, without appeals to "quantum weirdness")
Yet entanglement is observed non-locally, even backwards in time or between degrees of freedom that never co-exist. Nobody has been able to create a non-local model that doesn't require fine tuning and the idea goes against the spirit of special relativity.
Thus, I would agree with you, there is no "problem" with entanglement. Except non-locality.
Agreed. Decoherence doesn't explain particular outcomes. It explains the scale & magnitude of mixing quantum states from different systems.
I thought "what kinds of physical interactions qualify as measurements" was referring to a different part of understanding QM. Decoherence doesn't explain which outcome, it does explains "part" of (or place constraints upon) the mechanism of measurement process.
So Decoherence only explains those things after you already have the Born rule (P ∝ |Ψ|^2). But that's the very thing we are trying to explain!
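To make the point concrete, here is how the Born rule actually enters a calculation: it is simply postulated, and the outcome frequencies follow from it rather than being derived. A toy sketch with a made-up qubit state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy qubit state with unequal amplitudes, normalized by hand.
psi = np.array([1.0, 2.0j]) / np.sqrt(5.0)
probs = np.abs(psi)**2        # the Born rule P = |psi|^2, put in as a postulate

# Simulated "measurements": outcomes sampled with probability |psi|^2.
outcomes = rng.choice([0, 1], size=100_000, p=probs)
freq = np.bincount(outcomes) / len(outcomes)
print(probs, freq)            # freq tracks probs, i.e. roughly [0.2, 0.8]
```

The line defining `probs` is the entire content of the rule; nothing in the decoherence machinery tells you to square the amplitudes there.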
You don't need to dive into quantum mechanics to understand this concept; instead, consider being cloned twice while you are asleep, then killing the original.
Your current self knows what will happen; you'll fall asleep and wake up as either one or the other, becoming both but never being both.
> One does not deduce them like one do in math. The physicists who actually did this stuff apparently knew this, and considered the math more akin to toying with the models to see what happens, to see if they could get better models
Which is absolutely true. Part of the problem with quantum mechanics texts is that they're almost all descended from Oppenheimer's lectures (via Schiff's book) to graduate students who already knew this, and just needed someone to brain dump the latest techniques.
But then it's mixed with really basic misunderstandings, like
> Many physicists like to believe that this makes the underlying model irrelevant; that the matrix behaviour, its eigenvalues, is the only thing that matter. You will encounter lots of this in books about Quantum Mechanics. This however is not science, because it ignores Ockhams Razor; the models shall be the simplest ones. A sparse matrix is simpler than when it is Fourier transformed, or put into atom orbitals. (I thank Eliezer Yudkowski who gave a reminder that Ockhams razor belongs here too.)
Quoting Eliezer Yudkowsky is a useful heuristic for not taking someone seriously, but this quote implies that the author missed the whole point of linear algebra. And is falling into the traps described by Theorem IV in van Kampen's [Ten Theorems about Quantum Mechanical Measurements](http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=617...).
He probably wrote this half in jest, but it is actually a serious issue in fields like Natural Language Processing and Computer Vision that have their intellectual roots in Computer Science.
Papers in CV and NLP are about data structures, algorithms, software engineering methods, neural networks, statistical models, and so on. They do not discuss anything in particular about the structure of language or the properties of images. This is because they are descended from CS. In CS, exemplar results are ideas like QuickSort and Dijkstra's algorithm. These algorithms work on any list or graph; you do not have to know anything special about the particular properties of the list or graph you are operating on.
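As a concrete instance of that domain-agnostic style, here is QuickSort in a few lines; note that it needs to know nothing about its elements beyond the `<` comparison:

```python
# QuickSort knows nothing about what it sorts: any items supporting '<'
# work, which is exactly the domain-agnostic flavor described above.
def quicksort(items):
    if len(items) <= 1:
        return list(items)
    pivot, *rest = items
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if not x < pivot]))

print(quicksort([3, 1, 2]))        # [1, 2, 3]
print(quicksort(["b", "a", "c"]))  # ['a', 'b', 'c']
```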
As an illustrative anecdote, I went to an NLP talk given at MIT by a well-known Google/Berkeley parsing researcher. He gave a talk about a system that used neural networks to do sentence parsing. In the QA session, I mentioned the idea of verb argument structure, and asked how the system would learn, for example, that a verb like "persuade" or "argue" can take a that-complement, while other verbs cannot. He didn't really have an answer, because it wasn't the kind of thing that he worried or thought about. My guess is that he did not consider such a question to be relevant to his field.
I don't know much about NLP, but if I had to guess, the answer to your question is to include it in the training data. I don't know what model they use, but presumably it parses out sentence structure in a specific format and uses that as input to some neural network architecture.
Not really sure why he didn't answer your question, it's the exact kind of question people do NLP to answer.
Occam's razor is not a law. It's not a fact. It's a simple suggestion if you're looking for a starting hypothesis.
Not sure which way to start to investigate a phenomenon? Pick the simplest one and verify that one. It doesn't mean it's right, it doesn't mean it's wrong, just that it's a reasonable first guess.
But not a proof. Not a fact. Just a guess that's statistically more likely to be right.
Ockham's razor, when formalized as in Solomonoff Induction, suggests preferring theories with the lowest Kolmogorov complexity.
While it can be used to select priority for exploring hypotheses, it's at least as valuable as a method of choosing from among models with equal explanatory power once they've been tested and established to have equal explanatory power.
 Picturing Quantum Processes, ISBN 1108107710
How are water waves different from sound? Water waves are sound, aren't they?
The waves that you see in water are the surface expression of 3-D motion below the surface. So you get effects like the depth of a wave's motion determining its velocity. A wave that moves a very deep water column moves very fast. (One that is a half-mile deep can move as fast as a jet!) A wave that moves a shallow water column moves slowly.
Another interesting fact about waves is that there is a significant nonlinear interaction between the depth of the water and the depth of the wave. As you come to shore this causes the wave to rise up. Surfers enjoy this effect with normal wind-driven waves. But in the case of very deep and fast waves, the effect is very much like a tide unexpectedly coming in. The result is known as a tsunami or tidal wave.
These complex behaviors mean that water waves can behave very differently from light and sound.
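In the shallow-water limit, the depth-speed relation mentioned above is v = sqrt(g·d). A quick sanity check of the "as fast as a jet" claim (the depths below are just illustrative round numbers):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def shallow_water_speed(depth_m):
    """Phase speed of a shallow-water wave: v = sqrt(g * d)."""
    return math.sqrt(g * depth_m)

# A tsunami crossing a 4 km deep ocean basin:
v = shallow_water_speed(4000.0)
print(f"{v:.0f} m/s ~ {v * 3.6:.0f} km/h")   # roughly airliner speed

# The same wave over a 10 m deep shelf slows to walking-distance scales:
print(f"{shallow_water_speed(10.0):.1f} m/s")
```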
As a programmer who took quantum mechanics in college, I have to admit, I didn't understand most of it. Someone I respect once told me that if you think you understand quantum mechanics, then you don't understand any of it at all.
The mathematical portion, vector mathematics with imaginary numbers, was the only part that was interesting to me. It seemed to me that mathematical deductions were necessary because the phenomenon modeled nothing in our "reality."
Quantum Mechanics requires randomness, because determinism is scary.
Both probability and fate are functions of time, and time is the most interesting thing to look at.
Generally, Time is ignored, or at best "accounted for".
As a programmer I think time is more interesting than particles or fields or probabilities of wave-function whatever
Isn't the many-worlds interpretation generally regarded as deterministic?
I interpreted "because determinism is scary" as a tongue-in-cheek aside. Either way, time is not ignored: there is serious research effort into the concept of time in quantum mechanics.
The fact that simple arrangements of symbols on a screen can perfectly describe this behavior is mind blowing. It leads me to think there's no possible way we're not living in some type of computer simulation.
See https://physics.stackexchange.com/questions/9720/does-the-pl... (the given answer is pretty great, but beware, the author is known for being a bit hostile in his non-physics opinions)
I find this line of reasoning inadequate. In the 19th century, would you not be compelled to believe that the universe was made of mechanical pulleys, levers, and pipes because the universe operated with the perfection of a well-designed machine?
OK, so there are some phenomena that correlate with (bear a resemblance to) how computers behave. Well, there are phenomena that correlate with a great many things; we shouldn't be surprised that computers and information theory are among them.
I think the article is a bit misleading about this. We don't know if space and time are discrete or continuous. As far as we know space and time are continuous and all the current theories treat them as continuous values.
Discretization of space and time is useful for numerical simulations, if you pick the grid small enough, but it's only a trick to do the calculations in the computer.
Anyway, it is possible that someday we will discover that space-time is discrete, with a really tiny grid whose effect on calculations is correspondingly tiny. But no one knows whether that will happen.
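A minimal illustration of discretization as a calculational trick: approximate a derivative on grids of shrinking spacing and watch the answer converge to the continuum value. The spacing `h` is a knob we choose for accuracy, not a claim about nature:

```python
import math

# Discretizing a continuous quantity for computation: the grid spacing h is
# a numerical knob, not a statement that the underlying variable is discrete.
def deriv(f, x, h):
    """Central finite-difference approximation to f'(x) on a grid of spacing h."""
    return (f(x + h) - f(x - h)) / (2 * h)

exact = math.cos(1.0)                  # d/dx sin(x) at x = 1
for h in (0.1, 0.01, 0.001):
    approx = deriv(math.sin, 1.0, h)
    print(h, abs(approx - exact))      # error shrinks ~ h^2 as the grid refines
```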
From what I can follow, most of it is, unfortunately, either varying degrees of wrong or just confusing as all hell. :\
Some assorted quotes:
>The electron is not in a single place, but instead spread out over all the positions, more or less. This is called "superposition".
No, the superposition principle states that individual states (contributions) can be summed in a linear fashion. I know this is not a very clear way of wording it, but it is one of those things I think is quite hard to word but very easy to understand once you see it.
(Although one could say this is a form of superposition, namely a sum of infinite delta functions in position-space. But this would be the most confusing example to use)
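The linearity being described is just the statement that the Hamiltonian is a linear operator, which a few lines of numpy can check on a random Hermitian matrix (the 8-dimensional state space here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearity demo: a Hamiltonian is a linear operator, so a weighted sum of
# states ("superposition") maps to the same weighted sum of the images.
H = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
H = (H + H.conj().T) / 2            # make it Hermitian, like a real Hamiltonian

psi1 = rng.standard_normal(8) + 1j * rng.standard_normal(8)
psi2 = rng.standard_normal(8) + 1j * rng.standard_normal(8)
a, b = 0.6, 0.8j

lhs = H @ (a * psi1 + b * psi2)
rhs = a * (H @ psi1) + b * (H @ psi2)
print(np.allclose(lhs, rhs))        # True: states sum in a linear fashion
```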
>The smart programmer would guess at a model containing more complex math that will get the array to model several particles, but no such thing exists.
Quantum field theory.
> These examples have used cubes with a width of 1000 voxels, and of 1000 time instants. The Universe use a width of something like 10^70 voxels. The same goes for time.
I assume this (roughly) matches size-of-observable-universe / Planck length. But this is a misinterpretation of the Planck length. As far as we (I) know, spacetime is continuous. (String theorists might disagree; I am not familiar.)
Furthermore, the many-particle approach is not at all reasonable. One would (usually) use lattice QFT, where one simulates a field for each type of particle (the field can have several components) and particles are identified as excitations of this field. The best-known example is lattice QCD.
>Quantum Electro Dynamics
Quantum Electrodynamics ;)
>Richard Feynman got the nobel prize for figuring out a way of doing this. His Quantum Electro Dynamics is a sort of dynamic programming method
This part is right (I have no idea what the next few sentences are trying to say). Feynman diagrams (those fun squiggly drawings) represent the results of some awful, awful integrals. The real analytical answer is an integral over two infinite spaces, but one can do a Taylor expansion to get an answer that can actually be computed. Feynman noted that you can read off a few rules from this approximation and assign drawings to them. The answer can then be computed by summing all topologically distinct graphs instead of the terms in the Taylor sum. It's much easier than it sounds (and sure as hell more fun than doing integrals), and you can copy-paste entire sections of your diagrams as long as the inputs and outputs match up. Perfectly suitable for dynamic programming.
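The "reuse a subdiagram wherever its inputs and outputs match" trick is, in programming terms, plain memoization. A generic sketch (nothing QED-specific about it; the lattice-path function is purely illustrative of computing a sub-result once and reusing it):

```python
from functools import lru_cache

# Dynamic-programming flavor of "reuse a sub-result wherever the same
# inputs and outputs recur" -- here counting monotone lattice paths.
@lru_cache(maxsize=None)
def paths(right, up):
    """Number of monotone grid paths from (0, 0) to (right, up)."""
    if right == 0 or up == 0:
        return 1
    # Each recursive call may hit the cache: no sub-grid is ever recomputed.
    return paths(right - 1, up) + paths(right, up - 1)

print(paths(20, 20))   # binomial(40, 20); instant thanks to the cache
```

Without the cache this recursion is exponential; with it, each distinct sub-problem is evaluated exactly once, which is the same economy the diagram bookkeeping buys.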