This paper is basically statistical mechanics with a quantum veneer. Two major issues:
1. Scale: They're simulating just 13 qubits with QuTiP and making grand claims about quantum thermodynamics. The computational cost they're glossing over is astronomical: the state space grows exponentially with qubit count, which is why exact simulations stall out at a dozen or two qubits (see the rough numbers sketched after this list). Anyone who's actually worked with quantum systems knows you can't just handwave away the scaling problems.
2. Measurement Problem: Their whole argument about instantaneous vs time-averaged measurements is just repackaging the quantum measurement problem without actually solving anything. They're doing the same philosophical shell game that every "breakthrough" quantum paper does by moving around where they put the observer and pretending they've discovered something profound.
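A rough back-of-the-envelope illustration of the scaling point in item 1 (plain Python; the qubit counts other than 13 are just for comparison):

```python
# Rough memory estimate for simulating N qubits exactly.
# A pure state needs 2**N complex amplitudes; a dense density matrix needs (2**N)**2.
# complex128 entries take 16 bytes each.

def memory_gib(n_qubits: int, density_matrix: bool = True) -> float:
    dim = 2 ** n_qubits
    entries = dim * dim if density_matrix else dim
    return entries * 16 / 2**30  # bytes -> GiB

for n in (13, 20, 30, 40):
    print(f"{n:2d} qubits: state vector ~{memory_gib(n, False):.2e} GiB, "
          f"density matrix ~{memory_gib(n, True):.2e} GiB")
```

At 13 qubits a dense density matrix is already about 1 GiB; at 30 qubits it's on the order of 10^10 GiB. That wall is why exact simulations stop where they do.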
1. The main underpinning of this article is the analytical theory they develop independently of their simulation. The fact that it explains a few qubits well is exactly why this is interesting. If you were to scale up their model, a spin-1/2 Ising model, you would effectively get a classical magnet, which is obviously well described by classical thermodynamics. It's in the limit of small systems that quantum mechanics makes thermodynamics tricky.
2. Their time averaging is there to remove fluctuations in the state, not to avoid the measurement problem. They're looking at time averages of the density matrix, which still yield a quantum object that will collapse upon measurement. And as their mathematical model points out, this holds for arbitrary time-averaging windows; the bounds just change accordingly, since shorter averaging windows allow for larger fluctuations. There's nothing being swept under the rug here. (A toy version of both points is sketched below.)
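A minimal sketch of what those two points look like in QuTiP (not the authors' code; the chain length, couplings, initial state, and averaging window here are made up for illustration):

```python
import numpy as np
import qutip as qt

# Toy transverse-field Ising chain: H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i
N, J, h = 4, 1.0, 0.5

def embed(op, site):
    """Place a single-qubit operator at `site` in an N-qubit tensor product."""
    ops = [qt.qeye(2)] * N
    ops[site] = op
    return qt.tensor(ops)

H = -J * sum(embed(qt.sigmaz(), i) * embed(qt.sigmaz(), i + 1) for i in range(N - 1))
H += -h * sum(embed(qt.sigmax(), i) for i in range(N))

# Isolated system: start from all spins up and evolve unitarily.
psi0 = qt.tensor([qt.basis(2, 0)] * N)
times = np.linspace(0, 20, 400)
states = qt.sesolve(H, psi0, times).states

# Time average of the density matrix over the window: the fast fluctuations
# are washed out, but the result is still a valid quantum state
# (Hermitian, unit trace), not a classical object.
rho_bar = sum(qt.ket2dm(s) for s in states) / len(states)
print(rho_bar.tr(), rho_bar.isherm)  # ~1.0, True
```

The averaged rho_bar is still a density matrix; averaging over a window changes which fluctuations survive, not the quantum nature of the object you'd be measuring.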
As long as the system is isolated, its state is a superposition of all possible states and evolves deterministically, with the amplitude of each of these "sub-states" evolving deterministically as well. If you want to perform a measurement, you choose a possible decomposition of the superposition state and measure along that axis, and you'll get one of the values along that axis, with a probability given by the squared modulus of the (complex) amplitude of that value.
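A minimal numerical illustration of that last sentence (the state and the two measurement axes here are arbitrary choices):

```python
import numpy as np

# An arbitrary single-qubit superposition a|0> + b|1>, normalized.
state = np.array([1 + 1j, 2 - 1j], dtype=complex)
state /= np.linalg.norm(state)

# Measuring along the z axis (computational basis): the probability of each
# outcome is the squared modulus of its amplitude.
probs_z = np.abs(state) ** 2

# Measuring along the x axis means decomposing the same state in the |+>, |->
# basis first, then applying the same rule to those amplitudes.
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # rows are <+| and <-|
probs_x = np.abs(x_basis.conj() @ state) ** 2

print(probs_z, probs_z.sum())  # sums to 1
print(probs_x, probs_x.sum())  # also sums to 1
```

The outcomes and their probabilities only become definite once you pick the decomposition, which is the choice of measurement axis described above.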
> Even for isolated systems where, in principle, we could have zero surprisal and access to all possible information...
This makes no sense. How could you have access to "all possible information" in an isolated system? You obviously can't make any measurements, and if the system is prepared, then it's entangled with the system used to prepare it and again cannot be isolated. The whole notion of "an isolated system" is a theoretical fiction that doesn't actually exist in physical reality, but even in theory one cannot access all of the information in an isolated system because of the no-cloning theorem. So this really feels to me like the old joke about spherical chickens.
Furthermore, this seems like an already-solved problem. Constructing classical reality requires copying classical information, and the only way to make that happen is to discard quantum information [1]. That is the source of the Second Law and the arrow of time [2].
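For reference, the no-cloning theorem invoked above is a short linearity argument (standard textbook material, nothing specific to this paper):

```latex
% Suppose some unitary U could copy an arbitrary unknown state into a blank register |e>:
U\bigl(|\psi\rangle \otimes |e\rangle\bigr) = |\psi\rangle \otimes |\psi\rangle,
\qquad
U\bigl(|\phi\rangle \otimes |e\rangle\bigr) = |\phi\rangle \otimes |\phi\rangle.
% Taking the inner product of the two equations and using U^\dagger U = I gives
\langle\psi|\phi\rangle = \langle\psi|\phi\rangle^{2}
\quad\Longrightarrow\quad
\langle\psi|\phi\rangle \in \{0, 1\}.
% Only orthogonal or identical states can be copied, so an arbitrary unknown
% state cannot be duplicated and hence cannot be fully read out.
```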
As you know, all possible information for an isolated system is obtained via solutions to the Schrödinger equation. This is standard many-body physics.
Well, yeah, but that seems like a vacuous observation to me. In order to find solutions to the SE you have to know the initial conditions. How are you going to obtain those for an isolated system? You haven't solved the problem, you have just pushed it backwards in time.
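To make that concrete: for a time-independent Hamiltonian the Schrödinger equation is an initial-value problem, with formal solution

```latex
i\hbar\,\partial_t |\psi(t)\rangle = H\,|\psi(t)\rangle
\quad\Longrightarrow\quad
|\psi(t)\rangle = e^{-iHt/\hbar}\,|\psi(0)\rangle .
```

Everything you can compute downstream is parameterized by the initial state, which is exactly the piece of data the objection above says you can't obtain for a truly isolated system.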
> This implies that for macroscopic systems, the expected time one would be required to wait to observe such a decrease in entropy occurring is unobservably large.
Yeah, but we have virtual particles and the Casimir effect. Am I wrong, or aren't these perturbations evidencing themselves on a macroscopic scale?
"Perturbation" can mean either analogical reasoning (the thing in question resembles something simpler it could be obtained from by a small change) or an actual physical perturbation (the effect of Venus on the orbit of the Moon). Virtual particles are perturbations in the former sense, while quantum fluctuations are small perturbations in the latter sense.
The comments on HN are always full of ostensibly deep quips or questions, but the work to connect them to scientific or philosophical questions is absent. That's the part that doesn't scale.
So true, tale as old as time. Someone "raises doubts" based on partial knowledge of the subject, they go back and forth with someone else, and then finally someone comes in with a conversation-killing "what is consciousness anyway?" type comment.