Poincaré Recurrence Theorem (wikipedia.org)
57 points by duck 5 months ago | hide | past | favorite | 30 comments



A measure-theoretic version of this theorem was used by H. Furstenberg in 1977 to give an elegant proof of Szemerédi's theorem: if you have a set of integers of positive density, then it contains finite arithmetic progressions of all sizes. This is an example of Ramsey's principle [size => structure] of inevitable structure. The original proof by Szemerédi was combinatorial, and Furstenberg was among the first to use ergodic-theoretic methods to get theorems in arithmetic.


Hillel Furstenberg is fond of arithmetic progressions. He used them in his topological proof that there are infinitely many primes.

https://en.wikipedia.org/wiki/Furstenberg%27s_proof_of_the_i...


This seems at odds with the 2nd law of thermodynamics. I suspect this dissonance is a clue that I don't know what "dynamical system" means in this context, and that the physical world is not in fact a dynamical system.


The theorem holds not for arbitrary dynamical systems but under two conditions: 1) the flow preserves volume, and 2) all orbits are bounded.

The second law of thermodynamics is in a sense a statement about the evolution of probability distributions. As time goes on, the dynamical system mixes any probability distribution such that entropy increases. Poincaré recurrence is a part of this mixing phenomenon.


Your first requirement is part of the definition of a dynamical system. It is also satisfied for the evolution of the phase-space for any system in classical mechanics.

The second requirement is usually the case for systems with finite energy.

So as far as classical mechanics is concerned the Poincaré Recurrence Theorem pretty much always applies.


Dynamical systems definitely don't have to preserve volume!


Then your definition must be different from mine.

Which is possible. I mostly encountered them in the context of Ergodic Theory, in which case a preserved measure is very much non-optional.


There is a tension between them. If you take the second law as 'rigorous', entropy must always increase. But Poincaré recurrence says you will inevitably return to a lower-entropy state, which of course is a violation.

One way around this is to relax the definition of the second law to something like "almost always": the set of states with lower entropy is incredibly unlikely. And when you compute the recurrence time, it's astoundingly large; like trillions upon trillions of ages of the universe.
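To get a feel for "trillions upon trillions of ages of the universe", here is a toy back-of-the-envelope estimate. All the numbers are illustrative assumptions, not derived from any real system: a system of 200 two-state particles, one microstate visited per femtosecond, and a rough figure for the age of the universe.

```python
# Toy estimate of a Poincare recurrence time.  Every number here is an
# illustrative assumption: 200 two-state particles give ~2**200
# microstates, and we pretend the system visits one microstate per
# femtosecond.

n_particles = 200
n_states = 2 ** n_particles          # size of the discrete state space
seconds_per_step = 1e-15             # one femtosecond per state visited
seconds_per_year = 3.15e7

recurrence_years = n_states * seconds_per_step / seconds_per_year
age_of_universe_years = 1.38e10      # rough figure

print(f"recurrence time  ~ {recurrence_years:.2e} years")
print(f"ages of universe ~ {recurrence_years / age_of_universe_years:.2e}")
```

Even with these modest toy numbers, the recurrence time dwarfs the age of the universe by dozens of orders of magnitude, and real macroscopic systems have vastly more than 200 degrees of freedom.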

Cosmologically, there is some escape in that the expansion of the universe breaks the assumption of finite accessible phase space, but there is an underlying difficulty of what it means for the universe to be truly infinite or divergent in size. Not just really big, but truly infinite: where did it all come from?

My personal take is that the second law is about "macroscopic" experiments. You are given a box and you can only do things like push the side of box or put it on a hot stove or in a magnetic field or whatever, and the laws are about what you can or cannot do with those operations. The microscopic system might decide to do something incredibly unlikely, but you can't control or even anticipate it, and that is just the extremely long tail of the probability distribution that is a thermodynamic state.

I don't claim my view is rigorous; people like Boltzmann and Ehrenfest were much smarter than me and struggled with what irreversibility means, and you can kind of go crazy worrying about this (likewise with the 'measurement problem' of quantum mechanics). In the end, whether a situation can be mapped to the exact axioms of a model like QM or thermodynamics is very tricky and perhaps unknowable.


Your personal one is the correct one from my perspective (as someone that knows a lot of math and is familiar with statistical mechanics as math and not physics). The ODEs that we get from classical mechanics are typically reversible: we can write down an ODE that does the same thing but backwards.

You cannot do that for the PDEs that arise in statistical mechanics and the result is the second law. These PDEs arise from approximating many copies of deterministic systems as continuous distributions of states. Entropy is not a concept that makes sense when discussing single trajectories of systems — only the macroscopic view of many copies of that system evolving according to the same dynamics.


I wrote about this at https://r6.ca/blog/20100802T005835Z.html.

Basically, entropy is a function of descriptions of states rather than states themselves. So when you talk about a jar of gas having a certain volume and pressure, there is a set of states that satisfy that description, and the logarithm of the volume of that set of states (in phase space) is the entropy of that description.

Liouville's theorem says that the volume of a subset of phase space is preserved under time evolution of the system. So if you want to have a machine that reliably transforms one description into another description, then the volume of the latter description must be at least as large as that of the initial description. That is the second law of thermodynamics.
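Liouville-style volume preservation is easy to check numerically on a toy system. A minimal sketch (my own illustration, not from the post): symplectic Euler on a unit harmonic oscillator is a linear map per time step, and its Jacobian determinant is exactly 1, so any phase-space area it acts on is preserved for all time. The step size and iteration count are arbitrary choices.

```python
# Numerical illustration (a toy check, not a proof) of phase-space
# volume preservation: symplectic Euler on a unit harmonic oscillator.
# The per-step update is p' = p - dt*q ; q' = q + dt*p', which as a
# matrix on (q, p) has determinant exactly 1.

dt = 0.1

# q_new = (1 - dt^2) q + dt p ;  p_new = -dt q + p
step = [[1 - dt * dt, dt],
        [-dt, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Compose 10,000 steps and check the accumulated determinant.
total = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(10_000):
    total = matmul(step, total)

print(det(total))  # stays ~1: the map preserves phase-space area
```

The same check fails for a dissipative scheme (e.g. adding a friction term makes the determinant shrink below 1 each step), which is exactly the distinction drawn later in this thread between Hamiltonian systems and the Lorenz attractor.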

(Things are a bit more complicated than I state above because you have to work with differentials, but the basic idea is there.)

The Poincaré recurrence theorem says individual states eventually almost reoccur, but this doesn't apply to a volume of states. The volume of states usually gets smeared out throughout phase space.

Like if you start the Lorenz equations on a square of points instead of an individual point, it gets smeared out over the entire Lorenz attractor. And while individual points in that square nearly reoccur, the entire square does not. And thus you cannot build a machine to reliably reduce entropy, because you cannot predict when your recurrence will occur (even if you can compute an upper bound on how long it will take), since you never know precisely the initial conditions to begin with (you start somewhere in that initial square).


Except the recurrence theorem applies to subsets of the space, not just elements. Your "doesn’t apply to a volume of states" isn't really an exception, apart from technicalities about subsets of measure zero.


Do you have a reference on this?

I don't think the space of subsets is going to be bounded, which means the Poincaré recurrence theorem won't apply to it.

Couple that with the fact that starting our Lorenz equations on a box-shaped subset will cause that box to stretch and distort, asymptotically approaching the Lorenz attractor. This implies that it can never become box-shaped again. The set just gets closer and closer to the shape of the attractor as it evolves.

Otherwise it wouldn't really be an attractor now would it?

P.S. If you take a discrete subset, like pixels on the screen, then I agree those sets of pixels will reoccur, making it look like the box has reappeared. But in reality that is just no longer a representative sample of the twisted form of the actual set wrapped around the attractor.


I'm not sure exactly what you mean by "space of subsets".

All I was trying to point out was that the argument that proves the recurrence theorem itself uses a volume of space around the initial state, and how "preimages" of that volume work. So it applies to volumes of states, too.

The Lorenz attractor generally avoids the recurrence because its dynamics are dissipative: nothing drives points near the attractor to points far from the attractor.

But once you are on the attractor, you can't just stay on the attractor forever getting "smeared out" without recurrence: you can only get at most smeared out over the finite area of the attractor, and eventually the smearing reaches your initial location on the attractor again.


You are correct about the Lorenz attractor being dissipative. I just assumed the chaos of the Lorenz attractor and, say a frictionless double pendulum, were the same phenomenon. However it seems they are, in fact, quite different.

Even still

> So it applies to volumes of states, too.

I'm pretty sure this is false. As I mention in another comment, http://philsci-archive.pitt.edu/9838/1/recurrence.pdf

says

"In [the case of the space of probability distributions over phase space], the reason that classical dynamics fails to abide by the linear recurrence theorem is that the distribution space is infinite-dimensional, even if phase space has finite volume: distributions can have structure on arbitrarily short scales.

Yes the proof of the recurrence theorem uses volumes; but the result still only applies to individual points within the volume (or maybe sets of measure 0 at best).


Replying to myself:

Apparently strange attractors can only occur in dampened systems: https://physics.stackexchange.com/questions/125931/does-a-si...

So my analogy between the Lorenz attractor and phase-space chaos (like that of a double pendulum) is false.

However, I think my point is still true:

http://philsci-archive.pitt.edu/9838/1/recurrence.pdf

says

"In [the case of the space of probability distributions over phase space], the reason that classical dynamics fails to abide by the linear recurrence theorem is that the distribution space is infinite-dimensional, even if phase space has finite volume: distributions can have structure on arbitrarily short scales.


The simple intuition behind Poincaré recurrence is this.

- Imagine a closed box, and this box has 100 particles.

- The particles in the box can be arranged in all sorts of ways due to whatever forces (gravity, Brownian motion, etc.).

- They (the particles) can be clustered at the centre of the box, clustered at any of the corners of the box, or spread around the box in all sorts of ways.

- When a system, i.e. this box, changes the way its particles are arranged, it is referred to as dynamical.

- And all the different ways the particles can arrange themselves are referred to as "states".

Now, is the world a "dynamical system"? No.

Dynamical systems are just an instrument of abstraction used to think about such systems.

Seeing 3 oranges doesn't mean that the group of oranges is made up of the number 3 in reality.

Just like how modeling planets with infinitesimal calculus doesn't mean the planets work at the behest of calculus.


The 2nd law doesn't say that a decrease in entropy is not possible; it just says it's extraordinarily unlikely. But given sufficient time, extraordinarily unlikely things will happen.


Entropy in Statistical Mechanics is a quantity associated with a probability distribution over states of the system. In classical mechanics, this is a probability distribution over the phase space of the system.

Two probability distributions with different entropy can both assign finite probability density to the same state, so an increase in entropy does not preclude the possibility of the system returning to its initial state.

A great deal of confusion about entropy arises from imagining it as a function of the microstate of a system (in classical mechanics, a point in phase space) when it is actually a function of a probability distribution over possible states of a system.
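The point that entropy belongs to the distribution, not the microstate, can be made concrete in a few lines. A minimal sketch (my own illustration; the four "microstates" and the probability values are arbitrary choices): two distributions over the same states, both assigning positive probability to state 0, yet with very different entropies.

```python
import math

def shannon_entropy(p):
    """Entropy (in nats) of a discrete probability distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two distributions over the same four microstates.  Both assign
# positive probability to microstate 0, yet their entropies differ:
# entropy is a property of the distribution, not of any single state.
uniform = [0.25, 0.25, 0.25, 0.25]
peaked  = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # log(4) ~ 1.386
print(shannon_entropy(peaked))   # much smaller
```

So "the system is in microstate 0" is compatible with both a high-entropy and a low-entropy description, which is exactly why recurrence of a microstate does not contradict an entropy increase of the distribution.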

A further wrinkle: Liouville's Theorem [0] shows that evolution under classical mechanics is _entropy preserving_ (because the evolution preserves local phase space density, and entropy is a function of this density). An analogous result applies to quantum mechanics. However, a simple probability distribution parametrized by a few macroscopic parameters rapidly becomes very complex as it evolves in time. When we imagine the entropy of an isolated classical system increasing over time, the meaning is that if we want to model the (very complicated) evolved probability distribution with a simple probability distribution (describable in terms of a few macroscopic parameters), the simple distribution must have entropy greater than or equal to the complex evolved distribution, which is equal to the original entropy before evolution.

It's difficult to reconcile the idea that entropy is a function of a probability distribution (not a function of a system's microstate) with the idea that Thermodynamical entropy is an experimentally measurable (kind of...) property of a system. Jaynes' "The Evolution of Carnot's Principle" [1] is the clearest description I've seen of the relationship between Thermodynamic entropy and Statistical Mechanical/Information Theoretical entropy. Many of Jaynes' other papers [2] on this topic are also illuminating.

[0] https://en.wikipedia.org/wiki/Liouville's_theorem_(Hamiltoni...

[1] https://bayes.wustl.edu/etj/articles/ccarnot.pdf

[2] https://bayes.wustl.edu/etj/node1.html


Not necessarily. I suspect the Poincaré Recurrence Theorem provides us with enormous durations, like 10^(number of dynamical variables), with the unit of time omitted because of utter irrelevance, or something even more horrific. It may be the case that the 2nd law of thermodynamics does not hold on such time scales, but how could we ever know?


It doesn't really provide you with durations.

The handwavy proof goes as follows (the proof of this theorem is extremely simple in measure-theory language; it's actually a quite common exam question in bachelor's-level measure theory courses):

You take the set of items that you think never return to their original position. Call this set X. Now take the infinite sequence of pre-images of this set (i.e. look at the items that map onto some part of X after one "evolution" of the volume-preserving flow, then look at the items that map onto some part of X after 2 flows, etc.), over all natural numbers.

Note that a key part of the measure-theoretic version is that you can assign a finite "volume" to the overall system and to some subsets of the system.

This sequence (of pre-images under the volume-preserving flow) has items that are pairwise disjoint (i.e. an item that flows onto X after 1 evolution cannot also be an item that flows onto X after n evolutions). This is obvious from the definition of X (as otherwise that item would recur).

Now here is the tricky part (but a basic result in measure theory): the sum of the "volumes" of these sets (of pre-images) is equal to the "volume" of the set Y, where Y is the union of all the pre-image sets. Because this big set Y is some subset of your system, clearly it has finite "volume".

This means the "volume" of your non-recurrent points has to be 0. Why? You assume the flow is volume preserving, which means that all the terms in your pre-image sequence have the same volume. Thus each volume has to be 0 (as you have an infinite sequence summing to a finite number).
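The argument above gives no bound at all, but on a fully discrete toy system recurrence is easy to see directly: a volume-preserving map on a finite grid is a bijection (a permutation of the grid points), so every point's orbit is a cycle and must return. A minimal sketch using Arnold's cat map reduced mod N (a standard example; the grid size and starting point are arbitrary choices), a discrete analogue rather than the continuous theorem itself:

```python
# A discrete toy analogue of Poincare recurrence: Arnold's cat map
# reduced mod N is a bijection of the N x N grid (its matrix
# [[2,1],[1,1]] has determinant 1), so every point must eventually
# return to where it started, in at most N*N steps.

N = 5  # grid size (illustrative choice)

def cat_map(x, y):
    # (x, y) -> (2x + y, x + y) mod N
    return (2 * x + y) % N, (x + y) % N

def recurrence_time(x0, y0):
    """Number of steps until (x0, y0) first returns to itself."""
    x, y = cat_map(x0, y0)
    t = 1
    while (x, y) != (x0, y0):
        x, y = cat_map(x, y)
        t += 1
    return t

t = recurrence_time(1, 1)
print(t)  # some t <= N*N, guaranteed because the map is a bijection
```

The loop is guaranteed to terminate precisely because the map is invertible: if the orbit first revisited some point other than its start, that point would have two distinct pre-images, contradicting invertibility.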


In practice, for quantum mechanical systems, you get astronomical durations once the systems get even slightly complicated. It's possible to estimate and bound the recurrence time using the distribution of energy levels.

Even the simplest chemical systems, like a single 10-atom molecule in vacuum, can have recurrence times of years. A glass of water would eclipse the age of the universe, etc.


>> return to a state arbitrarily close to (for continuous state systems)

I'm glad it qualifies it with this. The Lorenz attractor seems like it will never return to the same state since trajectories seem to fold in between others, but it will pass arbitrarily close to a point again.

https://en.wikipedia.org/wiki/Lorenz_system
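This "arbitrarily close, never exactly equal" behaviour can be sketched numerically. The following is my own rough illustration, not a verification of recurrence: it integrates the Lorenz system with the classic parameters (sigma=10, rho=28, beta=8/3) using RK4, checks that the orbit stays in a bounded region, and measures how close the final state comes to states it visited much earlier. The step size, step counts, and time-separation window are all arbitrary choices.

```python
# Rough numerical sketch (not a proof): integrate the Lorenz system
# with the classic parameters, check that the orbit stays bounded,
# and see how close the final state comes to much-earlier states.

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(s, dt=0.01):
    def add(a, b, h):
        return tuple(ai + h * bi for ai, bi in zip(a, b))
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(add(s, k1, dt / 2))
    k3 = lorenz_rhs(add(s, k2, dt / 2))
    k4 = lorenz_rhs(add(s, k3, dt))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

s = (1.0, 1.0, 1.0)
for _ in range(1000):          # discard a transient onto the attractor
    s = rk4_step(s)

traj = [s]
for _ in range(4000):
    traj.append(rk4_step(traj[-1]))

def dist(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

bound = max(max(abs(c) for c in p) for p in traj)
# Closest approach of the final state to any much-earlier state:
closest = min(dist(traj[-1], p) for p in traj[:-500])
print(f"max coordinate ~ {bound:.1f}, closest revisit ~ {closest:.2f}")
```

The orbit never leaves a modest box, and the final state passes near states visited long before, even though it never hits any of them exactly.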


Also, Nietzsche's eternal recurrence, written in 1888, although published much later. While its physical basis is similar, its relevance in his context is on how to evaluate and live your life.




Well, if Hacker News is a dynamical system, and discussion of the Poincaré Recurrence Theorem is one of its states, it was bound to come up again eventually.


That's funny, I was just about to write the same comment!


Doesn't that only happen for dimensionality < 3? See also random walks in higher dimensions.


You may be thinking of Polya's result, which lets the walker go anywhere on the grid, so there are more possible states than Poincaré allows. For a discrete system, Poincaré assumes (roughly) that the space of possibilities is finite.
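The dimension dependence Pólya found can be illustrated exactly over a small horizon, without simulation. A sketch (my own illustration, not a proof of Pólya's theorem; the horizon of 20 steps is an arbitrary choice): evolve the walk's exact probability distribution and sum the probability of being at the origin over time, i.e. the expected number of visits to the origin. In 1D this sum keeps growing with the horizon; in 3D it converges, which is the hallmark of transience.

```python
# Exact small-horizon illustration of Polya's dimension dependence:
# evolve a simple symmetric random walk's probability distribution on
# the integer lattice and sum the probability mass at the origin over
# `steps` steps (= expected number of visits to the origin).

from collections import defaultdict

def origin_visits(dim, steps):
    """Expected number of visits to the origin within `steps` steps."""
    origin = tuple([0] * dim)
    # Unit moves along each axis, in both directions.
    moves = [tuple(d * (1 if i == axis else 0) for i in range(dim))
             for axis in range(dim) for d in (1, -1)]
    prob = {origin: 1.0}
    total = 1.0  # the walk starts at the origin
    for _ in range(steps):
        new = defaultdict(float)
        for pos, p in prob.items():
            for m in moves:
                new[tuple(a + b for a, b in zip(pos, m))] += p / len(moves)
        prob = new
        total += prob.get(origin, 0.0)
    return total

v1 = origin_visits(1, 20)
v3 = origin_visits(3, 20)
print(v1, v3)  # the 1D walk revisits the origin far more often
```

In 1D the return probability at step 2n decays like 1/sqrt(n), so the expected-visit sum diverges (recurrence); in 3D it decays like 1/n^(3/2), so the sum converges (transience), consistent with the finite-state intuition in the comment above.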


Poincaré recurrence is not about random walks though.



