I went to a talk of his on CCC ages ago, and it was such a fascinating combination of geometry, causality, and asymptotics. I have absolutely no clue whether it's reasonable physically, but independent of that, it's just a really elegant fusion of topics in a fun to think about way. Worth a read for anyone who just appreciates elegant new ways of combining mathematical structures.
I've also seen this talk, at the behest of some spaced out friends of mine, an amazing experience and I still think about the universe through the lens of that talk!
My understanding of this idea is that once the universe reaches a state of maximum entropy (this is the “heat death” of the universe, where everything is a uniform, undifferentiated cloud of photons), time stops being meaningful because there can be no change from moment to moment. In a sense, time _is_ the change from low to high entropy - if you don’t have any entropy gradient, you can’t have any time either.
I've always rejected the idea that time is entropy change.
First, in many local processes entropy moves from high to low (e.g. life). Nobody says that time is moving backwards for living things. It only increases if you consider the system it is embedded in as well. So this idea that entropy is time is something that only applies to the entire universe?
It's true that we don't see eggs unbreaking, or broken coffee cups flying off the floor and reassembling. This increase in entropy seems to give an "arrow" of time, but to my mind this view (ironically) confuses cause with effect.
If you have any causal system (cause preceding effects) then you will always see this type of entropic increase, by simple statistics. There are just many, many more ways for things to be scrambled and high entropy than ordered and low entropy.
So yes, entropy does tend to increase over time, but that's an effect of being in a causal system, not the system itself. At least, that's my view.
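The "simple statistics" point can be made concrete with a toy model (my own illustration, not anything from the talk): take N two-state particles, like coins. A macrostate is how many are in state 1, and its entropy grows with the number of microstates (distinct arrangements) that realize it. The mixed macrostate dwarfs the ordered one:

```python
from math import comb

# Toy model: N two-state particles. Count the microstates
# (arrangements) behind an ordered vs. a maximally mixed macrostate.
N = 100
ordered = comb(N, 0)        # exactly one way to be perfectly ordered
scrambled = comb(N, N // 2) # ways to be half-and-half (maximally mixed)

print(ordered)    # 1
print(scrambled)  # ~1.0e29 distinct arrangements
```

With only 100 particles there are already ~10^29 more scrambled arrangements than ordered ones, so a causal system wandering through its states overwhelmingly drifts toward the scrambled side.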
Could you expand on your comment that life has entropy moving from high to low? Doesn't aging increase the entropy in our biological system? I have always thought that we are at our most structured in the early phases of conception with entropy increasing constantly as we age.
Life is essentially a process of creating order (lower entropy), building complex cells and so on using energy and matter from its environment.
Perfectly true that entropy gets us in the end as we age, as the system breaks down and cannot sustain itself any longer. Although if we could fix those systems, there's no reason in principle we couldn't halt aging entirely.
I took it as capital-L Life moving from high to low. As evolution continues, Life seems to evolve from higher-entropy toward lower-entropy, more-ordered organisms (since more complex organisms depend on the systems created by the simpler organisms that preceded them).
I am slightly blending the concept of entropy and complexity. But "ordered complexity" is how I imagine it.
I don’t think entropy ever moves from high to low overall; it only ever distills some local low out of a higher-entropy area, and in doing so, the overall entropy increases.
It works a bit like air conditioning: yeah, you can make one room cold, but only by making more heat outside the room. The overall temperature of the system increases.
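The air-conditioning analogy can be checked with a quick entropy budget. The numbers below are my own assumed values (room and outdoor temperatures, a typical coefficient of performance), just to show the bookkeeping: the room's entropy drops, but the outside gains more than the room loses.

```python
# Hedged sketch with assumed, but realistic, numbers: an air conditioner
# removes heat Q_c from a room at temperature T_c, consumes work W, and
# dumps Q_h = Q_c + W outside at T_h. Entropy change is Q/T for each side.
T_c, T_h = 290.0, 305.0   # kelvin: cool room, warm outdoors (assumed)
Q_c = 1000.0              # joules removed from the room
COP = 3.0                 # typical real-world coefficient of performance
W = Q_c / COP             # electrical work consumed
Q_h = Q_c + W             # heat rejected outdoors

dS_room = -Q_c / T_c      # the room's entropy decreases
dS_outside = Q_h / T_h    # the outdoors' entropy increases by more
dS_total = dS_room + dS_outside
print(round(dS_total, 3))  # positive: ~0.923 J/K
```

The total comes out positive for any real machine; only a perfectly reversible one at the Carnot limit would break even.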
This sounds sort of like the "if a tree falls in a forest and no one hears it, did it make a sound?" question.
If time passes and there's no observable difference, did it pass? I guess it makes no meaningful difference, but it's not really answering the underlying question of whether some variable is advancing or not.
If nobody logs in to a multiplayer game, does the game world still exist?
Sure there are files sitting on a server somewhere waiting to be read when the first user logs in, there may even be a physics engine polling abstract data structures for updates, but the game world doesn't render without players present with their computers bringing all this data into a coherent structure.
Also, for an extra existential kick, realize that it renders /independently/ in the GPU/CPU/RAM of each player's computer.
I remember the book "Now - Physics of Time" by Richard Muller (a Berkeley physics professor) touching on the subject of entropy linked to time, but I never got to finish the book and sadly I can't provide more insight.
And potentially leads to things like Boltzmann Brains, given enough time! Quantum fluctuations can still create wildly improbable things, even if only briefly.
If everything is massless, everything travels at the speed of light, and nothing experiences any time (photons travel null geodesics with zero spacetime interval).
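The "zero spacetime interval" claim can be written out explicitly. For a light ray in flat spacetime (a sketch in Minkowski coordinates, one spatial dimension for simplicity):

```latex
ds^2 = -c^2\,dt^2 + dx^2 = 0
\quad\Longrightarrow\quad
\frac{dx}{dt} = \pm c,
\qquad
d\tau = \frac{\sqrt{-ds^2}}{c} = 0
```

Since proper time $\tau$ is what a clock carried along the path would measure, a null path accumulates none of it: a photon "experiences" no time between emission and absorption.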
This is required to make Penrose's end state Conformal, i.e. scale invariant, so that it can arbitrarily Cycle down to a small scale and seed a new Big Bang: Conformal Cyclic Cosmology (CCC).