Just because the math is pretty doesn't mean the universe actually works that way. Maybe it does, maybe it doesn't. The universe sure doesn't seem to care much about our math.
I think the authors of the paper are being perfectly straightforward (e.g., they explicitly include the phrase "in a certain very limited sense" RE: time proceeding in discrete time steps), but the title of the article itself feels a bit misleading.
Taken to the extreme, the best description of the world is a long list of all events and facts. No beauty in that. Now science tries to compress this list of events to a simple model that explains it best. Like in statistics, one should aim for the simplest model predicting most of the data.
I'd argue that if you think of science as an attempt to compress all the data in the world, then OP's statement is obvious, and at the same time there's good reason to value beautiful (i.e. short & simple) explanations higher provided they generalize well.
Well, some mathematical truths are so universal even the universe can't escape them. The incompleteness theorems come to mind.
Regarding the article, I think most physicists consider the continuous models they use to attack the problems at hand a hack to make the mathematics consumable. I don't think they consider the universe itself to be non-discrete.
Interesting thought ... is mathematics really orthogonal to the universe or does it "result from" or fall out as a result of the universe within which it was described?
Getting philosophical, I know, but if you follow, then perhaps the degree to which you don't have to twist mathematics to describe the universe might suggest you are in fact on the right path....
Getting more complicated doesn't mean they're right.
Movement and measure
How to discern life through time
Everything is change
Time and the inherent math of the universe is amazing to think upon, especially when you look back upon the wisdom of ancient humans who were able to perceive cycles of the stars that were many thousands of years long whilst human conscious life is so fleetingly short.
It's interesting that we do not try to ask HOW past civilizations could have been aware of cycles that last for tens of thousands of years.
And now humans are so myopic on what is in front of them, we hardly look up.
That’s probably because they weren’t. What “cycles of the stars that were many thousands of years long“ were ancient peoples aware of?
Noting a long-term cycle of equinox precession is impressive but also not terribly surprising. If you can note the change, you can reasonably assume it's cyclical (because the alternative is that it's going to suddenly stop moving at some point in the future).
Their calendrical system did not attempt to correct for leap years, but we do know that they knew the length of the year to roughly the same precision as the Gregorian approximation.
It is tiring how people limit their minds and creativity by thinking that science and reflection are not, intrinsically, the same thing.
Is this true? I didn't think there was any kind of consensus on this topic. I also thought some scientists believe the Planck time (and length) is the smallest possible unit of time, i.e. they would say time is discrete.
A bigger way to think about it: is the universe like a movie where each still image lasts a Planck time (or some other smaller chunk)? Is the entire state of the universe encoded in stasis for that instant, and then boom ... it's time for the next "Planck frame"?
It's really hard to reconcile that kind of universal time chunkiness with the ordinary time dilation effects that can be observed at relativistic energies. Whose frame of reference counts? We do know that space-time can be warped ... it's harder to imagine that this kind of warping and curving is discrete. That's one reason why many believe it's a smooth flow.
To say that time is "really" continuous under such circumstances seems like a weird philosophical claim.
And inversely, science is not just measuring -- that's the naive positivist (and later Popperian) view of what scientists do.
As Thomas Kuhn (1961) argues, scientific theories are usually accepted long before quantitative methods for testing them become available. The reliability of newly introduced measurement methods is typically tested against the predictions of the theory rather than the other way around. In Kuhn’s words, “The road from scientific law to scientific measurement can rarely be traveled in the reverse direction” (1961: 189). For example, Dalton’s Law, which states that the weights of elements in a chemical compound are related to each other in whole-number proportions, initially conflicted with some of the best known measurements of such proportions. It is only by assuming Dalton’s Law that subsequent experimental chemists were able to correct and improve their measurement techniques (1961: 173).
It's possible the Uncertainty Principle is wrong. One can come up with theories in which it is wrong, and with observations that might hypothetically confirm or refute them. As far as I know, there is no serious science doubting the HUP at present. (Which doesn't prove it's right; it's just the current state of things.)
But simply speculating about what things might exist that current scientific understanding says scientific laws would prevent us from observing or measuring or seeing any effects of whatsoever -- is still not science. It's not that you can't have a scientific theory which current measurements seem to refute and still press for it (your Dalton's law example). It's that you can't have a scientific theory with no even theoretical way to refute or confirm it by observation. That's not a scientific theory.
Ok, enlighten me. I've read philosophical works where that certainly seemed to be the case. There's also this weird thing where only a specific lineage of schools of thought and certain ancient Western philosophers are "real philosophy" and others (e.g. anyone with a non-Anglo last name) are "not real philosophers". No True Scotsman at its finest.
>And inversely, science is not just measuring -- that's the naive positivist (and later Popperian) view of what scientists do.
In graduate school, I worked at a lab in Stanford (I didn't attend there, but was fortunate to have spent some time there). Your statement would certainly be news to the PI I worked for, who was a heavy Popperian. You can't falsify the existence of sub-Planck lengths or times, that's for sure. Science was absolutely nothing more than that to him. And while you can certainly claim an Appeal to Authority fallacy, Stanford, at least, considered my PI competent enough to award him a PhD in a scientific discipline, so I'm not sure how much I can agree with your post.
This is not to suggest that we should not inquire on these topics, but rather that they should fall under some other umbrella besides Science.
 Or whatever is left of it at this point.
It's that science itself as a process is not Popperian (and of course modern epistemology has moved on since the times of Popper).
Actually what you say is the weird philosophical claim.
It's like saying Australia didn't really exist when we couldn't observe it (or see any effects from it reach us).
The thing is, to say it could be continuous even if that's not observable might be wrong. But not because it can't be both continuous and unobservable: just because it could also be discrete and unobservable, and we can't tell which of the two it is for sure.
Whether we (or any observer imaginable) can observe its effects or not has no bearing on whether it's one or the other.
This is a straw man, and I think you know that. It's more like saying Atlantis doesn't exist, and, well, to the best of our knowledge - it doesn't. I'd be happy to re-evaluate upon receiving evidence that Atlantis may have existed, just as QM physicists should be happy to re-evaluate when sub-Planck-scale phenomena can possibly be observed using a theory which is also consistent with the rest of our observations of the world.
Oh and an even better analogy than Atlantis - it's like someone saying Australia didn't really exist when we couldn't observe it, and claiming they would prove it by sailing from Chile to Africa without encountering any large land masses.
So no, he's not making any weird philosophical claim at all.
I don't see much difference. If you're "happy to re-evaluate" (i.e. open to the possibility) then the answer (even re: Atlantis) is "I don't know if it exists", not "it doesn't exist".
More than that. It simply doesn't exist in any meaningful sense and it's a waste of time to even think about it.
What cannot be observed in principle has exactly zero impact on the universe. That's pretty much the definition of "not existing".
The irony is it's actually a philosophy (a philosophy of science among several in particular) saying the above.
It's not anything scientifically observed itself that says that when something can't be measured "it simply doesn't exist in any meaningful sense and it's a waste of time to even think about it".
EDIT: looking at this "axiom" from another angle. Imagine that I claim that atoms are really built out of Zeta faces, which are exactly like quarks but have smiley faces etched on them. I also claim that it's in principle impossible to tell the difference between a Zeta face and a quark (a hypothetical quark, after all it's all Zeta faces). They just are there. At this point you could correctly observe that since the universe would look precisely no different whether or not there are smiley faces on quarks - there's not a bit of information distinguishing these two hypotheses - my claim is absolutely meaningless and a waste of time. This generalizes to all kinds of invisible, unmeasurable magic fairies under the bed, etc.
(It's probably a well-trodden philosophical ground, but I don't know who to read to find a decent discussion of it.)
There are several logical holes in the axiom.
The first big one is that it implies logic is applicable to the universe, i.e. that the universe must abide by logical necessities (and even more so, by logical necessities the way we perceive them). Setting aside the fact that we cannot prove logic's applicability logically (it's itself an axiom), QM has given us many non-classical-logic compliant observations.
Then, there's the assumption that: "it exists <=> something would be different about the world if it didn't exist". That's again an assumption. I could just as well consider that (the fact of its existence turning from false to true aside) the world could be the same whether something exists or not.
This presupposes when it should instead prove: that existence necessitates difference in the world.
Third, "it's in principle observable (though its impact) <=> can be studied by science" -- whether something can be studied by science or not shouldn't be a criterion for its existence.
Black holes existed for billions of years before they could be "studied by science", or before science even had any understanding of them. Only in retrospect do we say that they "can be studied by science". Did they not exist when we couldn't say that, and exist now?
Or did they exist and we only lately came to find out about them, and that we can actually study them? If it's the latter, why wouldn't that -- the possibility of existence -- be the case for tons of other things (e.g. sub-Planck-length stuff)? Because our current theories say so? Or because we think our current theories are the be-all end-all we'll ever come up with?
Fourth, one could argue that something could still exist, have impact, and not be observable. E.g. if it's impossible to make it not exist, or to look at that level (so that we have no way to measure its impact). E.g. some entities or energies at the sub-Planck scale could hold the universe together and be its essential substrate. But we'd just have no way to know about it, because we can't look at that level (and our theories can't even go to that level).
>At this point you could correctly observe that since the universe would look precisely no different whether or not there are smiley faces on quarks - there's not a bit of information distinguishing these two hypotheses - my claim is absolutely meaningless and a waste of time.
The claim would be a waste of time to try to verify.
It could still very well be ontologically true -- which is what we're discussing.
That something that can't be verified in any way doesn't exist is a logical jump. What we can say for certain is that it could or could not exist, but we can't verify it.
We can't even say that it has no impact to us, because without being able to verify it, how can we be sure it has no impact? At best we can say "it has no impact as far as we can tell".
But it could still have a huge impact -- the same way the quarks you've mentioned have a huge impact, had a huge impact even before we knew they existed and before we could measure them, and would have had a huge impact even if we had remained forever at the 15th-century level of physical knowledge.
> that the universe must abide by logical necessities (and even more so, by logical necessities the way we perceive them)
This axiom can be defined better, to account for the fact that we only perceive the universe through our minds, and this opens a whole can of philosophical worms - but in practice this axiom does hold, and if it didn't, we could only pack our bags and go back to the caves, as nothing would make sense and there would be no point in thinking.
(I didn't dig too deep in that can of worms, but I vaguely suspect the justification for such axiom could be found in anthropic reasoning.)
> QM has given us many non-classical-logic compliant observations.
Well, classical logic is pretty limited; it isn't the be-all, end-all of logics, and it's not exactly what we use in daily reasoning anyway. I admit to not having much knowledge of the more complex types of logic, but the problems I recall seeing boil down to misapplying logic as an abstraction. For instance, I just stumbled upon the following example of "classical" vs. "resource-aware" logic:
"suppose you're standing at a vending machine that dispenses candy for a dollar and also dispenses coke for a dollar. You might write that as 1$⇒candy and 1$⇒coke. But then inferring that 1$⇒candy&coke is clearly wrong. The dollar gets "used up".". The post introduces resource-aware logic as a solution, but to me the problem seems to be modelling vending machines as (classical) logical implications; it's kind of dumb if you think about it.
(I'm also aware that probabilistic models are a better way to describe the observed world in practice than formal logic.)
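In case it helps, here's a minimal sketch (my own toy model, not from the quoted post) of the state-transition view: the dollar is part of the state and gets consumed, so one dollar can never yield both items, which is exactly what the naive implication reading gets wrong.

    # Toy model of the vending machine example: buying consumes the
    # dollar, so "1$ => candy" and "1$ => coke" can't be combined
    # into "1$ => candy & coke" -- the resource gets used up.
    def buy(wallet, item):
        # returns (new_wallet, item); buying costs one dollar
        if wallet < 1:
            raise ValueError("insufficient funds")
        return wallet - 1, item

    wallet = 1
    wallet, got = buy(wallet, "candy")     # ok, wallet is now 0
    print(got, wallet)                     # candy 0
    try:
        wallet, got = buy(wallet, "coke")  # the dollar is already spent
    except ValueError as err:
        print(err)                         # insufficient funds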
> Then, there's the assumption that: "it exists <=> something would be different about the world if it didn't exist". That's again an assumption. I could just as well consider that (the fact of its existence turning from false to true aside) the world could be the same whether something exists or not.
I don't see how it could be any other way. Without asserting that there is a non-zero diff between a universe with something and a universe without it, "existence" is an empty label without meaning.
> This presupposes when it should instead prove: that existence necessitates difference in the world.
"Existence" is a term we define, and again, a different definition seems meaningless.
> whether something can be studied by science or not shouldn't be a criterion for its existence.
I didn't mean that current science defines what exists or not. I meant that the existence of something has an observable impact on the universe. Observable in principle - not right now by us, but theoretically observable with a sufficient level of knowledge and technology. And if something is observable like this, then it falls into the domain of science - it can be studied, and conclusions can be drawn. My point is that the set of things that exist is equal to the set of things that are of interest to science. This is not a time- or technology-dependent claim.
> Fourth, one could argue that something could still exist, have impact, and not be observable.
How? "Have impact" is synonymous to "in principle observable".
> E.g. some entities or energies at the sub-Plank scale level could hold the universe together and be essential substrate. But we just have no way to know about it, because we can't look at that level (and our theories can't even go at that level).
I think you could only say such entities exist in a meaningful sense if their non-existence would make the universe look different. This way you could indirectly infer their existence through considering the things you've observed and not observed. If the sub-structure of the universe was completely screened away, so that whatever happens there, no bit of information reaches our universe, then I see no meaning in the question of whether or not that sub-structure, or entities inhabiting it, exists.
Again, by "observable" I at no point meant "observable by us, today". I meant "observable in principle".
 - https://physics.stackexchange.com/a/279493
Lemme try to answer those, as this is the core of the current disagreement I think.
Consider the case of parallel universes -- they are actively studied by scientists as a notion, and there's very much a question of whether they exist, while still "no bit of information reaches our universe" from them.
Does that preclude their existence, or make such an inquiry into the possibility of their existence meaningless? Several physicists and cosmologists (even major ones) don't seem to think so.
And Euclidean geometry itself is only one possible geometry, but it took us several millennia (until e.g. Lobachevsky) to find that out -- it was taken as a kind of axiom itself, like absolute time etc.
The cat before you observe it is ... you never know. Observing changes it.
QM - I assume you mean wave functions and their collapse - is an explanatory mathematical abstraction, but it follows rules grounded in observational data. You don't say "the wave function collapsed there, only it's impossible to measure it"; that's meaningless. We say that the wave function collapses when we find some observation changing from one state to another in a way that implies there were superpositions of states involved in the middle.
Does it mean that time does NOT progress in discrete chunks, or can not? I don't believe so.
> It has since become clearer, however, that the uncertainty principle is inherent in the properties of all wave-like systems, and that it arises in quantum mechanics simply due to the matter wave nature of all quantum objects. Thus, the uncertainty principle actually states a fundamental property of quantum systems and is not a statement about the observational success of current technology. It must be emphasized that measurement does not mean only a process in which a physicist-observer takes part, but rather any interaction between classical and quantum objects regardless of any observer.
The jury may still be out on that. A few of the quantum gravity theories (like loop quantum gravity, causal dynamical triangulation, or causal sets) propose a discrete spacetime as a consequence of background independence.
I'd still like to know what either "flows" or "chunks".
Time? Sure, ok, time flows, or time chunks. But what exactly is it?
Time has been one of my favorite topics to read about in the last 20 or so years, and I still feel like we have hardly made any progress in figuring it out.
It may be (probably is) that "what is this" or "why does it do that" are just meaningless questions, in the sense a child (or adult who hasn't re-calibrated their expectations of what science can tell them) means them. Which is interesting itself, I suppose, but at that point just go study philosophy and art, if that's why you were into science in the first place—and I think it's why everyone at least begins to be interested in science.
[EDIT] in particular it's frightening how fast kids' questions stymie an adult because the adult realizes that they're only a step or two away from "why is there something instead of nothing, and also what is something?" and... good lord, at that point seriously just go become religious or get really into philosophy or something, because you're outta luck, kid, sorry, we don't have any actual answers, just observations about what all this stuff does, but no we can't tell you what "stuff" or "does" are.
He also points out that, ever since Newton, the dream of a mechanistic world was shattered, and with it any hope that science would help us "make sense of the world" in a truly satisfying sense. Instead, we have become content with modelling the world mathematically and predicting how it will evolve.
Of course, space-time and quantum mechanics have driven additional nails in the coffin of a truly understandable world.
I'd think the opposite. The fact the world can work at least as strangely as relativity and quantum mechanics suggest, and we still figured those out as much as we have so far, reassures me in our ability to figure out the world we exist in.
However, we can still hope to model and predict it more accurately than we would have ever hoped before, so there is that :)
I myself lean to the Aristotelian definition: count of change.
After all, all our clocks are built on this definition. If there were a lone particle traveling in an infinite universe, it'd be nonsensical to talk about its speed because there'd be nothing to relate the distance traveled to (and no observer). Indeed, whenever we talk about "time" and related quantities like "speed", we always compare the observed object to some other, existing reference.
So I think of the interaction of particles/matter as being something fundamental, whereas "time" is a synthetic human notion.
Except! There would still be virtual particles being created and annihilated. Let's assume that there is an interval between these events (i.e., they aren't constantly present).
Would Aristotle say that no time passes between events? I would tend to say that time yet passes even during the periods when nothing else exists and nothing is happening.
Let's say the second-to-last thing in the universe was another observer with a clock who could time and observe these virtual-particle events, and this observer timed them at a one-second interval between events (the duration of each event is irrelevant).
And then that second-to-last observer and the clock suddenly annihilate somehow so that only your immutable particle remains. And the virtual particle events still happen at the same interval period.
Would the time interval between virtual particle events somehow be different without the clock/observer as with the sole immutable particle? That doesn't make sense to me, though I will certainly admit that it could very well be!
Aristotle's idea of time is not fully satisfactory to me, though I can't of course rule it out.
 Important! Because if the particle is not immutable, then it is waiting for change, which requires time.
So, if this is true, then does it mean that time does not 'flow', but rather 'chunks'?
(even at the macro level, a river that moves a specified number of molecules per time unit doesn't "flow", it's technically moving discrete "chunks" of water, we just don't say it's moving chunks of water because no one cares about the "well, technically..." in normal conversation. We know what's meant and what to ignore. Physics doesn't have that luxury)
We have no idea where that boundary can even be found; it would certainly have to be below the Planck length, but we have nothing that allows for that kind of precision. The only thing physics can say right now, and possibly ever can, is that nothing we have at our disposal allows us to conclude time "actually is" discrete. We can come up with mathematical frameworks that assume time is a discrete dimension, but even if those yield super accurate predictions that are then verified through experiment, all that does is confirm that a continuous dimension can be reduced to a discrete one without loss of precision.
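A toy illustration of that last sentence (my own example, assuming a simple linear ODE): exactly sampling a continuous flow at discrete ticks reproduces every discrete prediction, so agreement with a discrete-time model can't settle the question.

    # For dx/dt = a*x, sampling at discrete ticks loses nothing,
    # because the exact flow between ticks is x(t+dt) = exp(a*dt)*x(t).
    # Matching discrete predictions therefore can't distinguish
    # "time is discrete" from "time is continuous and we sampled it".
    import math

    a, dt = -0.7, 0.1
    x_cont = lambda t: math.exp(a * t)    # exact continuous solution, x(0) = 1

    x = 1.0
    for k in range(1, 11):
        x *= math.exp(a * dt)             # exact one-tick update
        assert abs(x - x_cont(k * dt)) < 1e-12
    print("discrete samples match the continuous flow exactly")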
“Time is what the clock measures.”
Carroll, From Eternity to Here - reasonably good and quite accessible.
Coveney, The Arrow of Time - better than average
All of my books are packed away right now for a move, so I'm sorry I can't come up with more. Though that Carroll book is really a decent entry point.
"Planck time"..the unimaginably small 5.39 × 10−44 s, is as fundamentally important to the fabric of the universe as it's much more well known "brother", the speed of light in a vacuum c.
Now, we are nowhere near able to measure that short a time frame ... we are limited to around 10^-19 s or so ... but without a doubt time is discrete, we just are unable to get there yet.
As I understand it there are two contributing facts:
a) the Heisenberg uncertainty principle states that there's a tradeoff between certainty in position vs momentum, so the more certain you are of a particle's position, the less certain you are of its momentum. (For photons, momentum is proportional to frequency, i.e. inversely proportional to wavelength, and frequency is proportional to energy.)
b) By mass-energy equivalence, anything with energy has mass, therefore higher frequency photons are more "massive". A single photon of sufficiently high frequency would form a black hole.
Putting those two together, to measure distances accurately, you need higher and higher frequency photons with shorter and shorter wavelengths. For example, radar creates blurry images at ~5cm wavelengths, while ordinary photographs can be razor sharp at ~500nm. The Planck length is just the wavelength at which the photon would have so much energy that it would collapse into a black hole and break our current mathematical models. That's why it's nonsensical--with current models--to talk about lengths smaller than a Planck length, but it doesn't mean that space itself is quantized. Similar argument for time.
(Also, the same logic applies to other particles like electrons, protons, and even up to macroscopic items like baseballs; everything has a wavelength...)
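For the curious, here's a back-of-the-envelope sketch of how the Planck scale falls out of those two facts (heuristic only; the factors of 2 and pi are conventions, and this is not a rigorous quantum gravity result):

    # A photon confined to length scale L carries energy E ~ hbar*c/L,
    # i.e. an effective mass m = E/c^2 = hbar/(L*c). Its Schwarzschild
    # radius is r_s = 2*G*m/c^2. Setting r_s ~ L and solving gives
    # L ~ sqrt(G*hbar/c^3), the Planck length up to an O(1) factor.
    import math

    hbar = 1.054571817e-34  # reduced Planck constant, J*s
    G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.99792458e8     # speed of light, m/s

    planck_length = math.sqrt(hbar * G / c**3)
    planck_time   = planck_length / c

    print(f"Planck length ~ {planck_length:.3e} m")  # ~1.616e-35 m
    print(f"Planck time   ~ {planck_time:.3e} s")    # ~5.391e-44 s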
Most certainly not! Amusingly, this is an incredibly common assumption that the undergrad physics students I teach make, probably because of some pop-sci exposure.
All we know is that the Planck constant defines a scale where we do not know what happens. This is a scale at which we can see that the math behind our current theory breaks, but we have absolutely no reason to expect that the way to fix that math is to use some form of discretization related to that constant.
The "quantum" in quantum mechanics really should not be taken that literally. Photons as described by quantum mechanics for instance do not have a discrete spectrum (nothing literally "quantum" there).
There are fascinating conjectures on why the Planck scale should maybe be discrete, but they are way more subtle than "quantum mechanics is discrete" (because it is not always discrete).
It is not. It would violate special relativity. No such violation has ever been observed.
Scope: many equate maths, physics, and the universe with all and only science. While they may be the "easy" sciences, not all humanly observed phenomena reduce to them. Even left- and right-wing ideology?
Process-wise: "refutable in principle" may be a naive description of how humans actually work (and we likely don't all do maths axiomatically). It is a way to draw a distinction between science and myth (and in maths, axioms are good for proofs, not necessarily for generating maths).
Imagine you have a system with N states, represented as a vector of length N. You have a linear operator that transforms that vector into the state in the next step of time. That's basic linear algebra. Now make the linear operator random, so you have stochastic transitions. That makes it into something called a Markov chain.
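A minimal sketch of that setup (toy numbers of my own choosing):

    # A distribution over N states, evolved one step at a time by a
    # random column-stochastic matrix T, where T[j, i] is the
    # probability of jumping from state i to state j.
    import numpy as np

    N = 3
    rng = np.random.default_rng(0)

    T = rng.random((N, N))
    T /= T.sum(axis=0)               # normalize columns so each sums to 1

    p = np.array([1.0, 0.0, 0.0])    # start fully in state 0
    for _ in range(10):
        p = T @ p                    # one discrete time step
    print(p, p.sum())                # still a probability distribution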
Now, physicists love differential equations. They've got lots of tools for working with them, and to bring those tools to bear, some folks a while back took a Markov chain and wrote down a set of linear differential equations for how the probability distribution over states evolves in time. That's called a "master equation."
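Concretely, a master equation is just dp/dt = W p for a rate matrix W; here's a crude sketch with made-up rates:

    # Off-diagonal entries of W are nonnegative jump rates; each
    # diagonal entry is minus its column's off-diagonal sum, so total
    # probability is conserved. The rates here are purely illustrative.
    import numpy as np

    W = np.array([[-1.0,  0.5,  0.0],
                  [ 1.0, -0.5,  2.0],
                  [ 0.0,  0.0, -2.0]])   # each column sums to 0

    p = np.array([1.0, 0.0, 0.0])
    dt = 0.01
    for _ in range(1000):                # crude Euler integration to t = 10
        p = p + dt * (W @ p)
    print(p, p.sum())                    # -> roughly [1/3, 2/3, 0], sum 1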
Now say that you have an arbitrary function from initial state to final state. Can you describe it as a dynamics governed by a master equation? Not in general. That's not a huge surprise. We know there are lots of things you can't describe with linear differential equations.
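A concrete instance (a standard embeddability fact, my example rather than the paper's): even a simple bit flip can't be generated by any single master equation, because exp(Wt) always leaves a strictly positive probability of staying put:

    # The bit-flip map p -> [[0, 1], [1, 0]] @ p cannot equal
    # expm(W * t) for any rate matrix W. With symmetric flip rates the
    # diagonal of expm(W*t) is (1 + exp(-2t))/2: it decays toward 1/2
    # but never reaches the 0 a true flip requires.
    import numpy as np
    from scipy.linalg import expm

    W = np.array([[-1.0,  1.0],
                  [ 1.0, -1.0]])

    for t in [0.5, 2.0, 10.0]:
        P = expm(W * t)
        print(t, P[0, 0])            # ~0.684, ~0.509, ~0.500 -- never 0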
If you're working on thermodynamics of computation, though, it would be nice to salvage the master equation framework because its relationship with thermodynamics is well understood, and you don't have to rebuild that. And getting to thermodynamics from straight up stochastic processes is beyond the mathematical abilities of most physicists. That's not a slight on physicists. The mathematical path from a stochastic process to a thermodynamics is an area of deep, difficult specialization in mathematical physics and if you go down that rabbit hole, it will likely be your career (see, for example, Elliott Lieb).
There are two ways to extend this linear world to get better approximations of things:
1. Imagine you have two points on a curve. A linear approximation is drawing a straight line between the two points. To get a better approximation, you take some more points between the two on the curve and draw a sequence of linear segments going through those points. The equations of each of those lines are going to be different (they have different slopes and intercepts, or however you want to parameterize them). Or, in the context of master equations, you insert some additional, "hidden" steps in between your primary time steps, with different stochastic matrices.
2. Imagine you have a dynamical system that moves in time steps, and depends on the last two steps for its current move. You can make it into a system that depends only on the last step by expanding the state to include the previous time step as well (that is, instead of the dynamics of x(t), I track the dynamics of (x(t), y(t)) where y(t) = x(t-1)). These are called "hidden variables." It's the same idea as hidden variables in quantum mechanics. Or even in classical mechanics, I can't write down a first order equation for the position of a classical particle...but I can write down a pair of first order equations for the position and momentum.
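A toy version of trick 2 (my own example): a dynamics that depends on the last two steps, rewritten as a first-order system on an expanded state:

    # x(t+1) = x(t) + x(t-1) depends on two past steps; tracking the
    # expanded state s(t) = (x(t), x(t-1)) makes it first-order -- the
    # second component is the "hidden variable".
    def step(s):
        x, prev = s
        return (x + prev, x)         # depends only on the current state

    s = (1, 0)                       # x(0) = 1, x(-1) = 0
    for _ in range(8):
        s = step(s)
    print(s[0])                      # 34, the Fibonacci number F(9)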
Lots of folks working on thermodynamics of computation have done both these things to patch their tool.
What this paper does is try to calculate how many hidden steps and how many hidden variables you need to patch the tool for a given arbitrary function that you're trying to approximate, and shows that if you use more hidden variables you need fewer hidden steps and vice versa.
For thermodynamics of computation, they point out that if you are engineering a system with master equation dynamics, you have to pay for the extra hidden states/steps that you need, so the simple statement that invertible functions are thermodynamically free and noninvertible ones require work isn't the only cost accounting to do.
Aside: I have complained before about only whiz bang articles rising on HN. Here we have a straightforward, technical calculation on the front page. Progress!