The vast majority of it is directly from a pop-science NOVA episode and not actually well-backed.
There are reasonable bounds you can place on energy density where we expect current physical theories to stop making sense.
But energy density is not the same as temperature. It is true that for things like ideal gases, temperature is roughly "energy per degree-of-freedom", which is an energy density of sorts, but that's not fundamentally what temperature is.
Temperature is nothing more than a specific measure of how energy will flow due to entropic effects. In the right systems, this can be arbitrarily high without a high energy density. (In fact, elsewhere on this very post, people have pointed out "negative temperatures" where the temperatures become "hotter than infinity", they "wrap around" to negative.)
https://en.wikipedia.org/wiki/Negative_temperature is not terrible, for an overview, though the disclaimer is just annoying at this level.
For thinking about this point, it's much easier to talk about "thermodynamic beta" (sometimes called "coolness" or "coldness"), which is just β = 1/T = ∂S/∂E. The behavior of a spin system that admits negative temperatures can be described smoothly in terms of beta -- hotter systems have lower beta, and zero is not particularly special.
Now, any real system is coupled to the rest of the environment, so can't be in equilibrium at a negative temperature, as it would continuously leak heat until it cooled down enough to have some positive temperature. But if its internal equilibration proceeds much faster, then it's still useful to talk about its temperature as a quasi-equilibrium case.
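To make that concrete, here's a small sketch (the spin count N = 1000 and unit energy per excitation are my own illustrative choices): the entropy of N independent two-level spins with n excited is S = ln C(N, n), and a finite-difference β = ΔS/ΔE runs smoothly from positive through zero to negative as energy increases past half filling -- nothing special happens at β = 0, i.e. at "infinite temperature".

```python
from math import comb, log

N = 1000  # hypothetical system: N two-level spins, each excitation costs 1 unit of energy

def entropy(n):
    """Boltzmann entropy S = ln(multiplicity) for n excited spins out of N."""
    return log(comb(N, n))

def beta(n):
    """Thermodynamic beta = dS/dE, approximated by a central finite difference."""
    return (entropy(n + 1) - entropy(n - 1)) / 2

for n in (100, 400, 500, 600, 900):
    print(f"n = {n}: beta = {beta(n):+.3f}")
```

Beta decreases monotonically with energy: positive below half filling (normal positive temperature), exactly zero at n = N/2, and negative above it -- the "hotter than infinity" regime, with no discontinuity anywhere in beta itself.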
> Above about 10^32K, particle energies become so large that gravitational forces between them would become as strong as other fundamental forces according to current theories.
I see, so gravitation would become a problem even before the speed limit does.
You can't accelerate anything _with mass_ to the speed of light. Although I guess that stuff with no mass already travels at the speed of light, so you wouldn't need to accelerate it.
For v << c, the energy required to accelerate to a given speed is essentially Newtonian and scales directly with the mass, i.e. Ek = (1/2)mv².
As v → c, the mass matters much less than the Lorentz factor γ = 1/√(1 − v²/c²); the mass acts just as an overall multiplier. The relativistic kinetic energy is Ek = (γ − 1)mc², and since 1 − v²/c² → 0, γ -- and hence Ek -- grows without bound, with a division by zero exactly at v = c.
In conclusion, near the speed of light the speed is the relevant factor rather than the mass, regardless of whether the object is an electron or Mount Everest.
Of course, assuming the equation holds ;-)
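A quick numerical sketch of the point above (the Everest mass of 1e15 kg is a rough figure of my own): the ratio of relativistic to Newtonian kinetic energy is the same for any mass, because the mass cancels -- it really is just a multiplier.

```python
from math import sqrt

c = 2.998e8  # speed of light, m/s

def ke_newton(m, v):
    """Newtonian kinetic energy."""
    return 0.5 * m * v**2

def ke_rel(m, v):
    """Relativistic kinetic energy, Ek = (gamma - 1) m c^2."""
    gamma = 1 / sqrt(1 - (v / c)**2)
    return (gamma - 1) * m * c**2

# two wildly different masses: an electron and (very roughly) Mount Everest
masses = {"electron": 9.109e-31, "Everest (rough)": 1e15}  # kg

for name, m in masses.items():
    for frac in (0.1, 0.9, 0.99):
        v = frac * c
        ratio = ke_rel(m, v) / ke_newton(m, v)
        print(f"{name} at {frac}c: relativistic/Newtonian KE ratio = {ratio:.4g}")
```

At 0.1c the two agree to within about 1%; at 0.99c the relativistic energy is already more than an order of magnitude larger -- and the ratio printed is identical for the electron and the mountain.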
For a moving object you could then write m = (Er + Ek)/c², which creates the impression that the mass is variable (the Ek term is zero at rest and increases with velocity), giving rise to the terms 'rest mass' and 'relativistic mass' for the rest-energy and total-energy equations respectively.
This interpretation is somewhat outdated, but the term 'rest mass' persists as a legacy. One could refer to it as the `(proper |invariant |intrinsic )?mass` instead.
The variable-mass issue is then 'solved' by 'refactoring' the equation in terms of momentum, p = γmv, where the mass is coupled with the velocity and the complexity of the Lorentz factor is absorbed: E² = (pc)² + (mc²)².
If we cancelled out that movement (relative to the Milky Way), does the mass change?
The question is how much the various relative motions (Solar System, relative to the galactic center, other galaxies, etc.) affect our mass, and whether it would be possible to pull them apart.
What I understand is that the contemporary conception is that the mass does not change.
I just presented the argument for a notion of rest mass and relativistic mass.
But the faster and faster you go, the more energy is required. As the number approaches 100%, the energy required for each tiny fractional step grows without bound -- it diverges -- so much so that it is impossible to accelerate either object to 100% of the speed of light. It just requires more, and more, and more energy.
Never. More precisely, never for particles of ordinary matter that have nonzero rest mass. Relativistic effects change the dependence of temperature on velocity (more precisely, the dependence of kinetic energy on velocity), so that kinetic energy/temperature increases without bound as the speed of light is approached.
For a "gas" of photons, particles of light, the particles always move at the speed of light, because they have zero rest mass. But photons can have any kinetic energy, so a photon gas can have any finite temperature.
Suppose you have a system with 100 degrees of freedom and 2 units of energy, spread out as (0.01, 0.01, ..., 0.01, 1.01). A bunch of its energy is in one of those hundred degrees of freedom. You can assign it two different temperatures: the temperature 0.01, which would describe how energy will right now flow into the system if you connect it to another system with a bunch of degrees of freedom with their own thermal energy (assuming that the 1.01 degree of freedom is "internal" and doesn't interact directly with the outside world), and the temperature 0.02, which would describe how energy will eventually be spread out and hence how the system would eventually share energy with the outside world.
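Just to spell out where those two numbers come from (this is only the arithmetic from the example above, nothing more):

```python
# 100 degrees of freedom holding 2 units of energy total,
# with one "special" internal degree of freedom hoarding 1.01 units
energies = [0.01] * 99 + [1.01]

# the temperature "right now": average energy over the 99 thermal
# degrees of freedom that actually touch the outside world
t_now = sum(energies[:99]) / 99

# the eventual temperature: the same 2 units spread evenly over all 100
t_eventual = sum(energies) / len(energies)

print(t_now, t_eventual)
```

Same system, two defensible temperatures -- 0.01 if you ask "how does it exchange energy right now", 0.02 if you ask "where does it end up".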
Temperature is ultimately defined in terms of how our uncertainty about the microscopic state a system is in changes as we add energy to that system. The higher this rate of change of uncertainty, the lower the temperature is -- this is why when you connect two systems of different temperatures, in the process of us becoming more uncertain about the fundamental state of the world, energy "spontaneously" flows from the higher temperature to the lower temperature: the certainty gained from stealing energy from the higher-T one is more than compensated by uncertainty created from pouring that same energy into the lower-T one. (In fact there is a family of systems of "negative temperature" which become less uncertain as you add more energy to them: they are "hotter than the hottest possible temperature" because they will gladly give their energy to any "normal" system in the process of us becoming more uncertain about the world.)
The problem is that if we're certain that some degree of freedom has a given amount of energy that's "special", we have a bunch of different definitions of "temperature" depending on how "adding energy to the system" distributes between the "special" degree of freedom and the "thermal" degrees of freedom.
So the usual process is to just totally separate those degrees of freedom as separate systems, the "thermal" ones have a temperature, the "special" ones do not.
I'm no physicist, just a chemist. What are they?
The classical example is if you have a bunch of magnetic moments in a magnetic field and they do not interact with each other: then stuffing energy into the system requires aligning them against the magnetic field, and this makes the state more ordered. The problem is that these moments are generally in thermal contact with some apparatus that keeps them in place or vibrational degrees of freedom of their centers of mass or so. But you can get this thing to happen in magnetic resonance setups.
Negative temperature states pop up in a lot of strange places. The two that I know more closely: lasing has the property that as you dump more energy into the system you get more bosons in the lasing state; and Onsager in 1949 published a little article called "Statistical Hydrodynamics", which sort of went viral for its time, pointing out that the instability of turbulent systems can be viewed as negative-temperature regimes of the vortices in those systems.
In physics we talk about the "degrees of freedom" of a system -- this is just the count of all of the independent ways that it can move. For each degree of freedom of a system you can calculate the average energy in that degree of freedom. By the equipartition theorem, at thermal equilibrium, every quadratic degree of freedom will have the same average energy, T/2 (if you measure temperature in units of energy).
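You can check equipartition numerically with a toy sample (the temperature and mass values here are arbitrary choices of mine): in the Maxwell-Boltzmann distribution each velocity component is Gaussian with variance T/m, so the average kinetic energy per component should come out to T/2.

```python
import random
from math import sqrt

random.seed(0)
T = 2.5  # temperature in energy units (k_B = 1); arbitrary
m = 4.0  # particle mass, arbitrary units

# each velocity component is Gaussian with variance T/m (Maxwell-Boltzmann)
n = 200_000
samples = [random.gauss(0.0, sqrt(T / m)) for _ in range(n)]

avg_ke = sum(0.5 * m * v * v for v in samples) / n
print(f"average KE per degree of freedom: {avg_ke:.3f} (equipartition predicts T/2 = {T/2})")
```

The sample average lands within a percent or two of T/2 = 1.25, independent of the mass you pick -- the mass only reshapes the velocity distribution, not the average energy.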
So if you think about dropping a bouncy ball in a tube and it bounces until it slowly comes to rest, it has these degrees of freedom -- the internal degrees of freedom of the atoms of the ball, the internal degrees of freedom of the atoms of the floor/tube -- and then two really obvious degrees of freedom, the center-of-mass position of the ball, which gains an energy scale due to the gravitational force, and the center-of-mass momentum of the ball, which trades energy with this position degree-of-freedom.
Statistical mechanics says that as this system progresses, the location of the energy will slowly become more uncertain until it is on-average-evenly distributed across all of the degrees of freedom. That's why it bounces lower and lower: there is so much energy in the two "main" degrees of freedom -- maybe half a joule? -- whereas in the vibrations there is something closer to 10^-21 J of energy at room temperature.
But the flip side of dissipation is always fluctuation -- this is in fact the subject of a major theorem! So the fact that this can randomly lose energy to these other degrees of freedom means that those degrees of freedom are also randomly kicking the ball. As you can imagine with ~20 orders of magnitude difference between the two, they don't kick this ball by all that much. But you have a lot of experience with a lot more tiny balls that are bouncing off the ground all the time. Take a deep breath. There they are.
If everything were to come to its minimum energy configuration, why are these air molecules so stubbornly not falling to the floor? Well, they are trying to! But they are so light that they are being kicked back upwards by these random thermal kicks, so high that they can in principle go the many kilometers to the uppermost atmosphere.
(Of course if they could go all that way in a single kick then air would have to be so non-interactive that we could not use it to talk to each other... the mean free path in air is actually about 68 nm, so in practice every air atom is getting its random thermal kicks from other nearby air atoms. But the ultimate origin of these random thermal kicks is the random kicks of the floor on the few hundred nanometers of air sitting above it, and that energy comes from the Sun and is mostly conserved as these atoms collide with each other -- but a tiny bit is often converted to little photons of infrared light that sometimes escape the atmosphere.)
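As a sanity check on "how high can thermal kicks carry an air molecule", here's the standard barometric scale height kT/(mg) -- the altitude over which pressure drops by a factor of e (the average molecular mass of air, about 29 u, is the only input beyond standard constants):

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0           # room-ish temperature, K
g = 9.81            # gravitational acceleration, m/s^2
m_air = 4.8e-26     # average mass of an air molecule (~29 u), kg

# barometric scale height: thermal energy balanced against gravitational potential
H = k_B * T / (m_air * g)
print(f"scale height: {H / 1000:.1f} km")
```

That comes out to roughly 8.5 km -- exactly the "many kilometers" scale over which random thermal kicks hold the atmosphere up against gravity.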
With that said, as others have noticed, the free-particle energy relation in special relativity is E = γ m c². Famously, at rest this factor γ = 1/√(1 − (v/c)²) is 1 and the energy of a particle at rest is E = m c². But as v gets closer and closer to c, v → c, this energy grows without bound, E → ∞. So there is no finite temperature at which a kinetic degree of freedom would exceed the speed of light. Indeed you can solve for v, since 1/γ² = 1 − (v/c)²: the velocity corresponding to any given total energy is v = c √(1 − (mc²/E)²). For a particle at rest with E = mc² this gives v = 0, as you would expect; when the kinetic energy first reaches mc² we have E = 2mc² and thus v = c √(3/4) ≈ 0.866 c.
Exceeding c is, of course, not known to be possible at all, even with infinite energy.
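The inversion above is a one-liner, so here it is with the checks from the text (E is expressed as the dimensionless ratio E/mc², i.e. as γ itself):

```python
from math import sqrt

c = 2.998e8  # speed of light, m/s

def speed_from_energy(E_over_mc2):
    """Speed for a given total energy, via v = c * sqrt(1 - (mc^2/E)^2)."""
    return c * sqrt(1 - 1 / E_over_mc2**2)

print(speed_from_energy(1.0) / c)  # at rest, E = mc^2: v = 0
print(speed_from_energy(2.0) / c)  # kinetic energy = mc^2: v ~ 0.866 c
print(speed_from_energy(1e6) / c)  # enormous energy: v approaches but never reaches c
```

No matter how large the energy you feed in, the result stays strictly below c.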
Think of the quantum vacuum as having a large number of degrees of freedom waiting to get excited by energy -- like a fleet of unused AWS instances in a system with very effective load balancing. The moment the load (roughly, energy) on the running instances (particles present in the system, aka "quanta") increases beyond the threshold for creating a new one (aka the rest mass of a new particle), a new instance is spontaneously created. Heating the system is akin to increasing the load on your system, and new instances will keep getting spun up.
Is there a limit on how many such particle instances can be created? If we neglect gravity, no -- you can just keep adding instances/quanta and never run out. (And however much energy you dump in, the system's temperature will not increase beyond the Hagedorn limit.)
But if you stop ignoring gravity, the gravitational attraction between the spun-up instances will keep increasing as you spin up more of them, eventually forming a black hole (because you cannot squeeze more than a certain amount of information into a given volume). This is roughly where you wave your hands and come up with heuristic explanations using the Planck length, Planck mass, etc.
That's the limit of current understanding. Any refinement to this story would be a massive breakthrough!
PS: A relatively sobering (nonetheless exciting) possibility is that well before gravitational effects become important, your "effective field theory" proves insufficient to model the system, and you are led to a "more fundamental" model.
A technical explanation of the Hagedorn limit: at finite temperature, the occupation probability of states decays exponentially with energy (i.e. energy divided by temperature gives the log-probability). But if the degeneracy of high-energy states grows exponentially, that growth can entropically compensate for the exponential decay of the occupation probability, so that higher-energy states end up more occupied than lower-energy ones! The transition point in this tradeoff is the Hagedorn limit. That is why additional energy is more likely to create new particles/states than to simply increase the per-particle energy of the existing ones.
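A toy version of that tradeoff (the Hagedorn temperature T_H = 1 and the density-of-states form e^(E/T_H) are schematic choices of mine, not a real string-theoretic spectrum): the occupation weight is the Boltzmann factor times the degeneracy, w(E) = e^(E/T_H) · e^(−E/T), which decays with energy for T < T_H and blows up for T > T_H.

```python
from math import exp

T_H = 1.0  # schematic Hagedorn temperature, arbitrary units

def weight(E, T):
    """Exponentially growing density of states times the Boltzmann factor."""
    return exp(E / T_H) * exp(-E / T)

for T in (0.5, 0.9, 1.1):
    w_low, w_high = weight(1.0, T), weight(10.0, T)
    trend = "decays" if w_high < w_low else "grows"
    print(f"T = {T}: high-energy weight {trend} (w(10)/w(1) = {w_high / w_low:.3g})")
```

Below T_H the sum over states converges and the system behaves normally; at T_H the degeneracy exactly cancels the Boltzmann suppression, which is why pumping in more energy makes more states/particles instead of raising the temperature past the limit.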
Much later, I learned about that very thing (1/T, “thermodynamic beta”) while wikiwalking after hearing about the concept of negative temperature. Then I fell into a rabbit-hole wondering if we also measure speed upside-down in that way, since you can’t accelerate to light-speed. And indeed, when you’re talking about relativistic effects, units of time per distance can be more illustrative sometimes than our intuition of distance per time.
Black-body radiation describes the distribution of power emitted at given wavelengths or frequencies by an ideal black body.
1. Distributions don't have one wavelength. There is a "peak" wavelength at which the emitted power per unit wavelength is greatest, but some power is still emitted at all wavelengths no matter what temperature something is.
2. There is not sufficient reason to believe that the Planck length is a limit on the wavelength of light. This would break Lorentz symmetry.
3. Most things aren't ideal black bodies, but still have temperatures. Even if very hot objects couldn't emit light of Planck length, they can still couple to the environment and emit heat in other ways.
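For scale, Wien's displacement law gives the peak wavelength as b/T (this is only the peak of the per-wavelength distribution, per point 1 above -- emission extends across all wavelengths):

```python
b = 2.898e-3          # Wien's displacement constant, m*K
l_planck = 1.616e-35  # Planck length, m

def peak_wavelength(T):
    """Wavelength of maximum spectral radiance (per unit wavelength) at temperature T."""
    return b / T

print(peak_wavelength(5772))      # the Sun (~5772 K): ~5e-7 m, visible light
print(peak_wavelength(1.417e32))  # Planck temperature: ~2e-35 m, of order the Planck length
```

It's only at roughly the Planck temperature that the peak wavelength shrinks to the order of the Planck length -- which is the heuristic behind the "absolute hot" figure, with all the caveats from points 2 and 3.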
... Also known as the Planck Temperature, which is the highest listed possibility for absolute hot in TFA.
Our physics and understanding of matter seem to be relevant only under very specific conditions.
The moment we can make technology that can probe the smallest of scales (if this is even possible), we might get an answer for what the universe is. Or maybe it would turn out to be "42" and it still wouldn't make sense.
We are talking about the theoretical limit of temperature, but what is the practical limit? There's a point beyond which heating hydrogen just gets you helium and more heat, but fusing anything heavier than iron consumes energy, leaving you with something colder than the inputs.
Has anyone ever done an experiment to confirm that SR comes into play at ultra-high temperatures?
It is an intentionally simplified version of English. Although, in this case, the meaning of the article seems to have been affected by the simplification. The English version makes it clear that this is a theoretical concept, whereas the Simple English version makes it sound like something concrete/absolute.
Now, should this thread ultimately become a discussion of the virtues of and approaches to creating mixtapes, it would be another thing, but at this point in time I'm not seeing this as a positive contribution to the discussion. That is why I downvoted your genuinely amusing comment.
There's plenty of discussion here going into the details of the physics and the semantics at hand. We don't have to lower the bar for the discussion just because the topic under discussion was presented in a simple way.
If you're looking exclusively for mindless fun, you're really in the wrong place. And there's nothing wrong with enjoying that kind of thing either, there are just better places to do it than HN.