That said, dark matter is entirely possible too. There's no particular reason to think that all particles must interact electromagnetically; in fact it's somewhat strange to assume they should. Why wouldn't there be particles that only show themselves gravitationally? Or through other, perhaps enormously strong, forces that for whatever evolutionary reason life on Earth just never evolved to care about or perceive, even indirectly?
That said, I have no training in physics and this is an entirely crackpot concept. (Of http://www.goodtheorist.science/ I've only read ~6.)
The problem with this idea is that entanglement is everywhere.
The electrons inside a molecule can't be described individually; you must use a superposition of antisymmetric combinations of them, because a single combination is not enough. It's usually not called entanglement, but mathematically it's the same phenomenon.
Every time two particles collide they get entangled. If they collide again, or if they split and the new particles collide, you can see the entanglement as interference. If they travel far apart, the correlation is difficult to measure, but it's still there. Again, it's usually not called entanglement, but mathematically it's the same phenomenon.
You have probably only read about cases where it's easy to characterize and isolate the superposition state of two particles that are far apart. It's an interesting case because you can make some measurements, see that the classical description is wrong, and prove that you must use the quantum description of the particles.
But if you take any two particles that have interacted, they will be entangled, just in a form that's not easy to guess, so they are not nice for experiments. Or if one of them collides with a third particle, you must now consider the joint state of the three particles, which is more difficult (there are some nice theoretical entanglement results for three particles, but I think nobody has measured them yet).
If you see only two of the three entangled particles, the missing information makes the result less clear: you must always repeat the experiment many times, and now you must take a weighted average over the possible states of the third particle you are not seeing. So if you are lucky you will see a smaller quantum effect, but if the state of the third particle matters enough, you will see an average that is essentially equal to the classical result.
And if they collide with more particles, they all end up in a superposition state that is equivalent to an entangled state, but it's very difficult or impossible to control in an experiment. So when you look at only some of them, they behave like a usual classical system.
So after a few collisions all the particles are somewhat entangled. It's uncontrollable, but mathematically there is no sharp criterion to distinguish the controllable states that are nice for entanglement experiments from the uncontrollable states where decoherence makes them look like classical systems, with everything in between. So if you say that two entangled particles have a distance d=0, then all the particles in the universe have d=0.
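The "weighted average over the unseen third particle" above is exactly a partial trace. A minimal NumPy sketch (state labels and dimensions are my own toy choices): an isolated Bell pair keeps its off-diagonal coherences, but the same pair entangled with a third, unobserved particle (a GHZ state) reduces to a mixture that looks classical.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): a pure two-particle superposition.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_bell = np.outer(bell, bell)          # density matrix of the pair alone

# GHZ state (|000> + |111>)/sqrt(2): the same pair, entangled with a third particle.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho_ghz = np.outer(ghz, ghz)

# "Not seeing" the third particle = partial trace over it
# (the weighted average over its possible states).
rho_pair = rho_ghz.reshape(4, 2, 4, 2).trace(axis1=1, axis2=3)

# The isolated pair keeps its off-diagonal coherence (interference is visible)...
print(rho_bell[0, 3])   # 0.5
# ...but after tracing out particle 3 the coherence vanishes:
# what remains is a classical-looking 50/50 mixture of |00> and |11>.
print(rho_pair[0, 3])   # 0.0
```

The reshape trick groups each index into (pair, third-particle) parts so `trace` can sum over the unseen subsystem.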
Maybe you could start off with some idealizations (a la particle in a 1D box) and perhaps some interesting limits arise. And maybe those limits could be extended statistically. Or not.
Philosophically I don't know if this question has ever been answered satisfactorily. But over the last 100 years fields have become so mathematically convenient that they're too hard not to use.
The problem Einstein and others grappled with was different, and concerned the fact that according to relativity nothing should exceed c. Entangled particles seemed to violate that principle, because the collapse to a given pair of states was instantaneous. The modern resolution to that is more practical than philosophical, and points out that information isn't transmitted faster than light between entangled particles. Information propagates at or below c, even though at some scale entangled particles are statistically correlated. Relativity is only violated, and causality only threatened, if information is transmitted FTL, and entanglement doesn't allow for that. You need a classical channel to make entanglement work for communication, and that classical channel is going to be no faster than light. Without a classical channel, entanglement just gives you random noise.
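The "random noise" point can be illustrated with a toy Monte Carlo sketch (the sampling model just reproduces the standard quantum joint distribution for a Bell pair measured at angles a and b; the angle values are arbitrary): whatever measurement Alice chooses, Bob's local statistics stay 50/50, so no information reaches him.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pair(a, b, n):
    """Sample n joint outcomes (+1/-1, +1/-1) for spin measurements at
    angles a and b on the Bell state (|00>+|11>)/sqrt(2), for which the
    probability that the two outcomes agree is cos^2((a - b)/2)."""
    p_same = np.cos((a - b) / 2) ** 2
    same = rng.random(n) < p_same
    alice = rng.choice([-1, 1], size=n)      # Alice alone sees pure 50/50
    bob = np.where(same, alice, -alice)
    return alice, bob

# Bob's marginal statistics for two different measurement choices by Alice:
for a in (0.0, 1.2):
    _, bob = sample_pair(a, 0.7, 200_000)
    print(f"Alice angle {a}: Bob sees +1 with frequency {np.mean(bob == 1):.3f}")
# Both frequencies come out ~0.500: Alice's choice sends Bob nothing.
```

Only by later comparing both outcome lists over a classical channel do the correlations become visible.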
But none of that answers the 18th-19th century physicists' question about how fields extend across the universe, or how light propagates in empty space. What the hell is a field, anyway?
Also, it doesn't explain how the expansion of space happens, what made inflation happen, and what dark energy is.
Now, it is very easy to get hung up on this "Is it real or not" thing, and most people do for at least a while, but please just put it aside. When you peer really hard into the depth of the theory, it turns out that it is hard to say for sure that stuff is "stuff". It is tempting to suggest that having a non-zero value of mass defines "stuffness", but then how do you deal with the photo-electric effect (which makes a pretty good argument that light comes in packets that have enough "stuffness" to bounce electrons around)? All the properties that you associate with stuff are actually explainable in terms of electro-magnetic fields and mass (which in GR is described by a component of a tensor field!). And round and round we go.
Light doesn’t propagate in empty space, because space isn’t empty, it’s permeated with quantum fields in constant flux. What light doesn’t require is a medium, and a field is not a medium. The problem here is that these theories are really described in terms of a bunch of math, and when we use words to describe those theories, something is lost in translation. There are no words that fully capture what a field is in QFT, you must use math.
Once you adopt that perspective, inflation would just be the result of another field with particular properties (esp. a given range of potential energy curves). Of course, inflation is an untested theory, and there are competing ones which try to explain the apparent homogeneity and isotropy of the universe. Inflation is probably the most popular, but it doesn’t rest on the kind of firm foundation that QED does.
Was Newton ever much concerned about this? He was into alchemy and magic, even spookier actions at a distance. Plus, by everything they knew at the time, why would "action at a distance" feel strange at all?
That theory has since been abandoned, and everyone now accepts action at a distance - which still doesn't make any sense.
Noam Chomsky - "The machine, the ghost, and the limits of understanding", 1½ hours
And for the rest of the interactions, our best models are described by Quantum Field Theory, and QFT is also emphatically "local" in this sense. Interactions between any two particles are mediated by the exchange of "virtual particles" that carry the force: photons carry electromagnetic interactions, gluons carry the strong nuclear interaction, and W and Z bosons carry the weak nuclear interaction. The detailed rules are pretty weird, but every interaction in QFT can be framed as "interactions happen when and only when particles touch each other".
(I haven't watched your Chomsky video. But it's a little odd to look to a linguist for an explanation of modern physics.)
Is the fabric in any way more real than a field?
(I didn't look to Chomsky for this; that video is just where I happened to first hear this "mechanical" idea explained, and it's still the only place.
A better citation would be the sources he cites in the video.
My main point was not about modern physics, but about the history of science, i.e. the "mechanical" idea.
And, for the history of science, an elder, respected academic is probably the best person to look to anyway. It's hard to think of a specialist in the history of science, because by its nature it is not a hot, fast-moving field full of exciting new discoveries, which is what scientists usually get known for.
BTW Do have a listen to the start of the lecture, it really is quite interesting.)
I don't consider a magnetic field to be an "action at a distance" explanation at all, so I'm not sure how to answer your first question. In classical E&M, a magnetic field is a local property of space at every point. Moving electric charges affect and are affected by the magnetic field at their location in a specific way, and disturbances or changes in that field propagate outward in accordance with a local wave equation. Nothing is ever "action at a distance", because the field itself carries influences and signals from point to point. (I suppose it could look like "action at a distance" if you pretended that the field was just a made-up theoretical construct that wasn't really there. But that's very much contrary to the accepted perspective on fields at least as I understand it. Things like "fields carrying energy density" and "finite-speed propagation of field changes through empty space" seem to argue strongly for taking fields seriously.)
So I guess I'd say that the "fabric of spacetime" explanation is on roughly the same footing, though my own intuition feels like the spacetime variant is a bit more concrete, somehow. (I suspect a fair number of people would lean the other way, mind you.) Spacetime is very clearly here and real, and Einstein taught us that its structure is definitely dynamical. Adding a magnetic field is, mathematically, basically arbitrarily stapling a gauge bundle onto that spacetime manifold: it's one step more abstract and arbitrary.
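The "disturbances propagate outward per a local wave equation" point is easy to see numerically. A minimal finite-difference sketch (grid sizes and the "kick" are arbitrary toy choices) of the 1D wave equation u_tt = c² u_xx: influence spreads only cell by cell, so distant points feel nothing early, i.e. no action at a distance.

```python
import numpy as np

c, dx, dt, n = 1.0, 0.1, 0.05, 400
r2 = (c * dt / dx) ** 2          # Courant number squared (stable since < 1)
u_prev = np.zeros(n)
u = np.zeros(n)
u[n // 2] = 1.0                  # localized "kick" to the field at the center

steps = 100
for _ in range(steps):
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)   # discrete Laplacian
    u_prev, u = u, 2 * u - u_prev + r2 * lap       # leapfrog time step

# The stencil spreads influence at most one cell per step, so any point
# more than `steps` cells from the kick cannot have felt it yet:
print(u[n // 2 + 120])   # exactly 0.0
```

The field near the kick is ringing away, but the far point is untouched: the field itself has to carry the signal there, at finite speed.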
Quantum mechanics is in enough flux that I certainly would not say it is itself justification for such grand claims.
Newton wrote, "It is inconceivable that inanimate Matter should, without the Mediation of something else, which is not material, operate upon, and affect other matter without mutual Contact."
Einstein wrote, "I cannot seriously believe in it because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky actions at a distance."
Newton was talking about gravity, Einstein about entanglement. But the objection is essentially the same one. That's the reason that Newton's quote above is included in an article on the principle of locality.
In my view, the crude circularity at the root of physics (which leads to multiple conundrums) is the assumption that the "speed" of light can be measured, when measurement presupposes light (or its E/M equivalents). Light is special in this regard.
Proves, as in all deterministic interpretations of QM are impossible? What about MWI? Also, the wavefunction itself is deterministic.
Maybe reality isn't fundamentally particle-like, but field-like.
But then we get into metaphysical and epistemological questions which are obviously unavoidable once you start making claims about what is or is not "fundamental". This is because "fundamental" has to include both the nature of reality and our means of perceiving it. It is common today to project how we perceive onto what we perceive but they must be distinct in any valid theory.
Claiming that reality itself is probabilistic is one example of this projection: though it may very well be a fundamental limit of our means of perception, it is not a property of what we perceive.
“information to be communicated faster than light”
This makes sense right? right?!?
You are mistaking gravitation for quantum entanglement which Einstein used to describe as spooky action at a distance.
> that gravitation was, in Einstein's language "spooky action at a distance"
This definitely suggests you're claiming that Einstein was talking about gravity, regardless of the literal interpretation.
- Hubble's constant has not been pinned down; the different ways of measuring it give conflicting values.
- Like the fallacious search for Vulcan, there are modern problems with the orbits of some of the outer planets.
- A galaxy with 99 percent dark matter, and another almost without any.
- The bullet cluster, the speed of rotation in galactic arms, etc.
- The apparent changing rate of expansion in the universe.
- We have no idea of the physics behind red shift "caused by the metric expansion of space".
- We still don't know what gravity actually is.
The 1998 discovery of the accelerating expansion of the universe was beautifully simple and clean. It proved a huge swath of experts dead wrong.
Dark matter/energy could be a part of the mix, but in light of the problems I would not bet it's the end of the story.
>Specifically, this paper shows that, contrary to recent assertions in the literature, the current ephemeris for Pluto does not preclude the existence of the Pioneer effect. We show that the orbit of Pluto is currently not well enough characterized to make such an assertion.
Interesting that both of these surprising (in the dark matter framework) observations are consistent with MOND:
What do you mean we have no idea? Redshift is one of the few simple concepts in Astrophysics, whether it's doppler, cosmological or gravitational.
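For concreteness, redshift is just a fractional wavelength shift, and for small z the Doppler reading of it gives a recession velocity. A short sketch (the observed wavelength here is a made-up example value, not a real measurement):

```python
# Redshift: z = (lambda_observed - lambda_emitted) / lambda_emitted
lambda_emit = 656.3   # nm, H-alpha line as emitted in the lab
lambda_obs = 669.4    # nm, hypothetical value seen in a galaxy's spectrum

z = (lambda_obs - lambda_emit) / lambda_emit
print(f"z = {z:.4f}")

# For z << 1, the Doppler interpretation gives v ~= c * z.
c = 299_792.458       # speed of light, km/s
print(f"v ~= {c * z:.0f} km/s")
```

Whether that z is then interpreted as Doppler, cosmological, or gravitational depends on the physical setting, but the measured quantity is the same shift.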
Sure we do; it's the Einstein Field Equation and the initial conditions at the Big Bang.
What we don't fully understand is how the initial conditions at the Big Bang came about. The current front-runner hypothesis is some form of inflationary model (in which the Big Bang is really the end of inflation), but the question is still open.
That's very kind. Some would say that theories that can't make testable predictions, or are highly tunable, just aren't science.
But then, IANAP so hey.
I think the spiral galaxy is showing that the problem is that it still has the arms attached. After this amount of time they should not be there, so either the assumed amount of time is wrong or something is modifying the speed (like dark matter, modified Newtonian dynamics, etc.).
However, I don't understand what the "fall" in the triple system is about. What are they measuring that agrees with Newtonian Physics?
I'm an amateur but something feels amiss to me here. Can anyone shed more light on this contradiction?
After following the major science experiments over the last ~15 years, I've gotten the impression that the LIGO team is being very careful to not announce anything prematurely.
In an interview with Prof. Rana Adhikari about the first detection, he describes how incredibly lucky it was to detect a very strong wave in the middle of the detector's sensitivity range, only ~30 minutes after each site independently enabled data collection. That's the type of situation that raises serious questions about the validity of the data and might even suggest sabotage (someone could be faking data). Fortunately, it sounds like they were careful to investigate the alternative explanations, including searching the labs for anything that looked like a possible sabotage device. It's always good to see scientists paying extra attention to rigor.
>"Although such times are in general not included in searches, it was determined that LHO strain data were unaffected by the procedure at frequencies above 30 Hz, and may thus be used to identify a GW source and measure its properties."
I'm not saying they aren't seeing anything, but the false alarm rate calculations are meaningless unless they process the data the exact same way for background. They keep making these special exceptions every time.
Mostly just saving this for myself...
Here it is for the 5th detection:
>"Single-detector gravitational-wave triggers had never been disseminated before in low latency. Given the temporal coincidence with the Fermi-GBM GRB, however, a GCN Circular was issued at 13:21:42 UTC (LIGO Scientific Collaboration & Virgo Collaboration et al. 2017a) reporting that a highly significant candidate event consistent with a BNS coalescence was associated with the time of the GRB."
This one looks clean (but for the 1st detection the questionable info was not included in the main paper).
>"Since GW150914 had already been confirmed as a real gravitational-wave signal, it was removed from the data when estimating the noise background."
Earlier discussion about 1st and 3rd detections:
It's claimed that "significance estimates come solely from the offline analysis", which seemed to satisfy me at the time...
Theories of physics would be instantiated as software libraries. The most modern physics library would pass the vast majority of the tests, by correctly predicting the outcome Y for every configuration X. But since physics is not yet complete, there would presumably be some tests that the best current theories cannot pass, or can only pass by jumping through some really ugly hoops - such as postulating the existence of dark matter and energy.
To be taken seriously when proposing a new theory, a researcher must submit an update to the current library and show that the new theory passes all the tests. The researcher can also identify new X/Y combinations, not covered in the current test suite, where the new theory diverges from the old one. Experimentalists could then decide the question by running the corresponding experiments and adding the results to the main suite.
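A toy sketch of the scheme above (everything here is hypothetical: the "theory library" is a one-liner and the "experiments" are invented numbers, just to show the shape of the idea):

```python
# A "theory" is a library exposing predict(config) -> predicted outcome.
def newtonian_gravity(config):
    """Toy theory library: free-fall distance d = g * t^2 / 2."""
    return 0.5 * config["g"] * config["t"] ** 2

# The shared experimental record: (configuration X, measured Y, tolerance).
EXPERIMENTS = [
    ({"g": 9.81, "t": 1.0}, 4.905, 0.01),
    ({"g": 9.81, "t": 2.0}, 19.62, 0.01),
]

def run_suite(theory, experiments):
    """Return the (X, Y) pairs the theory fails to predict."""
    return [(x, y) for x, y, tol in experiments
            if abs(theory(x) - y) > tol]

print(run_suite(newtonian_gravity, EXPERIMENTS))   # [] -> theory passes
```

A competing theory would be a drop-in replacement for `newtonian_gravity`; a new experiment is just another tuple appended to the suite.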
An issue with a theory replacing QM, for example (beyond the high degree of precision needed), is Bell's Theorem, which tends to rule out local hidden variable (LHV) theories. https://en.m.wikipedia.org/wiki/Bell%27s_theorem
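The standard way to state that is the CHSH inequality: any LHV theory obeys |S| <= 2, while QM predicts up to 2*sqrt(2). A short check using the textbook singlet correlation E(a, b) = -cos(a - b) at the angle choices that maximize the violation:

```python
import numpy as np

def E(a, b):
    """QM correlation between spin measurements at angles a and b
    on a singlet pair: E(a, b) = -cos(a - b)."""
    return -np.cos(a - b)

# S = E(a,b) - E(a,b') + E(a',b) + E(a',b'); LHV theories force |S| <= 2.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # ~2.828 = 2*sqrt(2), exceeding the LHV bound of 2
```

Since experiments observe the quantum value, any replacement theory has to either give up locality or give up the hidden-variable picture.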