Troubled Times for Alternatives to Einstein’s Theory of Gravity (quantamagazine.org)
158 points by allenleein on May 5, 2018 | 63 comments



I still think the answer will have to confront the problem of local realism. It was the issue that drove Newton nuts (that gravitation was, in Einstein's language "spooky action at a distance"), and Einstein resolved that through Relativity, which redefined how we think about "distance". Now the lack of local realism in QM is driving Einstein's ghost nuts. I think the resolution of all this will again redefine "distance" to take resolutions involving quantum entanglement into account. Vaguely I picture some quantum relativity theory where from the perspective of an entangled particle it's still d=0 from its entangled partner and everything else is splitting apart. So like relativity, it's all about which perspective you take. Then perhaps with these relative distances, and however the math falls out with some kind of quantum Lorentz transforms maybe we see that in galaxies with lots of entangled particles this can slow down rotation.

That said, dark matter is entirely possible too. There's no particular reason to think that all particles need to have an EM effect as well. In fact it's somewhat crazy to think they should. Why wouldn't there be particles that only show themselves gravitationally? Or through other perhaps insanely strong forces that for whatever evolutionary reason Earth Life is just not evolved to care about or see even indirectly.

That said, I have no training in physics and this is an entirely crackpot concept. (Of http://www.goodtheorist.science/ I've only read ~6.)


> I think the resolution of all this will again redefine "distance" to take resolutions involving quantum entanglement into account. Vaguely I picture some quantum relativity theory where from the perspective of an entangled particle it's still d=0 from its entangled partner and everything else is splitting apart.

The problem with this idea is that entanglement is everywhere.

The electrons inside a molecule can't be described individually; you must use a superposition of antisymmetric combinations of them, because a single combination is not enough. It's usually not called entanglement, but mathematically it's the same phenomenon.

Every time two particles collide they get entangled. If they somehow collide again, or they split and the new particles collide, you can see the entanglement as interference. If they go far away, it's difficult to measure the correlation, but it's still there. Again, it's usually not called entanglement, but mathematically it's the same phenomenon.

You have probably only read about cases where it's easy to characterize and isolate the superposition state of two particles that are far away. It's an interesting case because you can make some measurements, see that the classical description is wrong, and prove that you must use the quantum description of the particles.

But if you make any two particles collide, they will be entangled, just in a form that's not easy to guess, so they are not nice for experiments. Or if some of them collide with a third particle, now you must consider the joint state of the three particles, which is more difficult (there are some nice theoretical entanglement results with three particles, but I think nobody has measured them yet).

If you see only two of the three entangled particles, the missing information makes the result less clear, because you always must repeat the experiment many times and now you must take a weighted average over the possible states of the third particle you are not seeing. So if you are lucky you will see a smaller quantum effect, but if the state of the third particle is important enough you will see an average that is essentially equal to the classical result.
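To make that "weighted average over the unseen particle" concrete, here is a rough numpy sketch (a toy example of my own, with made-up names): take a three-qubit GHZ state, trace out the qubit you can't see, and the pair that remains is just a 50/50 classical mixture, with none of the off-diagonal interference terms a Bell pair would have.

    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    def kron(*vs):
        out = vs[0]
        for v in vs[1:]:
            out = np.kron(out, v)
        return out

    # |GHZ> = (|000> + |111>) / sqrt(2)
    ghz = (kron(ket0, ket0, ket0) + kron(ket1, ket1, ket1)) / np.sqrt(2)
    rho = np.outer(ghz, ghz)                      # full 8x8 density matrix

    # "Weighted average" over the third qubit = partial trace over it.
    rho6 = rho.reshape(2, 2, 2, 2, 2, 2)          # axes (a, b, c, a', b', c')
    rho12 = np.einsum('abcdec->abde', rho6).reshape(4, 4)

    print(np.round(rho12, 3))
    # diag(0.5, 0, 0, 0.5): a classical mixture of |00> and |11>,
    # with no off-diagonal (interference) terms left.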

And if they collide with more particles, they all end up in a superposition state that is equivalent to an entangled state, but it's very difficult or impossible to control it in an experiment. So when you see some part of them they look like a usual classical system.

So after a few collisions all the particles are somewhat entangled. It's uncontrollable, and mathematically there is no sharp criterion to distinguish the controllable states that are nice for entanglement experiments from the uncontrollable states where decoherence makes them look like classical systems, and everything in between. So if you say that two entangled particles have a distance d=0, then all the particles in the universe have d=0.


This is somewhat my fear too. But, I can't say it would put me off entirely. Special Relativity could easily be dismissed with a few simple contradictions until you look at it closer and realize there's a reconciliation. QM itself is also fairly hairy beyond very specific ideal cases, yet it's the most successful scientific theory in history.

Maybe you could start off with some idealizations (a la particle in a 1D box) and perhaps some interesting limits arise. And maybe those limits could be extended statistically. Or not.


I have heard some physicist friends of mine speculate that perhaps space and time themselves are just statistical ensembles of entanglement. At the macro scale, we can measure distances and read clocks, but at the micro scale it’s like trying to measure the temperature of a single electron. Perhaps the definition of “close by” is just two states that are highly entangled. Since entanglement is contagious, this leads to clustering. It’s not clear why this would lead to 4 dimensional space time, and I’m not a physicist, but I thought it was an interesting idea.


Your idea reminds me of Wheeler's "single electron universe" [1]. I think you're on to something but I also think the next big step will involve a formal (re)definition of time as well as distance.

[1] https://en.wikipedia.org/wiki/One-electron_universe


Indeed, we seem to treat time like a passive component, a constantly ticking clock, where I wonder if time is distance, and it's time that's expanding our apparent distance. We measure distance in terms of time, or in terms of energy required to cross a given distance in a given time, and I can't shake the idea that energy required is exactly time, and distance itself is the meaningless measurement. It's energy required in 3D that's expanding. That doesn't mean space is expanding, only that energy required in 3D is expanding. And, obviously, I'm not a physicist. If I ever get a second chance at school, maybe physics.


Well, if someone ever bothers to go through all the math this thing would entail, I think it would eventually lead to some physically testable experiment. Whereas Wheeler's thing seems more pie in the sky.


When you think about particles as excitations of a field, the indistinguishable nature of said particles stops being such a puzzle.


But the concept of a field itself was troubling to 18-19th century physicists. "So you put this (mass/charge/entanglement/whatever) here and automagically there's this whole force that extends across the universe in zero time? How? Why?"

Philosophically I don't know if this question has ever been answered satisfactorily. But over the last 100 years fields have become so mathematically convenient that they're too hard not to use.


Well, those 18th-19th century physicists didn't have a concept of entanglement or FTL; they were struggling with the notion that light, gravity, and other forces could propagate without a medium. They saw that waves could be transmitted through water, sound through air, and the idea that light or gravity could be transmitted through vacuum was baffling. Their resolution to the issue was to propose the existence of a substance which was invisible, intangible, but which permeated everything: the luminiferous ether. Light, in their understanding, wasn't really moving through a vacuum, but through this ever-present "stuff" instead. That started to fall out of favor long after the 18th century, and was finally conclusively shot down by Michelson and Morley with their interferometry.

The problem Einstein and others grappled with was different, and concerned the fact that according to relativity nothing should exceed c. Entangled particles seemed to violate that principle, because the collapse to a given pair of states was instantaneous. The modern resolution to that is more practical than philosophical, and points out that information isn't transmitted faster than light between entangled particles. Information propagates at or below c, even though at some scale entangled particles are statistically correlated. Relativity is only violated, and causality only threatened, if information is transmitted FTL, but entanglement doesn't allow for that. You need a classical channel to make entanglement work for communication, and that classical channel is going to be no faster than light. Without a classical channel, entanglement just gives you random noise.
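If it helps, here is a small numpy sketch of that no-signaling point (my own toy calculation, nothing official): whatever measurement angle Alice picks on her half of a Bell pair, Bob's local statistics come out as the same 50/50 coin flip, so her choice by itself carries no information to him.

    import numpy as np

    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
    rho = np.outer(bell, bell)
    I2 = np.eye(2)

    def alice_projector(theta, outcome):
        """Projector for Alice's +/- outcome when measuring at angle theta."""
        v = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        if outcome == '-':
            v = np.array([-np.sin(theta / 2), np.cos(theta / 2)])
        return np.outer(v, v)

    def trace_out_alice(rho4):
        """Bob's reduced state: trace over Alice's (first) qubit."""
        r = rho4.reshape(2, 2, 2, 2)
        return np.einsum('abac->bc', r)

    for theta in [0.0, 0.7, np.pi / 3]:
        post = np.zeros((4, 4))
        for outcome in '+-':
            P = np.kron(alice_projector(theta, outcome), I2)
            post += P @ rho @ P                   # sum over Alice's outcomes
        print(theta, np.round(trace_out_alice(post), 3))
    # Bob's state is 0.5 * I every time, independent of Alice's angle.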


> Relativity is only violated, and causality only threatened, if information is transmitted FTL, but entanglement doesn't allow for that. You need a classical channel to make entanglement work for communication, and that classical channel is going to be no faster than light. Without a classical channel, entanglement just gives you random noise.

But none of that answers the 18th-19th century physicists' question about how fields extend across the universe, or how light propagates in empty space. What the hell is a field anyway?

Also, it doesn't explain how the expansion of space happens, what made inflation happen, and what dark energy is.


A very smart guy named Alfred Korzybski said, “The map is not the territory.” It’s important to not get too hung up on the epistemological implications of our models of reality, because they are just models. On a practical level, a field is something which has a value at every point; that value can be a scalar, vector, or tensor depending on the nature of the field. For some good discussion I think this could help: https://physics.stackexchange.com/questions/13157/what-is-a-...
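For what it's worth, that practical definition ("a value at every point") is short enough to write down as code. A toy Python sketch, with made-up example fields:

    import numpy as np

    def temperature(x, y, z):
        """A scalar field: one number at every point."""
        return 300.0 - 0.01 * z                  # e.g. cooling with altitude

    def point_charge_E(x, y, z, q=1.0):
        """A vector field: one 3-vector at every point (Coulomb field, toy units)."""
        r = np.array([x, y, z], dtype=float)
        return q * r / np.linalg.norm(r) ** 3

    print(temperature(0.0, 0.0, 100.0))          # a scalar at a point
    print(point_charge_E(1.0, 0.0, 0.0))         # a vector at a point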

Now, it is very easy to get hung up on this "Is it real or not" thing, and most people do for at least a while, but please just put it aside. When you peer really hard into the depth of the theory, it turns out that it is hard to say for sure that stuff is "stuff". It is tempting to suggest that having a non-zero value of mass defines "stuffness", but then how do you deal with the photo-electric effect (which makes a pretty good argument that light comes in packets that have enough "stuffness" to bounce electrons around)? All the properties that you associate with stuff are actually explainable in terms of electro-magnetic fields and mass (which in GR is described by a component of a tensor field!). And round and round we go.

Light doesn’t propagate in empty space, because space isn’t empty, it’s permeated with quantum fields in constant flux. What light doesn’t require is a medium, and a field is not a medium. The problem here is that these theories are really described in terms of a bunch of math, and when we use words to describe those theories, something is lost in translation. There are no words that fully capture what a field is in QFT, you must use math.

Once you adopt that perspective, inflation would just be the result of another field with particular properties (esp. a given range of potential energy curves). Of course, inflation is an untested theory, and there are competing ones which try to explain the apparent homogeneity and isotropy of the universe. Inflation is probably the most popular, but it doesn’t rest on the kind of firm foundation that QED does.


>It was the issue that drove Newton nuts (that gravitation was, in Einstein's language "spooky action at a distance")

Was Newton ever much concerned about this? He was into alchemy and magic, even spookier actions at a distance. Plus, by all they knew at the time, why would "action at a distance" feel strange?


A mechanical universe was the prevailing theory at the time, because nothing else made sense. "Mechanical" meaning that for one thing to cause another, it had to be in physical contact (like the teeth of a cog).

That theory has since been abandoned, and everyone now accepts action at a distance - which still doesn't make any sense.

Noam Chomsky - "The machine, the ghost, and the limits of understanding", 1½ hours https://youtube.com/watch?v=D5in5EdjhD0 [2012]


We don't actually buy in to action at a distance anymore, not really. In Einstein's general relativity, the "fabric of spacetime" is an active participant in the dynamics of the universe, and for one object to affect another by gravity its influence has to ripple through spacetime to reach its target. So it's very thoroughly a "local" theory.

And for the rest of the interactions, our best models are described by Quantum Field Theory, and QFT is also emphatically "local" in this sense. Interactions between any two particles are mediated by the exchange of "virtual particles" that carry the force: photons carry electromagnetic interactions, gluons carry the strong nuclear interaction, and W and Z bosons carry the weak nuclear interaction. The detailed rules are pretty weird, but every interaction in QFT can be framed as "interactions happen when and only when particles touch each other".

(I haven't watched your Chomsky video. But it's a little odd to look to a linguist for an explanation of modern physics.)


I don't know much about this, so how does the "fabric of spacetime" compare to "field" explanations of action-at-a-distance? e.g. a magnetic field accurately predicts observations, but doesn't actually explain them.

Is the fabric in any way more real than a field?

(I didn't look to Chomsky for this; that video is just where I happened to first hear this "mechanical" idea explained, and it's still the only place. A better citation would be the sources he cites in the video.

My main point was not about modern physics, but the history of science i.e. "mechanical".

And, for history of science, an elder respected academic is probably best to look to, anyway. It's hard to think of a specialist in history of science, because by nature it is not a hot, progressive field full of exciting new discoveries, which is what scientists usually get known for.

BTW Do have a listen to the start of the lecture, it really is quite interesting.)


Very belated reply:

I don't consider a magnetic field to be an "action at a distance" explanation at all, so I'm not sure how to answer your first question. In classical E&M, a magnetic field is a local property of space at every point. Moving electric charges affect and are affected by the magnetic field at their location in a specific way, and disturbances or changes in that field propagate outward in accordance with a local wave equation. Nothing is ever "action at a distance", because the field itself carries influences and signals from point to point. (I suppose it could look like "action at a distance" if you pretended that the field was just a made-up theoretical construct that wasn't really there. But that's very much contrary to the accepted perspective on fields at least as I understand it. Things like "fields carrying energy density" and "finite-speed propagation of field changes through empty space" seem to argue strongly for taking fields seriously.)
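If a picture helps, here is a toy 1D wave-equation sketch (my own illustration, in Python) of that locality: each point is updated only from its nearest neighbours, so a disturbance spreads out at a finite speed, and distant points stay exactly zero until it reaches them.

    import numpy as np

    n, steps = 200, 60
    c, dx, dt = 1.0, 1.0, 0.5                    # CFL condition: c*dt/dx <= 1
    u_prev = np.zeros(n)
    u = np.zeros(n)
    u[n // 2] = 1.0                              # a local "kick" in the middle

    for _ in range(steps):
        lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)   # nearest neighbours only
        u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
        u_prev, u = u, u_next

    nz = np.nonzero(u != 0.0)[0]
    print(nz.min(), nz.max())
    # Only cells within `steps` sites of the initial kick can be nonzero;
    # everything farther out is still exactly zero -- the influence had to
    # propagate there point by point, never "at a distance".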

So I guess I'd say that the "fabric of spacetime" explanation is on roughly the same footing, though my own intuition feels like the spacetime variant is a bit more concrete, somehow. (I suspect a fair number of people would lean the other way, mind you.) Spacetime is very clearly here and real, and Einstein taught us that its structure is definitely dynamical. Adding a magnetic field is, mathematically, basically arbitrarily stapling a gauge bundle onto that spacetime manifold: it's one step more abstract and arbitrary.


We still very much view the universe as mechanical. Gravity, for instance, is still seen as the bending of spacetime. The typical analogy being a bowling ball on a trampoline. The weighty bowling ball bends the fabric of spacetime causing 'attraction' to anything nearby.

Quantum mechanics is in enough flux that I certainly would not say it itself is justification for such grand claims.



This has very little to do with the modern issue of action at a distance vis a vis entanglement, and everything to do with the prevailing belief in an aether during Newton’s time. Einstein meanwhile, was actually commenting on entanglement physics.


But Einstein's objection was almost identical to Newton's.

Newton wrote, "It is inconceivable that inanimate Matter should, without the Mediation of something else, which is not material, operate upon, and affect other matter without mutual Contact."

Einstein wrote, "I cannot seriously believe in it because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky actions at a distance."

Newton was talking about gravity, Einstein about entanglement. But the objection is essentially the same one. That's the reason that Newton's quote above is included in an article on the principle of locality.


Einstein was talking about violations of his theory, namely the velocity limit it imposes; Newton was concerned with a medium. They are utterly different. Maybe they sound similar without any knowledge of the underlying theories, or if you take the quotes out of context, but it's a very superficial similarity.


Yeah I'm not trying to imply they're the same physical cause, just the same philosophical perception of ickiness. How to phrase better? Newton had issues with aether messing philosophically with local realism. Einstein fixed that through SR and then GR. But then QM introduced new problems with local realism, and maybe the next physics revolution is reconciling that problem. So, not the same thing, but perhaps two different sides of the same million-sided armchair-philosophical dice.


I agree. I think people miss your point that, within their respective contexts, Newton and Einstein were each rejecting a form of an acausal theory. However, there is a significant philosophic difference between Newton's era and Einstein's. Newton was honest in his writing, essentially admitting defeat in explaining how gravity could work instantaneously across space while noting that the math nevertheless works. In contrast, today the physicists (and philosophers) claim not only that the math works but that it proves reality is fundamentally acausal, or probabilistic, etc., which are various forms of denying identity.

In my view, the crude circularity at the root of physics (which leads to multiple conundrums) is the assumption that the "speed" of light can be measured when measurement presupposes light (or its E/M equivalents).[1] Light is special in this regard.

[1] https://fqxi.org/community/forum/topic/3104


> not only does the math work but it proves that reality is fundamentally acausal, or probabilistic, etc. which are various forms of denying identity.

Proves, as in all deterministic interpretations of QM are impossible? What about MWI? Also, the wavefunction itself is deterministic.

Maybe reality isn't fundamentally particle-like, but field-like.


> Maybe reality isn't fundamentally particle-like, but field-like.

But then we get into metaphysical and epistemological questions which are obviously unavoidable once you start making claims about what is or is not "fundamental". This is because "fundamental" has to include both the nature of reality and our means of perceiving it. It is common today to project how we perceive onto what we perceive but they must be distinct in any valid theory.

Claiming that reality itself is probabilistic is one example of this projection: though it may very well be a fundamental limit of our means of perception, it is not a property of what we perceive.


With a true 'd = 0' for entangled particles you could have faster-than-light communication, no? But entanglement, as far as we know now, doesn't allow FTL communication.


Kinda the opposite. If you redefine what "distance" means, that's explicitly how you'd avoid FTL communication and eliminate the EPR paradox. Again though, I really am not an expert, or even more than armchair knowledgeable, on the subject.


On their own, entangled particles do not allow information to be communicated.


“Information to be communicated”

or

“information to be communicated faster than light”

?


neither


Entangled particles just share the same ref, distance is a function of lookup complexity.

This makes sense right? right?!?


> that gravitation was, in Einstein's language "spooky action at a distance"

You are mistaking gravitation for quantum entanglement which Einstein used to describe as spooky action at a distance.


No, that was intentional. They were both concerned with the lack of local realism delineated in the physics of their time, each of which occurred for different reasons (and the latter of which we still live with). Einstein just had a more memorable phrase for it. I can see how that was misleading though.


The way you've written it is very misleading.

> that gravitation was, in Einstein's language "spooky action at a distance"

This definitely suggests you're claiming that Einstein was talking about gravity, regardless of the literal interpretation.


:%s/"in Einstein's language"/"as Einstein may have phrased it"/g


The problem is that locality, realism and “spukhafte Fernwirkung” are all issues of quantum mechanics and not relativistic physics. You weren’t misleading, you were mistaken.


But in the 1800s they were issues of Newton's theory of gravitation.


This is indeed a very interesting idea. http://blog.stephenwolfram.com/2015/12/what-is-spacetime-rea... was describing something similar several years ago.


My money is still on additions/modifications to relativity in the next 100 years. (Sorry for the sources, just using Google to pull up something.)

- Hubble's constant still hasn't been pinned down, and different ways of measuring it give conflicting values. https://www.space.com/38496-hunt-for-hubble-constant-standar...

- Like the fallacious search for Vulcan, there are modern problems with the orbits of some of the outer planets. https://solarsystem.nasa.gov/planets/hypothetical-planet-x/i...

- A galaxy with 99 percent dark matter and one almost without any. https://www.quantamagazine.org/a-victory-for-dark-matter-in-... https://www.theverge.com/2016/8/25/12647540/dragonfly-galaxy...

- The bullet cluster, the speed of rotation in galactic arms, etc.

- The apparent changing rate of expansion in the universe.

- We have no idea of the physics behind red shift "caused by the metric expansion of space".

- We still don't know what gravity actually is.

The 1998 discovery of the accelerating expansion of the universe was beautifully simple and clean. It proved a huge swath of experts dead wrong. https://en.wikipedia.org/wiki/2011_Nobel_Prize_in_Physics

Dark matter/energy could be a part of the mix, but in light of these problems I would not bet it's the end of the story.


Your second point - I don't think the potential Planet Nine has anything to do with "problems with the orbits of some of the outer planets". It has been proposed as a solution to the orbits of some distant smaller Kuiper Belt objects. This in turn doesn't really have anything to do with relativity, just basic orbital mechanics.


I could have sworn there were problems with Pluto's orbit where it was faster/slower than it should have been, but it looks like I'm wrong.

>Specifically, this paper shows that, contrary to recent assertions in the literature, the current ephemeris for Pluto does not preclude the existence of the Pioneer effect. We show that the orbit of Pluto is currently not well enough characterized to make such an assertion.

https://arxiv.org/pdf/0905.0030.pdf


The Pioneer Effect has been explained as "anisotropic radiation pressure caused by the spacecraft's heat loss".

https://en.wikipedia.org/wiki/Pioneer_anomaly


That's an interesting case because they "solved" the problem by basically increasing the estimated uncertainty. See the dual sets of error bars in Fig 3: https://arxiv.org/abs/1204.2507


> "A galaxy with 99 percent dark matter and one almost without any. https://www.quantamagazine.org/a-victory-for-dark-matter-in-.... https://www.theverge.com/2016/8/25/12647540/dragonfly-galaxy...

Interesting that both of these surprising (in the dark matter framework) observations are consistent with MOND:

https://arxiv.org/abs/1804.04167

https://arxiv.org/abs/1610.06189


> We have no idea of the physics behind red shift "caused by the metric expansion of space".

What do you mean we have no idea? Redshift is one of the few simple concepts in Astrophysics, whether it's doppler, cosmological or gravitational.


I could have better said, "We don't understand the physical causes for the metric expansion of space". It's the physical reasons for the expansion we don't understand.


> It's the physical reasons for the expansion we don't understand.

Sure we do; it's the Einstein Field Equation and the initial conditions at the Big Bang.

What we don't fully understand is how the initial conditions at the Big Bang came about. The current front-runner hypothesis is some form of inflationary model (in which the Big Bang is really the end of inflation), but the question is still open.


The bullet cluster is famously an object that is very hard or impossible to explain without some dark-matter-like substance. We observe that the mass and the gas are offset from each other after the collision.


> Theorists have dozens of alternative gravity theories that could potentially explain dark matter and dark energy, Freire said. Some of these theories can’t make testable predictions, Archibald said, and many “have a parameter, a ‘knob’ you can turn to make them pass any test you like,” she said. But at some point, said Nicolas Yunes, a physicist at Montana State University, “this gets silly and Occam’s razor wins.”

That's very kind. Some would say that theories that can't make testable predictions, or are highly tunable, just aren't science.


Right now that includes Dark Matter...


Well, it's always seemed ad hoc to me.

But then, IANAP so hey.


Not really. Lambda CDM works very well in lots of separate situations with the same parameters and has predicted observations correctly.


The article shows two demonstrations: 1) a spiral galaxy and 2) a triple system (a neutron star and a white dwarf orbit each other, with another white dwarf orbiting the pair at a distance).

I think the point of the spiral galaxy is that it still has its arms attached. After this amount of time they should not be there, so either the assumed amount of time is wrong or something is modifying the speed (like dark energy, modified Newtonian dynamics, etc.).

However, I don't understand what the "fall" in the triple system is about. What are they measuring that agrees with Newtonian Physics?


Are we reading the same FA? The one I read was about a neutron star collision/merger, and a triple star system with a pulsar, and also a double-pulsar system. The word 'spiral' does not appear.


To anyone reading the comments, don't bother.


This article discusses LIGO's detection of gravitational waves as a problem for alternate theories of gravity. But the claim of that discovery seems to have been withdrawn by the LIGO team. Here: https://www.nature.com/news/gravitational-waves-discovery-no...

I'm an amateur but something feels amiss to me here. Can anyone shed more light on this contradiction?


That article is about a retraction from the research team at BICEP2, a telescope at the South Pole. It has nothing to do with the Advanced LIGO detector, which has detected gravitational waves from black hole collisions several times now.


LIGO's comments[1] about the 6 gravitational wave detections they are currently confirming (and related publications):

[1] https://www.ligo.org/detections.php

After following the major science experiments over the last ~15 years, I've gotten the impression that the LIGO team is being very careful to not announce anything prematurely.

In an interview[2] with Prof. Rana Adhikari about the first detection, he describes how incredibly lucky it was to detect a very strong wave in the middle of the detector's sensitivity range, only ~30 minutes after each site independently enabled data collection. That's the type of situation that raises serious questions about the validity of the data and might even suggest sabotage (someone could be faking data). Fortunately, it sounds like they were careful to investigate the alternative explanations, including searching the labs for anything that looked like a possible sabotage device. It's always good to see scientists paying extra attention to rigor.

[2] https://www.youtube.com/watch?v=ViMnGgn87dg


Wow, I just read about the latest gravitational wave and I see that once again they detected it by deviating from the procedure used to generate the background model:

>"Although such times are in general not included in searches, it was determined that LHO strain data were una ffected by the procedure at frequencies above 30 Hz, and may thus be used to identify a GW source and measure its properties." https://www.ligo.org/detections/GW170608/paper/GW170608_subm...

I'm not saying they aren't seeing anything, but the false alarm rate calculations are meaningless unless they process the data the exact same way for background. They keep making these special exceptions every time.
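To spell out what I mean by "meaningless" (a toy picture only, not LIGO's actual pipeline): a false alarm rate is basically the fraction of noise-only triggers, pushed through the same processing, that come out at least as loud as the candidate. If the candidate's number comes from slightly different processing, comparing it against the old background quietly changes the answer:

    import numpy as np
    rng = np.random.default_rng(0)

    # Noise-only "trigger" statistics, i.e. the background distribution.
    background = np.abs(rng.standard_normal(1_000_000))

    candidate = 4.0                               # statistic of the real event
    far_same = np.mean(background >= candidate)   # fraction of noise this loud
    print(far_same)

    # Same event, but imagine its statistic came out of a tweaked pipeline
    # (modelled here as a 10% rescaling). The background above no longer
    # describes that pipeline, so the quoted false alarm rate is off:
    print(np.mean(background >= candidate * 1.1))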

EDIT:

Mostly just saving this for myself...

Here it is for the 5th detection:

>"Single-detector gravitational-wave triggers had never been disseminated before in low latency. Given the temporal coincidence with the Fermi-GBM GRB, however, a GCN Circular was issued at 13:21:42 UTC (LIGO Scientific Collaboration & Virgo Collaboration et al. 2017a) reporting that a highly significant candidate event consistent with a BNS coalescence was associated with the time of the GRB."

http://iopscience.iop.org/article/10.3847/2041-8213/aa91c9

4th detection:

This one looks clean (but for the 1st detection the questionable info was not included in the main paper).

https://www.ligo.org/detections/GW170814/paper/GW170814-PRLp...

2nd detection:

>"Since GW150914 had already been confirmed as a real gravitational-wave signal [4], it was removed from the data when estimating the noise background."

http://link.aps.org/doi/10.1103/PhysRevLett.116.241103

Earlier discussion about 1st and 3rd detections:

It's claimed that "significance estimates come solely from the offline analysis", which seemed to satisfy me at the time... https://news.ycombinator.com/item?id=14462719


I thought they had confirmed LIGO's detection of gravitational waves coming from a certain pulsar that Fermi saw independently. The article you link to seems to be about gravitational waves from the background. I'm also curious now.


I've been thinking recently that physics should import an idea from software engineering: the test harness. For physics, the test harness would include a large database of X/Y pairs where X represents an experimental configuration and Y represents the outcome.

Theories of physics would be instantiated as software libraries. The most modern physics library would pass the vast majority of the tests, by correctly predicting the outcome Y for every configuration X. But since physics is not yet complete, there would presumably be some tests that the best current theories cannot pass, or can only pass by jumping through some really ugly hoops - such as postulating the existence of dark matter and energy.

To be taken seriously when proposing a new theory, a researcher must submit an update to the current library and show that the new theory passes all the tests. The researcher can also identify new X/Y combinations, not covered in the current test suite, where the new theory diverges from the old one. Experimentalists could then decide the question by running the corresponding experiments and adding the results to the main suite.
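A minimal sketch of what such a harness might look like (the Experiment/Theory interface and the toy free-fall "theory" below are entirely made up for illustration; a real suite would need far richer experiment descriptions and error models):

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Experiment:
        config: Dict[str, float]      # the "X": how the experiment is set up
        observed: float               # the "Y": what was measured
        tolerance: float              # allowed |prediction - observation|

    Theory = Callable[[Dict[str, float]], float]   # maps a config to a prediction

    def run_suite(theory: Theory, suite: List[Experiment]) -> List[Experiment]:
        """Return the experiments the theory fails to predict within tolerance."""
        return [e for e in suite
                if abs(theory(e.config) - e.observed) > e.tolerance]

    # Toy example: Newtonian free fall near Earth's surface as the "theory".
    def newtonian_fall(config):
        g, t = 9.81, config["time_s"]
        return 0.5 * g * t ** 2                    # predicted drop distance (m)

    suite = [
        Experiment({"time_s": 1.0}, observed=4.9, tolerance=0.1),
        Experiment({"time_s": 2.0}, observed=19.6, tolerance=0.2),
    ]
    print(run_suite(newtonian_fall, suite))        # [] -> every test passes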


This is already essentially how it works. A new theory has to be compatible with the old ones. This is often said in terms of "matching the predictions of..." another theory. There are more standards of course, but that's the baseline. A lot of pet theories people have for things like QM fall apart because they don't match the tested predictions of QM. For example, Special and General Relativity make new and more accurate predictions which complement, but do not contradict, Newtonian theory.

An issue with any theory hoping to replace QM, for example, other than the high degree of precision needed, is Bell's Theorem, which tends to rule out local hidden variable (LHV) theories. https://en.m.wikipedia.org/wiki/Bell%27s_theorem
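For anyone curious, the CHSH form of Bell's theorem is short enough to check numerically. A toy numpy sketch (my own, using the standard measurement angles): the quantum value comes out around 2*sqrt(2) ≈ 2.83, while any local-hidden-variable model is capped at 2.

    import numpy as np

    def spin(theta):
        """Spin measurement along angle theta in the x-z plane (+1/-1 outcomes)."""
        return np.array([[np.cos(theta),  np.sin(theta)],
                         [np.sin(theta), -np.cos(theta)]])

    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)

    def corr(a, b):
        """Expectation value <A(a) x B(b)> in the Bell state."""
        return bell @ np.kron(spin(a), spin(b)) @ bell

    a1, a2 = 0.0, np.pi / 2
    b1, b2 = np.pi / 4, -np.pi / 4
    S = corr(a1, b1) + corr(a1, b2) + corr(a2, b1) - corr(a2, b2)
    print(S)    # ~2.828, above the classical CHSH bound of 2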




