We have been looking really hard for answers to persistent questions. While we have not found affirmative answers, physicists have systematically ruled out option after option after option. We have not yet discovered a unified theory of everything, but we know a whole lot more about what that theory is not.
Furthermore, the past 40 years have seen the emergence of precision cosmology (and the dark-matter/energy paradigm that it entails), the observation and confirmation of neutrino oscillation, the detection of gravitational waves (and the nuclear physics revolution that has begun with GW170817), SN1987A, and so much more.
The coming decades are poised to teach us so much more, a lot of it from the stars. GAIA, LISA, upgraded terrestrial GW detectors, LSST/Rubin, TMT, SKA, and more are all poised to tell us much more about things we don't understand. Particle physics will move forward too, though it is uncertain how quickly. The right breakthrough in wakefield accelerators, though, could be transformative.
Thirty spokes share the wheel's hub;
It is the center hole that makes it useful.
Shape clay into a vessel;
It is the space within that makes it useful.
Cut doors and windows for a room;
It is the holes which make it useful.
Therefore profit comes from what is there;
Usefulness from what is not there.
Tao Te Ching - Lao Tzu - chapter 11
I think we're getting a peek at the politics of the practice of science when you look at the article and comments, like this one's parent.
A bunch of this is about the unobserved. The theoretical rather than the experimentally observed (think scientific method).
Whose theories get the funding to be looked into? Whose ideas are published and talked about in the popular places?
A post of Sabine's talked about how she thinks the crisis in physics isn't about physics.
Is it instead about the politics of the money and popularity? Is it about the psychology of being wrong? I mean, different physicists contradict each other and the truth isn't what's popular it is what's observed, right? If people contradict each other they can't all be right.
What's most interesting, to me, is how this isn't about what's been observed but all those human system qualities that people bring to the table.
But instead, the way it works is a consensus is built. Researcher Bob - "We see that A=Foo." Researcher Sally -"Well, I see that A=Bar." Researcher Timmy - "Well, I see that A=Far." Researcher Kimmy - "Well, I see it as A=Fbar." Community over time - "Now the community has seen that A=Fbar is consistently correct and can be relied upon and used." 20 years pass... Researcher Jeff - "Well, I see that Ab=Fbar actually." Community - "That's bullshit." Researcher Betty - "Well, actually I see that too. But I see Ab=FbarC". etc etc
Since we don't know what we don't know, it isn't really a "We're done!" situation for someone to get 100% correct, it's a process of evolution in knowledge.
I think part of this is that scientists don't "see" as in observe these things. Instead they "think that" something is the case. And, they have math and ideas to back that up.
If we had repeatable observation of the things it would be much harder to make disagreeing arguments.
If you see a shimmer in a desert, does that mean there's an oasis?
A good book (or two) for this is Isabelle Stengers's Cosmopolitics I and II.
She goes into a lot of physics debates from the 20th century, along with a lot of the surrounding political debates.
To add to your point, scientists have been chasing inconsistencies exactly like Sabine said. They’re chasing dark matter, dark energy, quantum gravity, early universe cosmology, naturalness (arguable; more of a theoretical inconsistency), and quantum foundations (smaller effort).
So I don’t see the point of vague allusions to the philosophy of science. If anyone has a concrete proposal---inspired from philosophy or whatever else, doesn’t matter---the proposal will typically be treated on its merits. Modulo caveats about humans being humans and all that.
Experimentally, people continue to hammer away at the Standard Model (and gravity, my specialty), and the paradigm continues to hold. With few exceptions, most ideas from 40 years ago have been put to significant tests and turned out not to be how Nature operates.
From a theory standpoint, without new guidance from us, theorists are forced to attempt to out-think Nature, which is extremely hard to do. Smolin may be bummed, but theorists can take solace in the fact that the problem is extremely difficult.
In the particular case of quantum mechanics, the day that theorists divine a compelling way that one interpretation of QM makes a prediction that differs from that of another interpretation is the day that an experimentalist starts building a test to find out which one is true.
In a way it is a meta-science, because you need to find a methodology to evaluate which theory, when proven or disproven, has the most impact.
I would imagine that the current working method would be to look at how often theoretical physicists cite a specific theoretical result/paper and hence "popularity" is the prioritization mechanism.
If a large cadre of highly educated people find a particular theoretical mechanism sufficiently compelling to dedicate their careers and time to it, then surely that's a good basis on which to develop experiments to try to confirm whether it's right? Even proving it wrong has a benefit: they'll reallocate their resources.
Presumably the economic cost of the experiments is higher at this point than the resource allocation of that cadre of highly educated people.
It's worth thinking about, but I'm not sure there is a viable alternative methodology to be found.
Thirty spokes, there is a hub. When it is free, it is used for cars. I think it is a device, when it is free, it is used for devices. Profit, useless.
Proof of mine: https://i.imgur.com/hb428NI.png
DECEARING EGG comes to mind. https://www.youtube.com/watch?v=3-rfBsWmo0M
Edit: As someone else has now mentioned, it's the difference between leaving the quotes(「 and 」) on the ends or not. So it is just the translate AI being weird as usual.
"Thirty spokes, there is a hub. When it is free, it is used for cars. I think it is a device, when it is free, it is used for devices. Profit, useless. "
I feel this answer is a bit of a self justifying cop-out. A lot of the value of physics from the perspective of society has been generating understanding about the world that actually translates into manipulating the world.
Reaching the moon, superfast internet, gps, microwaves, etc... etc..
You are not wrong of course, but knowledge for the sake of knowledge is not always useful, and also not always worth funding imo.
Because that's how I feel. There are so many things I just don't understand now like:
1. I originally thought the Heisenberg Uncertainty principle was a natural consequence of using particles (photons) for measurement. Instead, however, it seems to be a fundamental property of the universe, which I only learned after finding out most of the mass of hadrons comes from the relativistic motion of quarks, and it explains why hadrons don't collapse to a point.
2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
3. I find the models for dark matter and dark energy to be... unsatisfying. I realize there's experimental evidence for unobservable mass but it feels like a fudge.
4. Of course we still have no quantum model for gravity.
5. I don't really understand what a fundamental force really is. Like why does electromagnetism have a repulsive opposite but gravity doesn't? When I tried to look into this I ended up down some rabbit hole of "gauge forces" and got completely lost. Why is the Higgs Field not a force?
6. Why are some predictions of the Standard Model so incredibly accurate (like the magnetic moment of an electron IIRC?) while others are so incredibly inaccurate (eg IIRC the QFT prediction of vacuum energy is off by 120 orders of magnitude).
7. Why are there exactly three generations of particles (ignoring the Higgs)? What does a generation even mean?
I could go on. I don't for a second mean to suggest any of these notions are wrong. It's just that the models have gotten so complex (it seems?) that it just feels like something huge is missing, something that will eventually seem obvious in hindsight. Or am I just a lemur trying to figure out how an airplane works?
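The uncertainty-principle point in item 1 can be made semi-quantitative with a back-of-the-envelope sketch (all numbers rough and purely illustrative):

```python
# Rough illustration of why hadron mass is mostly relativistic quark motion:
# confining a light quark to a proton-sized region forces, via the
# uncertainty principle, a momentum scale far above the bare quark masses.
hbar_c = 197.3   # MeV*fm, a handy natural-units constant
delta_x = 1.0    # fm, roughly the proton radius

p = hbar_c / delta_x  # ~200 MeV/c minimum momentum scale per quark

# For nearly massless quarks E ~ p*c, so three confined quarks carry
# roughly 3 * 200 = 600 MeV of kinetic/field energy - the bulk of the
# ~938 MeV proton mass - while the bare up/down masses total only ~10 MeV.
print(p)  # 197.3 MeV/c
```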
Unfortunately yes, all of us are.
This is not a problem with physics or abstractions. It's a problem with our intuition. Our intuition is based on evolution and life experience, which is all formed based on mostly solid objects from 1cm to 100m moving at 1m/s to 100m/s.
The universe however does not care what meatbags can experience with our senses. It doesn't mean that our complex math abstractions are necessarily correct - but they are more correct than casual intuition. You can train that intuition with enough work with the maths.
For example, you can map most of basic electricity to water flow and pressure, and some electromagnetic waves to waves in water - but you need to make a small jump to abstraction to combine both, and a large jump to get a gut feeling for special relativity to "feel right".
The crazy part is really that mathematical abstractions exist for all these things at all. There seems to be no natural reason that physics should be describable by small elegant formulas at all, let alone our experience of throwing rocks into a pond. Why isn't particle physics as messy as organic chemistry?
It is not nature that describes itself in math, it's people who describe in math what their current understanding is of how nature works. The more mathematical simplicity in the formula, the better we understand it.
Organic chemistry is harder because our powers to observe it are computationally and experimentally limited at this point.
Human intuition is based on limited sensors, boundaries which we have overcome with science and applied science over time. Our intuition had to be collectively replaced with rigorous mathematical methodology to incorporate these foreign sensors.
Second, modern 20th century physics education (courses, textbooks) suffered sustained corruption of methodology by scientific authorities, where the quest for understanding was renounced in favor of "modelling" and "prediction" (e.g. the authors of orthodox quantum theory and their less bright pupils perpetuating that attitude), and later by an institutionalized system of university research which propels the tweaking and application of old ideas to the detriment of trying new ones, or of questioning past ideas that are too ingrained.
This leads to a large portion of theoretical physics publications being more and more about complex calculations where most applicators do not even try to understand "what is going on", they just assume the same quantum methodology with some tweaks (i.e. different configuration spaces, more dimensions, different Lagrangians, new fields that fix problems of the previous ones, tricks with removing some ugly series terms etc).
Sometimes these tweaks get fancy names (superstrings, loops, dark matter) but they are really an additional concept that needs to be put in to save the edifice from those radicals who would like to try actually new and incompatible ideas.
When you study 20th century physics yourself from original sources, you'll find the stuff taught currently actually has highly varying degrees of credibility. Some stuff is rock solid, such as relativity, molecular theory and chemistry, nuclear physics and solid state theory, and some stuff is ... well, more unfinished and less credible, such as the Standard Model, force unification, quantum gravity, dark matter, etc.
If you want to get some solid ground on which to build intuition, start with the rock-solid physics as known till 1905, then after that makes sense, learn about its problems (explanation of emission spectra, inconsistency of EM theory with Newtonian mechanics), then after that take a deep breath and read original papers on quantum theory and particle/nuclear physics.
This will take years to understand. The later theoretical stuff around Standard Model details (lepton generations, stability of particles, unification of gravity and QFT) is a decades old project that nobody knows how to finish. It is stuck for now, and has little relevance for understanding those previous things.
The folks who just added one more term to the old stuff to make it predict better - let's call them the old guard - necessarily did it to point out that old models can become slightly new ones too. Even if in the long run they will be seen as the evil holdouts.
Yet at the same time we know that just whipping up a new fancy maths model won't solve anything in itself. New models need to make new testable predictions.
And then even new models require lengthy fine tuning, which requires costly experiments.
Alas, textbooks are very often terrible, but not because they emphasize predictions and models over "understanding" - rather because they fail to elaborate on how to select the better of two models, how paradigm shifts happen, how anomalies are ever present - and thus make practical model selection even harder. Plus they regularly fuck up the math explanation part, exactly because they use terrible language and models.
Finally, it's always data that cleans up the mess. Either practical usefulness - engineering, applied science, quantum experiments, qubits, and so on - or, on the high-energy end, cosmology and astronomy.
Anyone harking about how the crisis is about politics usually wants to allocate more money to theorists, so we will finally get breakthrough theories. Yeah, great, we already have a lot of those, but without data we don't know which one to take seriously.
Furthermore, pouring money into theory is a nice idea, and comparatively cheap (compared to a new collider), but that won't solve the very pragmatic employment question for the collider builders. (Who are out of luck anyway, because the era of building bigger underground circles seems to be over. They would gladly build anything, but they won't make good theorists, even if pundits' articles seem to imply there's a simple slider between theory and experimentation.)
Once you get into invisible forces acting across very small distances, nothing about how the world works is 'intuitive'. The most precise description of it is... A bunch of math, and not the kind of math that people learn in their K-12.
As has entropy, and the laws of thermodynamics. Yet even educated people often have no idea what the laws of thermodynamics actually imply! You'd figure that people would take a little bit of an interest in them, given that they live in a society powered by combustion engines...
Hearing casual observers talking about quarks or dark matter just makes me run the other way. I know I don't know.
Consider several droplets of water on the surface of a latex-based birthday balloon. The balloon is inflated, and droplets spread further apart, yet each droplet retains its shape and size. If you measure the distance in droplets it appears that you have created space as more new droplets could now fit in between those placed earlier.
We only have two comprehension tools to help in grasping the universe - math and intuition. The above is intuition, and you know better than me where to find math. There isn't more "meaning" to it than that, just the tools.
Space everywhere is expanding, it's just all the other forces over short distances ensure that everything pulls itself back together. The Big Rip - if dark energy is increasing in strength - is what happens when at some point the rate of expansion exceeds the forces at various scales that are able to hold things together, till even the strong nuclear force can't sustain it.
Just the other day right here on HN was a link posted about dark energy, and how maybe it's just measurement calibration error. Poof, solved. Or not, we shall see.
Reading about the statistics and epistemology behind the experiments (Andrew Gelman's blog) and models (eg why preferably hierarchical Bayesian models) will help a lot to cut through the problem of understanding and satisfaction. (Rarely we can have both for complex issues.)
Oh, and everything is a field. Fields are coupled (coupling constants, running couplings). Some fields have a non-null base energy state. (So vacuum energy is non-zero. Maybe. It depends how many and which fields your model deals with.)
Some kind of energy wiggle in some fields translates to one/two/a-lot of wiggles in other/the-same fields. (See Feynman diagrams, and how there are an infinite number of possible but decreasingly probable interactions between "particles".)
What is a force? It's just one well separated aspect of the whole model. Ultimately we think all of them are coupled into one big interaction that involves every field. See the electro-weak unification.
Why doesn't gravity seem to be able to repel? Because maybe it's not a field; it's just the shape of spacetime warped by energy, and we haven't found negative energy. (Einstein's General Relativity.) Or it's just based on entropy and thus it's a very strange emergent property of our universe. ( https://en.m.wikipedia.org/wiki/Entropic_gravity - bonus, it explains dark matter too, so maybe it's simpler - but it's just the ugly MOND (Modified Newtonian Dynamics) in disguise, noooo!) Okay, so maybe back to higher dimensions and branes and loops and strings? But nobody understands that! So we wait.
Why three? Because so far we found three and thus our models reproduce exactly that many.
That's how I feel when using Twitter Bootstrap compared to the WYSIWYG days of VB-classic and Delphi. Like Quantum Physics, getting Bootstrap right relies on probability and killing of cats, or at least hair follicles. (Our shop probably needs a dedicated UI coder, but office politics won't allow it.)
> 2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
IME both QM and Relativity are much more easily understood if you start from the actual equations and/or an undergrad textbook. I don't think physics is - yet - beyond the technical and scientific minded amateur who is willing to read and understand some equations. But it's probably gone beyond the ability of any science journalist to render into prose.
> 3. I find the models for dark matter and dark energy to be... unsatisfying. I realize there's experimental evidence for unobservable mass but it feels like a fudge.
I think working scientists would agree with you at least as far as dark energy goes. Dark matter is pretty indisputably just some kind of matter that we can't see (see e.g. the Bullet Cluster) - there's still something to be solved in terms of figuring out what it actually is, but I don't expect that to be major new physics. But as to dark energy: yeah, it's a fudge. Everyone knows it's a fudge. But it's still our best description of reality. Even if you knew there was something wrong with epicycles, they were still the best way to calculate planetary orbits at the time.
> I could go on. I don't for a second mean to suggest any of these notions are wrong. It's just that the models have gotten so complex (it seems?) that it just feels like something huge is missing, something that will eventually seem obvious in hindsight. Or am I just a lemur trying to figure out how an airplane works?
I think models aren't complex so much as unfamiliar. We're getting further and further away from everyday experience, and so more and more of the fundamentals of reality have to be understood from mathematical first principles rather than everyday intuition.
The models are actually really good though. Certainly once I understood QM it seemed so clear and simple that it couldn't possibly not be true.
How much of that is an insistence on thinking of things in terms of fundamental particles and definitive properties instead of fields and packets of energy? I guess the question is what the mathematical models are really about, and whether we're just having a hard time shrugging off Greek atomism and 19th century materialism when it comes to intuition?
Intuition could be built on top of a solid understanding of fields, backed by the math. The difficult part is connecting that to the classical world of our size that we experience.
I don't think it's "Greek atomism and 19th century materialism" to imagine a world made of persistent physical objects in well-defined positions. That's the entirety of everyday life.
Those views were prominent among certain intellectuals at different times and go against the grain of everyday experience. Since physicists are trying to understand the fundamental nature of the world, I take it they're used to not accepting appearance as a guide, and are rather operating under whatever philosophical intuition is dominant at the time. Since 19th century physics confirmed the existence of atoms, and the periodic table of ordinary matter is made up of atomic bonds, it makes sense that physicists have been influenced by that guiding intuition.
Regardless, fields are probably a better building block for intuition than particles.
1. I originally thought the Heisenberg Uncertainty principle was a natural consequence of using particles (photons) for measurement.
That's the observer effect. When quantum mechanics was first constructed, it was believed that the uncertainty principle could be explained that way. As you mentioned, today, it is considered a more fundamental feature of the theory. As an analogy, consider how a signal sharply located in the time domain gets smeared out in the frequency domain, and vice versa. It's kind of like that.
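That time/frequency trade-off can be checked numerically. A small sketch (grid size and Gaussian width are arbitrary choices of mine): for a Gaussian pulse, the product of the time-domain and frequency-domain spreads sits right at the Fourier lower bound of 1/2.

```python
import numpy as np

# A Gaussian wave packet: sharply localized in time <-> smeared in frequency.
N, dt = 4096, 0.01
t = (np.arange(N) - N // 2) * dt
sigma = 1.0
psi = np.exp(-t**2 / (2 * sigma**2))

def spread(x, density, dx):
    # Standard deviation of a (zero-mean) probability density on grid x
    density = density / (density.sum() * dx)
    return np.sqrt((x**2 * density).sum() * dx)

# Spread of |psi|^2 in the time domain
std_t = spread(t, np.abs(psi)**2, dt)

# Spread of |FT(psi)|^2 in the angular-frequency domain
psi_w = np.fft.fftshift(np.fft.fft(psi))
omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dt))
std_w = spread(omega, np.abs(psi_w)**2, omega[1] - omega[0])

print(std_t * std_w)  # ~0.5: the minimum-uncertainty (Gaussian) case
```

Narrow the pulse (smaller `sigma`) and `std_w` grows in exact compensation; that is the structural fact the uncertainty principle rests on, independent of any measurement disturbance.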
Specific volume (the average volume occupied by a unit of mass) increases. If the universe were spatially finite, the total volume would increase too.
Shrugs. If dark matter turns out to in fact be just a bunch of yet-to-be-discovered particles, that wouldn't be too strange. Dark energy is more of an unknown.
That's an issue.
5. I don't really understand what a fundamental force really is. Like why does electromagnetism have a repulsive opposite but gravity doesn't? When I tried to look into this I ended up down some rabbit hole of "gauge forces" and got completely lost.
For now, there's a certain amount of arbitrariness involved: We could easily construct universes that worked differently. Furthermore, gravity is special: At the classical level, it's not a regular force, but a pseudo-force (a consequence of Newton's first law). Also note that it can in fact manifest repulsively (eg via the cosmological constant). From the perspective of particle physics, it's also special because the hypothesized force carrier (the graviton) would have spin 2 instead of 1.
Why is the Higgs Field not a force?
To my knowledge, there should be an effect one could call Higgs force. It would be tiny.
Quantum Electrodynamics can be solved perturbatively. We know how to do that. Things that can't be handled that way tend to be hard.
Regarding vacuum energy, you'd have to account for every fundamental field there is to get it right, so no surprise we get it wrong. Also, given that we have no quantum theory of gravity, it's not even clear to me that it is even the right approach.
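As an illustration of how far perturbation theory gets you in QED: the electron's anomalous magnetic moment is a power series in the small parameter α/π. The first coefficient is Schwinger's 1/2 and the second is the known two-loop value; stopping after two terms is my truncation for illustration, not the state of the art.

```python
import math

alpha = 1 / 137.035999          # fine-structure constant
x = alpha / math.pi             # the small expansion parameter, ~0.0023

# a_e = (g-2)/2 as a truncated perturbative series:
# one-loop (Schwinger) term plus the two-loop coefficient
a_e = 0.5 * x - 0.328478965 * x**2

# Two terms already agree with the measured ~0.00115965218 to ~1e-8
print(a_e)
```

Each extra order shrinks by another factor of ~0.002, which is why QED predictions can be so absurdly precise, and why theories without a small expansion parameter are so much harder.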
7. Why are there exactly three generations of particles (ignoring the Higgs)?
Figure that out, get a Nobel prize. It could even be arbitrary, like how it's futile to ask why a particular snowflake looks different than another snowflake created in similar conditions: Random chance.
Hope that helps a bit.
Why shouldn't it be complex? The universe is complex. It's amazing that it's so simple, really.
2. again pop-sci books.
3. You and anyone not blinded by non-scientific factors...
6. The math we use for those calculations is adhoc bullshit and no one has been able to place it in a solid foundation.
that begins with: "Philosophy isn’t useful for practicing physicists. On that, I am with Steven Weinberg and Lawrence Krauss who have expressed similar opinions."
Though to be fair, she clarifies that she wishes philosophy wasn't so useless, and that:
"Philosophers in that area are necessarily ahead of scientists. But they also never get the credit for actually answering a question, because for that they’ll first have to hand it over to scientists. Like a psychologist, thus, the philosopher of physics succeeds by eventually making themselves superfluous. It seems a thankless job. There’s a reason I preferred studying physics instead.
Many of the “bad philosophers” are those who aren’t quick enough to notice that a question they are thinking about has been taken over by scientists. That this failure to notice can evidently persist, in some cases, for decades is another institutionalized problem that originates in the lack of communication between both fields."
This is the sort of reasoning that got me reading Hossenfelder in the first place, not the conspiratorial posts she writes now... :(
Yes, my doctor too has a thankless job of treating disease & injury in me, where once he succeeds, he has no purpose in life.
A better analogy is "once my doctor successfully advocates for better health habits and I achieve them, they are purposeless."
I think it's also possible to learn without having strongly held beliefs about the subject beforehand. This alternative reminds me of Descartes (don't assimilate knowledge until you are sure of its veracity) and Bayes (keep track of degrees of belief about traditionally non-probabilistic things). Maybe such an approach would help avoid getting trapped in local optima. E.g., I'd imagine it would be hard to climb out of the theist energy well once your world view were based on it.
It looks more like Hossenfelder found something discarded in the shed and temporarily used it as a club for want of something better. A slight intellectual dishonesty.
The philosophy of science is really a meta field for science, while philosophy itself encompasses things like the philosophy of religion and other things core to the human experience.
When studying something of a scientific nature, the human experience - religion, art, and that kind of thing - is the domain of the humanities and entertainment, not of the universe in general.
Theoretical physicists' way of working is to put forward baseless mathematical models and build $40bn machines to prove them wrong. They should instead work on theoretical inconsistencies that have been known for a while.
So the alternative is to build a $40bn machine, look at the data and then try to imagine then how to fix the theoretical model. [There is a risk of overfitting the model, and finding patterns in the noise.]
Another alternative is to build a $40000bn (or more) machine and have enough precision to make the model obvious from the data. [I'm not sure this is possible, I guess with enough money, perhaps m000000re money.]
This article seems like a decent introduction:
I have yet to see the "expected" galactic rotation curves that are contradicted by observation and lead to ideas about dark MATTER. I mean, I've seen the curves, but I can't find the math behind them. You often see weak references to Kepler's laws, which don't even apply, so that leaves me very skeptical.
MOND is an attempt to make dark matter an unnecessary assumption.
No opinion on whether some MOND will turn out to work better than dark matter.
This is directly derived from Kepler's third law.
Then it's wrong. I will need to see the derivation to find the error. I've seen indications of a couple possible places it may be (based on simplifying assumptions people make incorrectly) but have not seen the actual derivation of the expected curve.
We called the dominating one for our current instance/state of the universe "matter":
It may be the cause, but it is not known to be the cause.
And it's not a random walk; black holes accumulate charge from pairs, making future radiation not symmetric.
My understanding is that people have probed this for some time and it's still inconclusive whether it can generate the observed imbalance. Here's a 1979 paper on the idea, with hundreds of citations, in case you want to poke at the literature.
If I remember, Cosmology by Weinberg has a chapter on various theories of how the imbalance may happen, none of them known to be all of or even part of the answer.
I'd argue back to the author that "putting forward 'baseless' mathematical models" is "working on theoretical inconsistencies." When you're testing a black box for the content of the box without the ability to open the box, one may have little basis for an idea that might, maybe, could possibly provide useful results. That testing will definitely provide information, even if it's the basis of ruling out an entire class of tests.
It's not wrong. It's just not changing over time. It is stagnant.
The nice thing about physics is that with new advances in astronomy and the lack of a unified theory, it keeps getting poked with reminders that there may be missing pieces. That's not true in many other fields.
Don't you rather believe that a much simpler explanation is that the incentives and terms for grants are at fault?
Once again, as an outsider it doesn't seem to me that they are anywhere near "done", but they sure as heck look like a mature science. Physics and its children have given us amazing things. Spending a lot of time playing with math wasn't one of them. All sciences have one aspect in common: until you get to reproducibility, the conversation in the community tends towards groupthink over time. That's a human characteristic not related to any one field of study.
The End of (one type of) Physics, and the Rise of the Machines
Seems there are two possible outcomes. The deluge of data leads to better correlation, which smooths over the flaws in current models and corrects errors with some minor fudge factor that carries no further significance.
From Dark Matter to Galaxies with Convolutional Networks
Or something deeply profound is discovered. The thing which cannot be ignored. And instead leads to an explosion of new physics. Recognizing patterns of the latter class will perhaps always be the domain of the human operator.
Once more people accept the concepts of modern physics as a way of life (perhaps intuitively?), we will be in fertile territory for any potential new revolution in physics.
For quantum mechanics you have to know eigenvalues and eigenvectors. This is studied in the first years of university in a technical degree. I'm not sure it can be taught much earlier.
For Special Relativity you have to know Minkowski spaces. It's not so difficult; it could be moved to the first years of university.
For General Relativity you have to know curved spaces. It's not impossible to learn, but you can get a Ph.D. in Math or Physics without studying curved spaces.
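To make the eigenvalue prerequisite concrete, here is about the smallest possible QM-flavored example (the matrix choice is mine, a toy two-level system): diagonalizing the Hamiltonian gives the energy levels.

```python
import numpy as np

# Toy two-level Hamiltonian (the Pauli-x matrix). In QM, the eigenvalues
# of the Hamiltonian are the allowed energies, and the eigenvectors are
# the corresponding stationary states.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

energies, states = np.linalg.eigh(H)  # eigh: solver for Hermitian matrices
print(energies)  # [-1.  1.]
```

Everything in an undergraduate QM course - spin, level splitting, perturbation theory - is this calculation scaled up, which is why linear algebra is the gatekeeper.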
Re quantum mechanics without many prerequisites, I'm a fan of Feynman's book QED.
A kind of Pilot Wave can explain quantum weirdness to laypeople with ease.
We can ditch relativity theory and calculate speeds relative to the CMB, which is much easier to understand.
We can ditch Big Bang theory and, instead, accept that light is not immortal, because it ages with time. IMHO, Dipole Repeller and Shapley Attractor are much more attractive and easier to explain than Big Bang.
The problem with current theories is that I understand them only while I'm reading them. It's like a piece of complex code, or a book with complex but boring text, like a phonebook. I can follow it while I read it, but I cannot reproduce it when the book is closed.
Can we teach a phonebook to kids? Yep. Is it useful? Nope.
Recently, I did a "quantum physics in one picture" experiment. The results were very good: lots of reposts, comments, and interest in the topic.
In short, pilot waves were a worthwhile avenue of research, but we have seen they are incredibly cumbersome or even insufficient in many quantum mechanics problems.
Entanglement is a hard problem for PWT. Photos of entangled photons are intriguing, because they look similar to the behavior of walking droplets in some experiments (see the dotwave.org feed). I hope someone will be able to reproduce entanglement at macro scale. Currently, my top priority is to reproduce the Stern–Gerlach experiment at macro scale (I suspect that interference between the external field and the particle's wave creates a channel which guides the particle into a spot, but it's better to see it once). Second priority is the creation of "photons" at macro scale. Entanglement will be third. IMHO, all of them require microgravity to reproduce in 3D.
That is not true: geometric algebra is an example of a recent pedagogic improvement that is getting a lot of attention. The problem is that physics will never be easy enough for someone who is not prepared to think deeply, because it is one of the few areas where truly new ideas can be found. Virtually every area of learning involves repackaging concepts we have all known from childhood (people's motivations, stories, colors, that kind of thing) in specific ways. Major exceptions are physical tasks like learning to sew or play an instrument, and "esoteric" subjects like math and physics. In all of those cases you cannot learn by casually reading because the neurons in your brain are simply not prepared for it.
Not really. People look at it, marvel, and move on.
I've been interested in GA for years now because it helps me visualise and understand otherwise inscrutable mathematics.
Nobody, literally nobody mired in the traditional mathematics of theoretical physics can explain why the Universe is best represented using matrices of complex numbers with constraints on them.
"Shut up and calculate" or some variant is the common response to such probing questions.
More often, it's some variant of "Well, I can understand it, you need to study more.". This is usually stated just politely enough not to be outright insulting. But if you keep asking probing questions, it turns out that they don't really understand either; the "study" didn't help them either. They only got better at pushing the symbols around on paper. They're dismissive of such questions because they're too proud to admit their own ignorance.
Geometric Algebra (GA) was my "lightbulb" moment where I finally understood where Dirac matrices, Pauli matrices, and the like come from and why they have the structure that they do.
My logical conclusion was that GA is the far more elegant, clear, understandable mathematical structure that brings a wide range of Physical phenomena under a unified formulation. So clearly, it should be used for pedagogy.
Nobody agrees with that. The attitude is "well, that's nice, but it's mathematically equivalent so there's no benefit." which is just the stupidest thing I've ever heard.
Imagine if you saw a function called "add_num(a,b)" that computed the sum of two integers using the full bit-by-bit adder digital logic circuit simulated in software using boolean logic. Absolutely bonkers, insane code, right? Clearly this ought to be scrubbed from the codebase and replaced with a simple "+" operator, because we're not maniacs. Physicists would argue "no", it's equivalent, it's "working", so shut up, leave it and just move on.
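To make the analogy concrete, here's what that hypothetical add_num would look like in Python (a sketch of the analogy, not real code from any codebase): a software-simulated ripple-carry adder, for non-negative integers only.

```python
def add_num(a, b):
    """Integer addition simulated gate-by-gate: XOR is the sum
    without carry, AND + shift is the carry, rippled until done.
    Non-negative Python ints only (they have no fixed sign bit)."""
    while b != 0:
        carry = (a & b) << 1  # bits where both inputs are 1
        a = a ^ b             # partial sum, carries dropped
        b = carry             # feed the carries back in
    return a

# The sane, equivalent version is of course just: a + b
print(add_num(19, 23))  # 42
```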
Drives me batty.
- We're exiting the "industrial" mindset, where everyone is the same and makes the same products, for a wider topology of knowledge and skills (more and wider horizontals, more and bigger verticals; 'average' profiles become 'scattered'). This clearly drives a need to "learn a little bit of a lot of things", even at expert level.
- The walls and denial you expose here are, to me, but a symptom of the disease that current academia will either have to heal or die of. Seeing how Khan (and the thousands of Udemys and indies after them) changed the landscape, my money is on a major paradigm shift incoming for academia (it's already done; most institutions just don't seem to know it yet). Lots and lots of great teachers around the world almost freely sharing incredible hands-on knowledge and insight.
- Some applied domains with dramatic tension of the demand side (lots of positions to fill) don't have the luxury of elitism and massively adopt "pragmatic" approaches especially in learning. Software dev, programming and tech in general is much like that — the "one liner" installs and 1-page "getting started", all the intelligence solely put into making things intelligible and usable is, frankly, quite humbling and inspiring in that field. A very good side of the SV/Cali culture. So, examples of how to proceed next really do exist.
Now, when I think back on topics that I hurt my head against for months or years, that a simple 20-minute video could 'unlock'... Why, why do we not make it a staple of "teaching" to at least consider 2-3 angles, to make sure everyone's got a fair chance at getting at least one?
- On the topic of hubris and laziness, this is where physics went astray, imho. Too much hubris and not enough laziness. That was back in the 1980s and it took 40 years to realize, probably 10-20 more to "fix", if ever before we build a new system (see above).
That being said,
> Geometric Algebra (GA) was my "lightbulb" moment where I finally understood where Dirac matrices, Pauli matrices, and the like come from and why they have the structure that they do.
YES, please! Geometric algebra seems like the thing that could blow my mind too. I am very visual, to a fault maybe.
Would you have a 'favorite' resource to share? (book, course, youtube, whatever?)
The reason Physics "went wrong" is that in 3D space (only!) the mathematics of areas and vectors is coincidentally isomorphic, so it's possible to cheat and use only vectors and scalars and then everything "works". Similarly, volumes and scalars are easily confused as well, and appear to work fine.
GA has no such restrictions and the same formulas work in all dimensions, including high-dimensional or with degenerate metrics. Problems from classical geometry such as finding tangent lines to circles can be trivially extended to finding tangent hyperplanes to hyperspheres, even for very complex problems.
The formalities of GA force you to include things like the square of the unit pseudoscalar in some physics formulas that were accidentally dropped in the traditional form because in 3D this is just "1" and hence easily overlooked. This makes some formulas weirdly difficult to extend to become relativistic, when in fact the problem was just the "weak typing" of vector algebra.
Vector calculus also inherently requires a basis, which is an easy way to get bogged down in the weeds and get confused by issues with the algebra itself instead of the truly "hard" aspects of the problem.
Generally, the "lightbulb" moment for me was that Geometric Algebra has various subsets that are also closed algebras in their own right. For example, the "even" subset of a 3D GA is isomorphic to Quaternions, and the even subset of a 2D GA is basically the same thing as a Complex number. The various "named matrices" are just other subsets of 3D or 4D GAs. Physicists tend to avoid the full general case and simplify their algebras down to the special subset cases, using the historical names and greek symbols. We have to keep the symbols, you see, because otherwise you wouldn't be able to read 2000-year-old ancient greek texts, or... something.
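A minimal sketch of that subset structure (my own illustration in plain Python; the basis blades and component ordering are my choice): the geometric product in Cl(2,0) on multivectors stored as (scalar, e1, e2, e12). The even-grade elements are closed under the product and multiply exactly like complex numbers, with e12 playing the role of i.

```python
def gp(A, B):
    """Geometric product in Cl(2,0).  Multivectors are tuples
    (scalar, e1, e2, e12) with e1*e1 = e2*e2 = 1, e12*e12 = -1."""
    a0, a1, a2, a3 = A
    b0, b1, b2, b3 = B
    return (a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
            a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
            a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
            a0*b3 + a3*b0 + a1*b2 - a2*b1)   # e12 (bivector) part

# The even subset (scalar, 0, 0, bivector) is closed under gp and
# reproduces complex multiplication, with e12 acting as i:
# (3 + 4i)(1 + 2i) = -5 + 10i
print(gp((3, 0, 0, 4), (1, 0, 0, 2)))  # (-5, 0, 0, 10)
```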
University Physics is actually a study of the History of Physical Philosophy. The computer science equivalent would be learning about abacuses for the entire first semester, then progressing to mechanical calculators in the second semester, vacuum tubes in the second year, and so forth, only to briefly touch on transistors by the end of the third year. Postgraduate research students would be finally told about modern silicon chips and software development, but by this point they're so used to wiring up breadboards manually that it's too late to teach them how to do anything properly.
Starting with something elegant like pure functional programming in the first year is how I studied Computer Science, but I only found out about Geometric Algebra existing after I graduated Physics. It's nuts.
For background reading:
David Hestenes is one of the few physicists trying to reformulate physics in terms of GA: http://geocalc.clas.asu.edu/html/Overview.html
There's lots of papers around: http://geocalc.clas.asu.edu/html/GAinQM.html
Wikipedia is an okay starting point, but not amazing: https://en.wikipedia.org/wiki/Geometric_algebra
Enkimute's "Ganja.js" online demos are amazing, unfortunately the source was written by one of those crazy maths people who think that terseness helps readability: https://enkimute.github.io/ganja.js/examples/coffeeshop.html...
Real industrial use is few and far between, but at least a few folks have discovered that GA is ideal for robotics. Unfortunately, not everyone got the message, and most robotics software libraries are firmly vector/matrix based, with all the usual issues like numerical instability and gimbal lock. Fun stuff.
So.. I've been dabbling with GA since we talked and it is an incredible framework!! I now understand your post loud and clear. It's a new dawn of math for me, I really mean that; Clifford is my new prophet (and I think this one's a keeper possibly for life, I don't know and can't imagine something better for the problem space). So much had not clicked with linear algebra for me, so much of matrices was obscure and had no representation in my mind... And GA's base objects and concepts are so, so elegant, and exquisitely intuitive.
Looking for a short conceptual intro I stumbled upon this channel/playlist: https://www.youtube.com/playlist?list=PLpzmRsG7u_gqaTo_vEseQ...
Turns out he's an outstandingly good teacher. Strong recommend.
I'll probably take a more "serious" course/book (with problems!) next — if anyone has a recommendation, please do!
Then make progress by working on actual stuff (I guess Hestenes' reformulations are a great starting point, retracing some of them by following his reasoning).
And the ultimate goal would be to reformulate stuff myself, if I could (haha, that would be so great). More realistically, I'd use GA for research in designing models and representations.
TL;DR: you brought Math back into my life. We were on a break (but kept calling each other..) for the last decade and a half. GA is really, really strong. Remind me again, why don't we teach children like that for a century? /s (sigh)
Much thanks again
I can't elaborate much, so just a few "mind blown" moments for posterity:
> GA is just "strongly typed" vector algebra.
That's one hell of $1B slogan, at least around these parts! :) Shut up and take my money.
> in 3D space (only!) the mathematics of areas and vectors is coincidentally isomorphic, so it's possible to cheat and use only vectors and scalars and then everything "works".
I never realized that... there's indeed a lot of confusion in my mind between those concepts. I fail to see how "different" they're supposed to be; I guess I really need to go back to sane basics in that regard.
> Geometric Algebra has various subsets that are also closed algebras in their own right.
Just wow. I love this. I actually need this.
> GA has no such restrictions and the same formulas work in all dimensions, including high-dimensional or with degenerate metrics. Problems from classical geometry such as finding tangent lines to circles can be trivially extended to finding tangent hyperplanes to hyperspheres, even for very complex problems.
So that is the real kicker for me, because it fits my problem space so well. I'm exploring high-dimensional models (basically letting complexity arise from the dimensionality of rather simple/elementary objects, rather than trying to shoehorn complex functions into low-dimensional space in the hope of more or less randomly finding "better fits"; it's a strong desire not to interpret the data before the fact, to remove bias from the modeling itself).
There's interesting research around geometric deep learning too, which seems largely informed by physics as well, and this is sort of the logical conclusion of that for big datasets.
I think industrial use may rise greatly based on this first take. But it's always a generational thing with culture — it takes ~25 years give or take for those who "grew up with it" to finally become the majority of the workforce and sway things their way. Same with politics — looking at you, academia. As you said, "but by this point they're so used to wiring up breadboards manually that it's too late to teach them how to do anything properly."
> It's nuts.
Yeah, it'll take time, never mind how infuriating in the meantime. But good on you, spreading the word about GA is exactly how we move forward, one post, one topic at a time. Eventually, we get there.
As long as we keep teaching the reckless hand-waving that is the Copenhagen interpretation, we will keep confusing clear-thinking students.
If you're one of those people, I believe it's a reference to the "Three-Body Problem" series.
I think it would have been helpful for the article to put the 40 years of no progress in perspective. Are we looking for progress on the scale of the theories of relativity and quantum mechanics, and so should we be comparing to the timescales between Newton and Einstein/Schrodinger? How should we think about the rate of progression in a ‘mature’ field such as physics? Should it be linear (big discovery every 40 years), faster (new discoveries are faster due to bootstrapping from other discoveries), or slower (diminishing returns)?
We believe, as an assumption (or nearly as a matter of orthodoxy), that there are simple universal laws that govern consistent natural phenomena. One could argue that this is the foundation of our science of physics, in that if it is wrong, the whole thing falls down. But that has not "progressed" and really should not change... which seems consistent with the concept of a building foundation. Building foundations don't move and shouldn't move.
What about theories, which seem to be the focus of her blog post? Well we should be careful to distinguish between our theories and the fundamental laws we think they describe—the map vs the territory and all that. I would really hesitate to call our theories a foundation of physics. For one thing they are known to be provisional; intended to be changeable. That’s not how foundations usually work.
When observations contradict theories, the theories must move. From that perspective one could say that observations are more foundational than theories. Once a piece of evidence is properly observed, it doesn’t change.
And the thing is, we have collected major (I would argue foundational) observations in the last 40 years. We observed the Higgs boson and gravitational waves, and I would call both of those foundational.
That they agreed with existing theory is somehow being taken for a crisis? I guess it’s a crisis if your job is to come up with new theories and you’re lacking reasons to do so.
But there are plenty of mysterious observations yet to be explained. Many of the observations related to dark matter and dark energy fit within a retrospective 40-year time horizon. Call them astronomy if you like, but going back up to my second paragraph, we believe they should be explainable by our physical theories.
> 1. that there is an objective reality shared by all rational observers.
> 2. that this objective reality is governed by natural laws;
> 3. that reality can be discovered by means of systematic observation and experimentation.
> 4. that Nature has uniformity of laws and most if not all things in nature must have at least a natural cause.
> 5. that experimental procedures will be done satisfactorily without any deliberate or unintentional mistakes that will influence the results.
> 6. that experimenters won't be significantly biased by their presumptions.
> 7. that random sampling is representative of the entire population.
Some other approaches:
Basically you have to make some metaphysical assumptions before doing science can even get off the ground. If you believe that reality is an illusion (Buddhism? Hinduism?) then you're less likely to be interested in understanding the world's workings. If you think that things occur for capricious reasons (e.g., pagan gods being the cause of things), then there is no reason to ask "why?". If things happen not because of inherent properties but because of God's Will (Occasionalism), then who can understand the mind of God?
I've heard it argued that science mostly developed in (Western) Christendom because it brought together all of the above assumptions under its Aristotelian world view. Look at the invention of the telescope around 1600: it spread across the world within a couple of decades, but most cultures weren't really interested in it.
I find it an interesting line of reasoning that the current lack of progress is due to the default naturalistic approach whose sole purpose is finding "truth" vs. a more pragmatic, non-realist approach that would have a much more concrete purpose (e.g. solving particular problems). Truth for the sake of it with no practical experiments seems to have been a dead end.
Also, you really wrote a bit too much.
This intellectual current is now upheld by the engineering sciences. The physicists are too glued to the Bohr Model and a particle universe to concern themselves with a new mechanics in light of the quantum wave phenomena discovered last century.
"But for all I can tell at this moment in history I am the only physicist who has at least come up with an idea for what to do."
make it hard for me to share her point of view, especially as these ideas are not mentioned.
It is also not true that no progress has been made in the understanding of String Theory in the last 40 years and it still seems like the best bet that could eventually generate a fundamental theory. What is missing is still a lot though:
- We don't seem to possess the correct mathematics to develop a non-perturbative formulation of String Theory and there are too many potential string backgrounds that we could expand around.
- It is also hard to derive the matter content of low energy effective actions from most brane configurations.
String theory provided major insight into non-perturbative quantum field theory as well. There are tons of examples; let me highlight one of them: the discovery of the Amplituhedron (Arkani-Hamed et al., 2012) was preceded by the discovery of the BCFW recursion relation (Britto et al., 2005), which in turn was motivated by a relationship between perturbative Yang-Mills theory and the instanton expansion of a certain string theory in twistor space (Witten, 2003).
No, it doesn't. The original Standard Model from the 1970s did, but then neutrino masses were discovered and the Standard Model was modified to include them.
I have no firmly held answer to the question of whether it is.
"I have said many times that looking at the history of physics teaches us that resolving inconsistencies has been a reliable path to breakthroughs, so that’s what we should focus on."
And given that she has written at least one book on this general subject, I would guess that idea is elaborated in much greater detail there or on her blog.
Take for example the idea that a photon is both a particle and a wave. This dogma has been force fed to students for decades without the glaring inconsistency being resolved, or even pointed to as a thing that needs resolving — something is either a particle or it’s a wave; if we observe properties of both, then we need a physical explanation for how one becomes the other. Something can’t be two distinct things. Logically, all I’m pointing out is that “A is A”.
Yet my perspective that there is an unsolved inconsistency here is considered heretical. “Shut up and calculate” is the reigning dogma.
But physicists have given up on figuring out what it IS. They have decided that having math that can predict the outcome of experiments is enough. They're not wrong, but it feels rather unsatisfying. IMHO there have been a couple avenues worth exploring that are being largely ignored.
It depends on what you mean by that. What they've given up on is trying to find some sort of anthropocentric analogy. It's understandable that this is unsatisfying, since analogies are fundamental to how we understand things. Unfortunately, there's no reason to believe that a good analogy will exist.
That's no reason not to try. If there IS an objective reality I think it deserves a better description than just the math which characterizes its behavior.
Why should we assume that the contradictions in contemporary theories are of this special, inexplicable type? Every era has contradictions, before they are resolved... But they'll only ever be resolved if people are trying to resolve them, meaning that they haven't resigned themselves to the idea that our minds haven't been gifted with the capacity to make sense of our experience.
One of my favorite books is called "Architecture of Matter"; it's a history of ideas about matter. One early idea was that matter is made of little tiny bits of stuff and that qualities of these little bits (such as being smooth or spiky) lead to macroscopic phenomena (like spiky bits being acidic).
The problem with this idea (and almost all others) is that it's just pushing the problem down a level: If matter is made out of little bits of matter, what are the little bits of matter made of?
FWIW, the wave-particle duality gets around this self-reflexive problem. Matter is made out of some other kind of stuff. But then, as you say, we still have the essential problem of "wtf is this stuff?" but we don't worry about that so much as long as our math describes the behavior of the stuff.
> They're not wrong, but it feels rather unsatisfying. IMHO there have been a couple avenues worth exploring that are being largely ignored.
What avenues? (Genuinely curious, not trolling.) It seems to me that the ultimate, existential question of what "stuff" actually is, is unanswerable (within the logical/scientific framework.)
Pilot wave theory is one. The other was a paper (forgive this explanation) that found an equivalence between particle physics and fluid dynamics, suggesting the objects of physics might be modeled as, say, vortices in some kind of fluid (aether). The equivalence was, IIRC, only to first order, but the work to get there must have been a lot. I'm sure there are others.
And the criticism of mainstream thought in physics would be: its wrongfully dismissive of my question, and ignoring the important job of looking for the answer.
The electromagnetic field is a mathematical model of what?
Let's talk about the nature of the EM force.
Analogy: look at a typical tropical cyclone. It rotates. Is it rotating because of an unknown property of an air molecule? No. It rotates because our planet rotates while air molecules are just trying to keep their positions. I.e., it's the rotation of the planet + the inertia of the molecules.
Is it possible that the EM force happens because our local space is moving through global space along a non-linear trajectory, so it's just the non-linear trajectory of local space + the inertia of rotating and vibrating particles?
Physical things are real. Mathematical abstractions are not. OpenGL isn't real either, but it looks very real and accurately predicts reality. In OpenGL, a field is an array, e.g. "float field[WIDTH][HEIGHT];".
Better examples would be the inconsistencies between general relativity, which demands curved spacetime, and quantum field theory, which is formulated on flat spacetime. This is a very real inconsistency at the heart of modern physics. Many, many people are actively working on solutions for it. Search for "quantum gravity" or "Theory of Everything" for more information. Candidate theories include string theory and its derivatives, loop quantum gravity, etc. There are plenty more.
The problem is that designing experiments for these theories is Hard. Big-O Capital H Hard. My flight's about to board, maybe I can expand on it during my layover.
Dark matter is another example. Observations of the speeds and orbits of stars in galaxies and galaxies in galactic clusters are not consistent with our measurements of their masses. Plenty of candidates for dark matter have been and are continuing to be tested.
Candidates include MOND, which supposes that our theories of gravity need to be modified when acceleration is astronomically low. We have designed experiments to support or disprove these theories and most of the results have landed on the side of disproving them. (search for "bullet cluster" or "dark matterless galaxies")
Another candidate is MACHOs. ("MAssive Compact Halo Object") Basically the universe is teeming with small black holes, brown dwarfs, loose planets unassociated with any star, basically a lot of stuff we can't see. We have designed and executed experiments to search for these, but the results have concluded that there are insufficient such objects to explain the inconsistency.
The third candidate with traction is WIMPs, or "weakly interacting massive particles": basically theorizing that there are other types of unobserved particles that have mass but do not interact via the electromagnetic force, which makes them very difficult to observe. There was hope that neutrinos could explain all this, especially when it was demonstrated that neutrinos have mass. However, experiments trying to bound the mass of the neutrino have shown they are not nearly massive enough to explain the observations. Experiments are ongoing to find these particles, but have not yet discovered anything we can't already explain. However, there's a problem: maybe they're just too difficult for any experiment to observe. In that case, we might be SOL.
These are just two examples; the rest of science supplies all the others.
The idea that this person is the only person probing inconsistencies is pure hubris. It's not a useful starting point for a conversation, unless the point of the conversation is to talk about how awesome you are and how much everyone else sucks. Which is all I got from this article.
That's not inconsistent [as you've framed it]: curved time.
In the double slit experiment, if you rotate the slit 90 degrees, the interference pattern is rotated 90 degrees in the same direction. Doesn't this prove there is no wave involved?
If you make the slits larger, the pattern changes, and then vanishes.
If you make the slits circular, the interference pattern becomes circular.
If you make the slits triangular, the interference pattern becomes triangular itself.
All these things tell me that particles are not waves at any point, they just bump to something and take a different trajectory.
When, in the same experiment, a light detector is placed and the interference pattern disappears, this does not mean the wave goes away and there is a collapse to a single particle; it means the detector produces particles that don't bounce off something. What if the detector is placed near the particle beam emitter? Has it ever been tried? I don't know. If placing the detector near the emitter makes an interference pattern reappear, then we certainly have no collapsing of any wave.
What if we put the slits very close to the emitter? Do we get the same interference pattern? What if we put the slits further away from the emitter? Does the interference pattern change? If yes, then we certainly have no wave.
And something else regarding quantum entanglement: how can we be sure that the particles are not created with their properties in such a state that they merely appear entangled when they are measured? Why do we assume there is communication between the particles on the fly, rather than the two particles having related but not connected properties? We just assume that because of the other assumption, that particles are waves and they collapse.
Finally, how do we know that matter attracts matter, and that it is not the void that pushes matter into clumps? How do we know that the actual distance between the furthest points in the universe is the same as it ever was, and that simply new positions are created within the same distance? And that these positions are what push matter to clump together?
I'd love to sit down with an honest physicist to research these types of questions, a physicist that cares more about answering these types of questions than hunting for grants and fearing to go against the status quo, but it seems only crackpots are willing to do that.
And while there’s no harm in pondering the philosophical origins of the scientific method while debating where to go next, we should take care not to go backwards either, as that way lies fractal navel fluff and bloody string theory.
When I lived in the Middle East, I imagined a chain of dry bars called: TAIT
(Up until about 6 years ago, the Omani and Saudi outlets would have been called TAIW)
In particular the posts where she keeps highlighting the misguided fools who claim gravitational waves aren't real; she wrote a Forbes contribution that made a big deal of the LIGO people not responding to questions over Facebook, treated it as suspicious, and awarded points to the GW denialists as a consequence.
But maybe it's just that I now have a better understanding of what Hossenfelder is writing about, and can see for myself how thin the cases can be.
Anyway most events are BHBH mergers with no EM counterparts, so non-detection of those mean nothing.
Have we really measured gravitational waves?
If you believe that clickbait, I feel bad for you, but it's too low quality to spend time debunking.
While I think a person would also be foolish to disbelieve gravitational waves, I think it wouldn’t be quite as foolish, because while we clearly measure gravitational waves, we don’t exactly use them to do (as opposed to “look at”) stuff, so a person’s everyday life is less impacted by disbelieving GW than by disbelieving electricity.
How foolish of Newton!
And at first I had written “probably foolish” as a hedge, but thought “oh, I hedge the things I say too much, and it makes what I say less pleasant and more difficult to read. I’ll leave it out.”.
By “foolish” maybe I really meant something closer to “probably reasoning incorrectly”.
This derisive comment betrays the author's own hypocritical stance: claiming physicists are too close-minded while simultaneously ridiculing the role of advanced mathematics in formulating new physics hypotheses, arbitrarily declaring them mindless fiction.
And it wasn't ever so simple as "chasing mathematical elegance instead of trying to explain observations", the problem has been that there was a theory that could explain almost perfectly everything within a certain region of physics, but can't easily be extended.
Thus you work on crazy schemes to extend the existing theory (all the sensible ones already having failed), or you are forced to make an entirely new framework, and that takes a lot of work before it's finished enough to even reproduce the results of the limited theory.
If you take the second route, you are very vulnerable to the "chasing mathematical elegance" slander, but it's not like the other guys are doing any better: there aren't actually any unexpected observations that need explaining within the reach of the existing theory.
Sounds a lot like overfitting.
Compare the classical physics formula for momentum, p = mv, with the relativistic formula, p = γmv, where γ = 1/√(1 − v²/c²). γ is almost exactly 1 at low velocities; it only starts shooting up toward infinity as we get close to c.
The point being that the classical formula is pretty good in its zone of low velocities, but as soon as you get too far outside the implicit term's "constraints", the formula breaks down and you need to add more to it to get it working for both low and high velocities. Which doesn't sound easy.
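A quick numerical illustration of that point (my own sketch in plain Python):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    """Lorentz factor: 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# At everyday speeds the correction factor is invisible...
print(gamma(30.0))      # highway speed: indistinguishable from 1
# ...but it blows up as v approaches c, so p = m*v stops working.
print(gamma(0.9 * C))   # roughly 2.29
```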
But they can pare all that down to something expressed in terms people are familiar with. And that has the bonus purpose of helping them understand why the familiar terms were familiar: the "correction factor" is small under circumstances we encounter, and only becomes large under circumstances we rarely do.
If that intrigues somebody enough to learn the actual physics, they'll encounter a completely different and more-encompassing formulation which looks not at all like overfitting. One that turns out to be more elegant, in fact, cramming more information into less notation. But it's information nobody needs until they're doing fairly advanced physics, so we're not going to be teaching it in elementary school any time soon.
This is not a "theory" problem; it has to do with matching the theory to observations.
I was concerned a few years back when a "blip" at CERN resulted in theoretical physicists publishing 300+ different theories to explain it in a short period of time. All of these theories were presumably consistent with "the foundations of physics". And I guess that "blip" got rejected as not being something worth explaining anyway.
Sounds like post-hoc overfitting to me.
The actual events of the 750 GeV peak are much closer to neural networks hallucinating, and the diversity of models created doesn't strike me as evidence of overfitting...
Anyway, you clearly didn’t trust my summary, and I no longer trust you have an honest interest in learning more, so I’ll stop here.
If your solution to the problem of not being able to generalize your model to new/other data is to add more parameters, you are probably overfitting.
These are the hallmarks of overfitting.
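The analogy can be made concrete with a small sketch (NumPy; the toy data, noise level, and polynomial degrees are all invented for illustration): give a model one free parameter per data point and it will fit the "observations" perfectly, while typically doing worse on held-out data than a simpler model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying law, y = 2x.
x_train = np.linspace(0.0, 1.0, 8)
y_train = 2.0 * x_train + rng.normal(0.0, 0.1, size=x_train.size)
x_test = np.linspace(0.0, 1.0, 50)   # held-out points from the same law
y_test = 2.0 * x_test

def errors(degree):
    """Mean squared error on training and held-out data for a polynomial fit."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

# Degree 1 has two parameters; degree 7 has eight, one per data point,
# so it threads every training point exactly (training error ~0) even
# though it is only fitting the noise, not the underlying law.
train_lin, test_lin = errors(1)
train_hi, test_hi = errors(7)
```

Whether the 300+ post-blip models were this, or legitimate hypothesis generation, is exactly what the thread is arguing about.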
Sorry that you cannot learn from others and only expect them to learn from you.
Further, of the various frameworks used, many will have been created by imposing some further symmetry on the standard theory, in effect decreasing the number of free parameters!
Second, I am reminded of Thomas Kuhn's The Structure of Scientific Revolutions. That work describes exactly this state, with historical examples of the cycle: progress exhibits peaks and valleys, long stretches in which little monumental progress is made followed by brief, frantic periods of discovery, often stemming from the fertile ground laid by those who worked in plodding toil.
And so I am more inclined to believe we are in such a trough at the moment, and not even a particularly deep one. Various avenues of thought & experiment show ample potential to thrust us forward into one of Kuhn's Scientific Revolutions.
Maybe the triangle is exhausted. All mapped out. The limits of the method have been met. Time to find a new method.
The history and philosophy of science have shown that the foundations of the sciences progress a little here and a little there until these "little progresses" gain enough momentum to create a paradigm shift. And we only recognize these "little progresses" in hindsight, after the paradigm shift.
Technological advances also tend to progress science. We tend to believe that advances in science lead to advances in technology but historically, it's the other way around.
More likely than not, there are many "little progresses" being made toward an eventual paradigm shift, but until it happens, we won't recognize how important those "little progresses" are.
"But for all I can tell at this moment in history I am the only physicist who has at least come up with an idea for what to do. "
That is one heavy claim (two, actually). Is there some place where she elaborates on what that idea is, in terms more specific than "resolving inconsistencies" and "more theorists"?
The last time we had progress in the foundations of physics, we just got even more powerful world-destroying nuclear weapons. Maybe it's just too dangerous to advance physics outside of deeply classified government programs. In order to keep new physics from destroying the earth, funding is diverted into make-work projects for physicists working on cosmology and string theory that will never actually have practical significance.
On the topic of the LHC, particle physics, and future colliders, I like this video => https://www.youtube.com/watch?v=Go2TaEUQpF4.
Please see my response to the previous objection in the thread (the sibling comment to yours) requiring specific examples of progress which would silence the critics - it remains unanswered.
> Consistency is a good thing
Lots of close but in the end ineffectual "epistemic" systems were internally consistent but in the long run they proved "deficient" (as in other, more efficient systems took their place). I confess I've never read Thomas Aquinas's work (to give just one example) but I'm pretty sure his "view of the world/reality system" is pretty consistent, don't think there are any internal contradictions in his writings. Problem is his internally consistent system wouldn't have been able to allow us to build combustion engines or modern electronics, so that we have had to come up with other internally-consistent "epistemic systems" that proved to be more efficient (because they allowed us to build and reason about combustion engines and modern electronics).
In the end and in the great scheme of things I don't think this theoretical physics road-block will be of any great importance for the general public, it looks like people are content with what they already can purchase based on past physics-related discoveries. Yeah, traveling through galaxy wormholes or getting to 100% know if the Universe is finite or not would be nice things to have, but people just don't care and there's nothing wrong with that.
Even if we look at the "greatest" post-WW2 maths result, the proof of Fermat's Last Theorem, I don't see any new reality-related insights that it has brought us, and I'll go a step further and say that even if we were to someday prove the Riemann hypothesis, I don't see how it would fundamentally bring new insights regarding "reality"/the physical world.
If anything, I dare say that in a certain way maths has tainted the physical world for us: it has made us believe that, in the same way mathematics is "homogeneous", the physical world is too. Maths has numbers and "units", and if (1002 - 1000) = (2002 - 2000), then supposedly in the physical world we likewise have homogeneous "stuff".
This is why physics has started using mystical-like language like "elementary particles" which are seen as the "foundation of the physical world", with the implicit premise (if I'm wrong here, please correct me) that given a certain "elementary particle" (let's say a boson) then the boson close to it or the boson situated at the other "side" of the Universe are pretty much the same thing, almost identical, the same way as the mathematical difference I mentioned above is the same, or how two parallel lines are "the same".
Basically almost all theoretical physicists have become Platonists by embracing mathematics no-questions-asked, when in fact they should have remained closer to Hume. And when reality hits them pretty hard in the face, they resort to even more mysticism by "inventing" concepts like dark energy and the like.
Later edit: I see that the "homogeneous reality" theory even has a name, the Cosmological principle, and as a close-enough Hume follower, Karl Popper was quick to dismiss it. Reading it, you have to wonder what those physicists had in mind when they wrote it down:
> Although the universe is inhomogeneous at smaller scales, it is statistically homogeneous on scales larger than 250 million light years.
Like, why is 250 million light years ok and 240 million light years not ok? To say nothing of the fact that the "infinitely small" (the mystical-like elementary particles I mentioned above) is ignored completely in this discussion; it's also probably seen as "statistically the same". As I said, this "statistical sameness" has made us believe that more than half of the Universe we know of (68%, to quote Wikipedia) is made of the mother of all mystical thingies, "dark energy".
Nothing mystical, but there certainly are deep questions, which is why we have the various interpretations of QM.
US school teacher, then geologist, J Harlen Bretz spent as much time as he could 'out in the field'. It was as a result of -extensive- observations that he arrived at his 'outrageous' Missoula Floods hypothesis. He spent 40 years defending his interpretation; he remained 'out in the field' most of that time.
His critics had spent -very- little time in the field. They knew he was wrong. In 1979, he was awarded Geology's top prize.
Pretty rich coming from someone who’s written a bunch of papers on doubly-special relativity and similarly unpromising hypotheses.
(by the blog's author Sabine Hossenfelder) https://youtu.be/oqgKXQM8FpU
(more on the confidence tldr Λ>1 still)
How can we expect to make progress in our theories if we haven’t even agreed on a consistent interpretation of today’s quantum theory?
In contrast, a bunch of explanatory stories lacking an underlying predictive model is what we call pseudo-science.
It's not a contested notion that every theory needs an interpretation.
The way to map a theory to reality is via its predictions. The interpretation is how the theory fits into my mental model of reality.
In principle, reality could be strange enough that we are incapable of holding a good model of reality in our brains that evolved to avoid getting eaten by lions instead of doing quantum mechanics.
I certainly hope that's not the case, but neither can I rule it out.
> The way to map a theory to reality is via its predictions.
Agree, and I call this mapping an “interpretation”.
(I thought this was the general usage of the word in the scientific context, but I may be wrong.)