You are right about the incentives being aligned a certain way. But while the justification for the LHC might have been the Higgs, what most high-energy physicists (theoretical and experimental) really cared about was validating beyond-the-standard-model (BSM) physics, e.g., supersymmetry, hidden valleys, etc.
Every search for BSM physics has returned a negative result. You can look at hundreds of arXiv papers by the two collaborations (CMS and ATLAS) that exclude large portions of parameter spaces (masses of hypothesized particles, strengths of interactions, etc.) for these BSM models. If anything had been found, it would have been a breakthrough of enormous magnitude and would also have provided justification for the next collider.
So people have been truthful about the non-discovery of ideas that were extremely dominant in the high-energy community. This did not make them a laughingstock within the scientific community, because every serious scientist understands how discovery works, and the risk of working at the cutting edge is that your ideas might be wrong. No one that I know of "made some shit up" about evidence at the LHC.
What do tenured faculty do? They either keep working on the same problems or pivot to other ones. They are tenured - sure, some lose grant money, but I know multiple physicists (very famous ones too) who have been working on other topics, including non-physics problems.
The main criticism is whether we need these extremely expensive experiments in an era of global economic and political uncertainty. The usual argument from physicists is that (a) we need them to advance the cutting edge of our knowledge (which might have unknown future benefits), and (b) these programs produce many side benefits, like large-scale production of superconducting magnets and thousands of highly trained scientists who go on to contribute to other industries.
Whether this is a valid argument needs to be decided by the citizenry eventually. By the way (via Peter Woit's blog), Michael Peskin recently gave a talk on the next generation of colliders, the technologies involved, and which theory questions have to be answered before making the case for funding: https://bapts.lbl.gov/Peskin.pdf
Thank you for explaining what else could've been found with the LHC, and that a lot of work actually went into disproving the existence of a lot of stuff.
Kinda kills my thought experiment though, but I guess that's the point. Thanks.
A great lesson one learns as a physicist is that one must develop a new intuition mainly through years of practice. So when you say "It's very frustrating", I interpret it as "this doesn't seem reasonable to me". But it doesn't have to be reasonable - none of us have any intuition for what happens at scales far different from our everyday lives.
The real question you raise is a very good one: how seriously should physicists take mathematical theories? If we were building a statistical model of, say, house prices and constructed a reasonable linear regression model, we certainly wouldn't believe that the market plugs the parameters of a house into the model to decide the price. The model is an approximation of the real dynamics of the market, and this approximation might not hold in the future.
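To make the analogy concrete, here's a minimal sketch (synthetic data, all numbers made up) of how a fitted linear model describes prices well in the range it was fit on and drifts once you extrapolate - it summarizes the data, not the market's mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "market": price depends nonlinearly on size, plus noise.
size = rng.uniform(50, 200, 500)                          # m^2
price = 3000 * size + 0.05 * size**3 + rng.normal(0, 10_000, 500)

# Fit a linear model on mid-range houses only.
mask = (size > 75) & (size < 150)
slope, intercept = np.polyfit(size[mask], price[mask], 1)

# Decent in the fitted range, drifts when extrapolated: the model
# summarizes the data; the market doesn't evaluate it to set prices.
for s in (100.0, 190.0):
    print(f"size {s}: model {slope * s + intercept:,.0f}, "
          f"true mean {3000 * s + 0.05 * s**3:,.0f}")
```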
On the physics front, I would argue no one would consider a quadratic-in-speed air-resistance term in Newton's second law a fundamental feature of the universe. One can build a reasonable model that results in that term, and it might even be a good approximation for some fluids in some speed/density range.
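As a toy illustration of such an effective model (illustrative constants, not real fluid data): integrate m dv/dt = m*g - k*v^2 for a falling object and recover the terminal velocity sqrt(m*g/k):

```python
import numpy as np

# Falling object with quadratic drag: m dv/dt = m*g - k*v**2. The k*v**2
# term is an effective model, not a fundamental law; k lumps together air
# density, cross-section and drag coefficient (values here are illustrative).
m, g, k = 1.0, 9.81, 0.05     # kg, m/s^2, kg/m
v, dt = 0.0, 0.01

for _ in range(3000):          # forward-Euler integration for 30 s
    v += dt * (g - (k / m) * v**2)

print(f"terminal velocity (numerical): {v:.2f} m/s")
print(f"analytic sqrt(m*g/k):          {np.sqrt(m * g / k):.2f} m/s")
```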
But when it comes to more fundamental (as of today) theories like quantum electrodynamics, electroweak theory, and quantum chromodynamics (all quantum field theories), or even general relativity (modulo discussions of quantum gravity), both the predictive power and the accuracy of these theories are so stunning (matching all the data generated at colliders like the LHC) that one starts wondering whether we are no longer dealing with models but with a true description of nature. The mathematical descriptions are also so constrained, unlike the house-price example above, that one can't just modify the theories without violating core principles (and experimental data) like unitarity, causality, locality, Lorentz invariance, etc. This only reinforces the view that perhaps this is close to a true description of what we see.
Now it is entirely possible (though IMO not probable) that this whole view will be upended and replaced by a very different physical picture. In a sense, string theory (which is now heavily discredited in the public's eye, but that's a story for another day) was an attempt at a different physical picture that resulted in very rich structures that had nothing to do with physical reality.
So when physicists say this, it's because the more time one spends understanding and studying quantum field theory, and the more experiments are done (all the collisions at the LHC verify the standard model's predictions, including the Higgs once its mass was known), the more it reinforces that there's something deep about the current theories, even though we have several unsolved problems (dark matter, dark energy, quantum gravity, fine-tuning problems).
Addendum 1: I'll add a book that is not accessible to non-physicists but gives a glimpse into the actual struggle of research and building intuition for something very abstract: the Feynman Lectures on Gravitation.
Feynman, like many others, spent considerable time applying all his powers to understand general relativity from a QFT perspective but eventually it didn't pan out (for anyone).
I am not an expert on their space program or the provenance of rocketry inventions, but they seem to have developed both the cryogenic engine (https://en.wikipedia.org/wiki/CE-20) and the boosters in-house (https://en.wikipedia.org/wiki/Vikas_(rocket_engine) - according to the article, the initial design was based on the Viking engine back in the 1970s, but haven't they developed it further since then?!). It also seems the payloads are built in-house. I would be very interested in more details though.
That's exactly my point. Those engines are incremental improvements on existing designs from the 1970s and 60s - which is why you will find this tech in western museums. There have been improvements - certainly - and the engines are now made almost entirely in India, but the original point stands: the designs are not Indian, which is why we should temper our expectations of where Indian aerospace can go. They currently occupy the niche of cheap space launches, and that is based on excellent work by ISRO and their partners.
That's a valid argument but still a bit strange. One could make the same argument about:
- chip design - modern processors are not fundamentally different from the 8086, which also belongs in a museum. Sure, there's much higher transistor density and advances like pipelining, branch prediction, multiple cores, etc., but fundamentally it's still the same physics and design.
- machine learning - aren't modern deep networks just scaled-up versions (with some new architectural components, although one can argue that even these were discovered in the 80s and 90s) of old ideas that still use gradient descent and backpropagation (see the sketch after this list)? This too sounds like incremental progress.
- commercial aircraft - aren't modern planes just incremental advances on 50-year-old planes (the 747 is from the 60s)? Sure, they are more efficient, use lighter composites, etc.
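For what it's worth, here is a minimal sketch of that "old idea" - a tiny one-hidden-layer network trained with hand-written backpropagation and plain gradient descent (synthetic data; the sizes and learning rate are arbitrary):

```python
import numpy as np

# Tiny one-hidden-layer network trained by plain gradient descent with
# backpropagation written out by hand -- the same core recipe as the 80s.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)  # XOR-like labels

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
lr = 0.5
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                       # forward pass
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output
    grad_logits = (p - y) / len(X)                 # d(cross-entropy)/d(logits)
    grad_h = (grad_logits @ W2.T) * (1 - h**2)     # backpropagate through tanh
    W2 -= lr * (h.T @ grad_logits); b2 -= lr * grad_logits.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);      b1 -= lr * grad_h.sum(axis=0)

print("training accuracy:", ((p > 0.5) == y).mean())
```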
I guess my point is that either (a) technology as a whole has been making incremental progress when viewed through a certain lens, or (b) while, superficially, a lot of technology still follows designs discovered decades ago, there have been substantial and deep improvements at various lower levels of the "stack".
Maybe a concrete way of looking at this particular issue would be to compare metrics like engine efficiency (is thrust * burn time / fuel mass used = change in momentum / fuel mass used a useful metric?) or just the raw amount of thrust produced. Then one could actually argue whether an engine is essentially still the same as ones from a few decades ago.
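The metric in parentheses is essentially specific impulse. A quick sketch of the calculation, with made-up engine numbers (not actual CE-20 or Vikas figures):

```python
# Impulse per unit of propellant mass is the effective exhaust velocity;
# dividing by g0 gives specific impulse in seconds, the standard figure
# of merit for engines. All engine numbers below are made up, not actual
# CE-20 or Vikas figures.
G0 = 9.80665  # m/s^2

def specific_impulse(thrust_newtons, burn_time_s, propellant_kg):
    impulse = thrust_newtons * burn_time_s   # N*s, assuming constant thrust
    v_exhaust = impulse / propellant_kg      # change in momentum per kg
    return v_exhaust / G0                    # seconds

# Hypothetical upper-stage engine: 200 kN for 640 s on 30,000 kg of fuel.
print(f"Isp = {specific_impulse(200e3, 640, 30e3):.0f} s")
```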
Your thrust argument is borne out in the time it takes Indian rockets to reach lunar orbit compared to even the USA's Saturn rockets from the 60s - weeks for the recent Chandrayaan missions versus about three days for Apollo.
I'm not sure your comparison makes sense. The ability to develop chips is what I'm pointing out. That does not exist everywhere, and developing it today is a monumental task. This is why chip design is limited to a few countries and fabrication to even fewer.
Out of this list, the books I am familiar with are great (the Hilbert-Courant, Spivak, and Körner books). At the same time, even with extensive mathematical training, I haven't read them from start to finish. I wouldn't even like to say "read": for someone who's not used to mathematical reading, some of these books require careful study. That means generating examples to understand results (theorems), trying your own conjectures, proving things yourself, etc. Over time, one becomes familiar with most or all of the material in a book, but the knowledge might have been acquired through various books (and courses) over time.
Also, mathematics is a massive field. The first question would be what kinds of mathematics would you like to get better at. There are great books in analysis. If you are starting out with a solid calculus knowledge, try Abbott's Understanding Analysis [1] or Duren's Invitation to Classical Analysis [2]. For asymptotic methods in PDEs, try Bender and Orszag [3] which is a wonderful book. But again, this might not be your cup of tea at all and there are more abstract or formal books like Rudin's.
If you want to approach fields without a lot of machinery, the graph theory books by Bollobás are great (but difficult). See his Modern Graph Theory book [4] as an example.
For linear algebra, one of my favorites (but it was after I already learned the subject) is Trefethen's Numerical Linear Algebra book [5]. Another beautiful topic is at the intersection of linear algebra and combinatorics. See Babai and Frankl's lectures freely available online.
Then there are wonderful topics in geometry. A massive mountain to climb would be algebraic geometry. For one starting point, see [6]. Differential geometry (Spivak's multi-volume work or Needham's differential forms book) is another wonderful area. I would recommend Crane's discrete differential geometry course at Carnegie Mellon [7] if you want a concrete introduction.
You might want to demystify a topic you have heard about. E.g. Galois theory and the unsolvability of quintic equations. You could look at [8] which guides your way through wonderful problems.
We haven't even touched huge swathes of mathematics including anything topological or number theory. Even within the topics mentioned above, once you start, your journey will take a life of its own and you'll encounter multiple books and papers opening up new sub-fields.
The only approach that worked well for me in the past was to get completely consumed by the one topic I was studying. This meant not getting distracted by multiple topics. Once one enters the workforce, this is very hard (or at least it has been for me). Without knowing someone, it's hard to recommend anything, but the advantage of topics like graph theory and combinatorics is that one needs less machinery (as opposed to something like algebraic geometry). These fields lead you to interesting problems very rapidly, and one can wrestle with them part-time.
Yes, elephants can certainly destroy crops. But proximity is a symmetric concept. The problem has been the massive expansion of our (speaking as an Indian) population (~360 million in 1950 to ~1.4 billion in 2021) into previously untouched areas, not the migration of elephants (and other species) into Bombay, Delhi and Chennai.
One of the (two) core postulates that Einstein made in special relativity is precisely that the speed of light in vacuum ("c") is the same in every inertial reference frame/coordinate system. This has been borne out experimentally, and the consequences of special relativity are seen every day at any particle accelerator (e.g., the time dilation of particle lifetimes).
The full equation is:
E = mc^2 / sqrt(1 - v^2 / c^2)
This is coordinate/frame-dependent since it depends on the speed v. Here, m is the (rest) mass of the particle, which is the same in every reference frame. You could define an effective mass m' = m / sqrt(1 - v^2 / c^2), but that obscures the point imo.
To address your point directly: when you say "as the energy to mass ratio goes up, the speed of light goes up", that is just not true, both mathematically and physically.
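A quick numerical illustration of the formula above: as v approaches c, the energy E grows without bound while c itself never changes (the electron's mass is used purely as an example):

```python
import numpy as np

c = 299_792_458.0   # speed of light in m/s, the same in every inertial frame
m = 9.109e-31       # electron rest mass in kg, also frame-independent

# E = m c^2 / sqrt(1 - v^2/c^2): as v -> c the energy grows without bound,
# while c itself never changes.
for frac in (0.0, 0.5, 0.9, 0.99, 0.9999):   # v as a fraction of c
    E = m * c**2 / np.sqrt(1 - frac**2)
    print(f"v = {frac:.4f} c  ->  E = {E:.3e} J")
```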
While that's definitely true, maybe one should look at this another way.
$400/mo saved in a 0% interest bank account for 40 years = $192,000
$400 * 12/year invested at a 10% nominal annual return (i.e., before subtracting inflation) for 40 years = $2.12 million
While $2.12 million in 2020 money is about $440,000 in 1980 money (assuming 4% inflation), it's still much better than the alternative.
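A quick check of that arithmetic, assuming end-of-year contributions of $4,800 compounded annually at 10%, with 4% inflation for the deflation step:

```python
def future_value(annual_contribution, rate, years):
    # Future value of an annuity: each end-of-year contribution is
    # compounded until the end of the horizon.
    return annual_contribution * ((1 + rate) ** years - 1) / rate

fv = future_value(400 * 12, 0.10, 40)
print(f"no interest     : ${400 * 12 * 40:>12,.0f}")   # $192,000
print(f"10% nominal     : ${fv:>12,.0f}")              # ~$2.12 million
print(f"in 1980 dollars : ${fv / 1.04**40:>12,.0f}")   # ~$440,000 at 4%
```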
A couple of other points:
* Investing $400/mo is relatively harder early in the 40-year timeline than later, assuming compensation increases at the inflation rate while the invested amount ($400 here) is kept constant (see the sketch after this list). This ignores factors like salary increases later in one's career (not sure if that even generalizes for lower-income professions).
* I wonder if any study has tracked whether there's actually a steady increase in spending consistent with the inflation rate for all individuals (with some stochasticity). Or do people adjust the goods they buy and "outwit" inflation? For example, the biggest purchase one usually makes, a house/apartment, doesn't scale with inflation. Similarly, electronics actually get cheaper. So, apart from food, does spending actually track inflation?
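On the first point, a tiny sketch (hypothetical $50,000 starting salary, 4% wage growth matching inflation) of how a constant nominal $400/mo shrinks as a share of income:

```python
# A constant nominal $400/mo shrinks relative to a salary that merely
# keeps pace with 4% inflation. The starting salary is hypothetical.
salary = 50_000.0
for year in (0, 10, 20, 30, 40):
    s = salary * 1.04**year
    print(f"year {year:2d}: $400/mo is {400 * 12 / s:.1%} of ${s:,.0f}/yr")
```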
>> The program is commonly used as a bridge for high-performing students to enter the U.S. job market, especially in tech.
Is the idea that students on OPT or employees on H-1Bs are all high-performing geniuses a widely believed myth? The implication (or maybe I am reading too much into one sentence) is something that I have seen quite often. Every student from an accredited university is eligible for a one-year OPT. Every student with a STEM degree is eligible for the STEM extension. The field one works in does not have to have anything to do with the degree, since the case is often made that the skills learned during the degree program are highly transferable. In my experience, it was very rare to not get an OPT or an extension approved. This made a master's degree the cheapest and fastest path to immigrate to the U.S. Whether this is fair or not is a collective judgement call for the U.S. citizenry, but it would reasonably be given very low or no priority when millions of citizens are suffering economically.