> Lines 1, 2 and 3, together, uniquely determine special and general relativity, cosmology, the Hilbert Lagrangian and Einstein's field equations, as told here.
This is a huge exaggeration.
For example, line 1 says "W=∫L" but L is never defined. The definition of L hides a lot of details, in particular that every point of the universe is equivalent, so there is a translation invariance that, via Noether's theorem, implies the conservation of momentum.
There are many ways to define L. For example, if you are analyzing a bead locked to a circular wire, it's usual to define L with a coordinate that follows the shape of the wire. There are also definitions of L in spherical coordinates, where each coordinate is different. Anyone with a major in physics has nightmares about all the possible ways to define L.
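To make the bead example concrete, here is a minimal sketch (my own notation, assuming a vertical circular wire of radius R in gravity g), where the single coordinate θ follows the shape of the wire:

    L(\theta,\dot\theta) = \tfrac{1}{2} m R^2 \dot\theta^2 - m g R\,(1-\cos\theta)
    \frac{d}{dt}\frac{\partial L}{\partial \dot\theta} - \frac{\partial L}{\partial \theta} = 0
    \quad\Rightarrow\quad
    m R^2 \ddot\theta = -\,m g R \sin\theta

The same physics written in Cartesian coordinates with a constraint looks completely different, which is the point about how many ways there are to define L.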
For example, in line 2, "v ≤ c" can be true in a stupid universe where the force is bounded and there is enough friction everywhere. Imagine a universe that is full of "water" or "aether" or some other magical thing with magical friction, and where for some reason the force is bounded, for example as the post says in line 3, "F ≤ c^4/4G".
Also, I can't imagine how it is possible to rediscover all thermodynamics from "S ≥ k" in item 5.
These kinds of comments block me from doing a physics dive; I don’t know where to start/am worried about starting from a bad point. I don’t know whether to follow some of the trails the linked author has laid out if I want to learn more, whether you’re nitpicking, or whether you’re advocating for a much better approach that starts with the details you say this obfuscates. When I was younger and had more time I’d have been willing to eat the inefficiency, and the bigger reasons for not doing a deep dive are the difficulty, my own laziness, and lack of practical need, so take what I’m saying with a grain of salt, but the time needed to parse criticism like this to see what’s legitimate and what’s not drives me a bit nuts.
I’ve done enough self learning to know that there is no perfect starting point/backtracking is inevitable, but something about the modern intellectual landscape feels very noisy/jargony and particularly prone to inefficient learning.
When you say “there is a translation invariance that, via Noether's theorem, implies the conservation of momentum”, I have no idea what that means. It sounds like a description of a turbo encabulator. I’m sure it’s not, and that your description is much more efficient than spelling it out from the ground level, but it annoys me how difficult it is to distinguish between a legitimate domain expert jargon meant to condense complexity and unnecessarily obfuscatory jargon that works more like a status/club membership symbol.
Maybe it’s always been like this somewhat and that problem is inevitable, idk.
One thing about the traditional sequence of physics topics, is that it progresses (more or less) in parallel with the traditional math sequence. So you can get yourself up to speed on both simultaneously.
Also, you don't need to get all the way to the whiz-bang stuff, for it to be both enjoyable and useful. Granted, my degree was 30+ years ago, but today I specialize in designing measurement equipment. If all you end up learning is calculus plus how some mundane things work like circuits and optics, it would hardly be a loss. I don't even claim to understand fundamental stuff like field theory.
There's also a problem with explaining physics: Most of us who claim to understand it (I have a physics degree) have never attempted to explain it or even think about it without math. There's the commonly shared anecdote about physicists trying to grasp the "non calculus based" college physics course, and finding it too complex and baffling to proceed.
> how difficult it is to distinguish between a legitimate domain expert jargon meant to condense complexity and unnecessarily obfuscatory jargon
Yep. It's a problem at all levels. Sometimes it's also difficult to distinguish real science from well-written crackpottery.
You can look at the webpages of a few universities. Many of the courses have their official bibliography visible, and that's a good start. Try to follow the same order of courses; it's impossible to understand quantum mechanics without a good base of classical mechanics.
Sometimes the main book for a physics degree is too technical. You can also try to read the Schaum's Outline book for the topic. They have a lot of examples and exercises, but the theoretical part is shorter. I like them as a side book, but if you don't want to become a super expert, they are fine.
If you want books without math, that's a problem. Some are good, some are bad, and it's difficult to distinguish. There are a lot of fun topics that you can learn without too much math. For example there are a lot of things you can learn about particle physics imagining that quarks are just small balls, but some technical details are too difficult without math (for example why there are 8 gluons instead of 9). There are some good popular-science collections that don't have too much math and are checked by a good editorial team.
> Noether's theorem
The main idea is that if you magically teleport everything in the universe one mile to the right, then nobody will notice the teleport. It's important that this is true for any other direction (what does "to the right" even mean?) and any other distance (like 1 foot or 1 light year, because "1 mile" is not special).
Obviously nobody has tried this experiment, but as far as we know the laws of physics are the same everywhere. For example, the mass of a proton is about 2000 times the mass of an electron here, the mass of a proton is about 2000 times the mass of an electron in a lab on the other side of the Earth, and the mass of a proton is about 2000 times the mass of an electron in Andromeda. So this thought experiment is just a good guess (if you ignore the curvature of the universe due to General Relativity).
But if this guess is correct, then Noether's theorem says that momentum is conserved, which is a property verified in a lot of experiments. So what initially looks like an abstract magical thought experiment has as its consequence an important number that is constant in every real experiment.
These constant numbers sometimes simplify calculations a lot. It's similar to the conservation of energy, which in some cases is useful to prove that something is impossible, or to get the final result without looking at the nasty details of the experiment.
The magical translation of everything in the universe is a symmetry, and the idea is that you can discover other symmetries of the universe. Ignoring some technical details, you can then use Noether's theorem to discover new numbers that are constant in all experiments.
Sometimes the symmetry is a symmetry of the whole universe that is easy to see, sometimes it's a more abstract symmetry, and sometimes it's just a symmetry of the experiment where you ignore the rest of the universe. There are many applications of Noether's theorem.
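As a one-line sketch of the translation argument in Lagrangian language (my notation, for a single coordinate x): if shifting x by a constant leaves L unchanged, then L cannot depend on x, and the Euler-Lagrange equation turns that into a conservation law.

    \frac{\partial L}{\partial x} = 0
    \quad\Rightarrow\quad
    \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{x}}\right) = 0
    \quad\Rightarrow\quad
    p \equiv \frac{\partial L}{\partial \dot{x}} = \text{const.}

Noether's theorem is the general machine that does this for any continuous symmetry, not just translations.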
I have a PhD in physics and left academia 25 years ago.
I would divide the learning of physics in four categories:
- the initial discovery part where you read about something for the first time (nutation, baryons, ...). It is best to have a shallow but interesting approach then, with some (or lots of) liberty on the precision.
- the "things are related" part which was extraordinary in my case. It is the part (about 2 or 3 years into the physics curriculum) where you discover that some things are closely related and you can reuse bits and pieces of what you knew from the previous part to see the "big picture". That part is the most fruitful because you can have your own internal discussions about physics and they are not difficult to confirm or not.
- the "let's dive into the details". That one is tough, very tough. You have a lot of details where you can miss what you are actually learning. There is a lot of math involved, up to voodoo math such as renormalisation. This is for the ones who want to make physics the main topic of their life (= academia).
- the "realization" part when you make peace with a lot of things you learned, where you trust math to drive some parts of physics, where you finally understand that there are some things that you will not understand (or that are not understandable with the current knowledge). This is a phase you get to sometimes after not having done physics for some time.
I would warmly recommend spending some time on part one with some "general public" books to get a hint of what is awaiting you. Then go to step two with an introductory/mid-level book for physics students, in the "introduction to physics" part. And then look further into selected areas if you want.
Start from conservation laws, get a feel for the typical mechanics problems. Acceleration, forces, work, energy, momentum. (At first don't try to understand WTF all that has to do with symmetries/invariants.) Then a bit of EM. (Circuit analysis, RLC circuits.)
Eventually you'll need some undergrad level math. To get a few very concrete (hehe) examples maybe look into a bit of structural dynamics (resonant frequency, tuned mass damper, mass distribution described by a matrix).
Then eventually you need to be able to solve simple differential equations and know what Fourier has to do with radios, and how mechanics depends on calculus of variations.
The more examples you work through the more you'll know what you want to learn about.
My recommendation is don't try to deep dive into one topic, instead start to look for similarities in other topics, basically do a breadth-first search.
I know that is not really what your comment is about, but symmetries, conserved quantities and Noether’s theorem are some of the neatest bits of fundamental physics.
Basically you observe that stuff stays the same if you move around (not true on earth, but like, in space, generally speaking). The physics is invariant under translation. That’s what we call a symmetry.
Now Emmy Noether came up with a relationship and formulae that connects each such fundamental symmetry with a conserved quantity. For translation the quantity turns out to be momentum. Conservation of momentum is one of the most fundamental building blocks of classical mechanics. And we can derive that it is in fact conserved from something as simple as a symmetry.
And it doesn’t stop there: next in line: invariance under rotation: angular momentum conservation, which is equally important as momentum conservation. If angular momentum wasn’t conserved, our universe would look entirely different.
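A quick sketch of the rotational case under standard assumptions (my notation: one particle in a central potential, plane polar coordinates):

    L = \tfrac{1}{2} m\,(\dot r^2 + r^2 \dot\varphi^2) - V(r)
    \quad\Rightarrow\quad
    \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot\varphi}\right) = \frac{\partial L}{\partial \varphi} = 0
    \quad\Rightarrow\quad
    m r^2 \dot\varphi = \text{const.}

Because L does not depend on the angle φ, the conjugate quantity m r² φ̇, the angular momentum, is conserved.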
Ages later people figured out that this symmetry business goes way deeper than initially thought. These U(1), SU(2) and SU(3) from the posted link are notation for other, less intuitive, symmetries. Here they don’t apply to space itself anymore but to quantum fields. With that and Noether’s theorem (and some admittedly kind of involved maths) you can derive theories describing all electromagnetic and nuclear interactions.
So basically, these symmetries are effectively at the heart of the world (and physics) and I think that’s amazing.
Physics is most illustrative in experimentation and good-enough approximations. None of the above plays a role in undergrad as far as I can tell, unless perhaps you're going for the Ivy League.
There is a very good starting point in following the historical developments: maybe Iron Age stuff, a bit of Platonic philosophy, including logic, rhetoric and philology at an A-level level. Purely theoretical physics is rather math-heavy final-year, post-grad stuff.
The linked page is crankery, but most of the physics it's vaguely gesturing at is standard undergrad material. QFT is not, but it's not too far out of reach either.
>I don’t know where to start/am worried about starting from a bad point.
Start from the basics. You will get plenty of ability to analyze real-world problems from learning Newton's laws. Lagrangian mechanics rarely even applies in engineering contexts because it can't do friction, air resistance or other dissipative terms.
Lagrangian mechanics can handle dissipative systems. In many cases, friction included, it's as simple as adding a dissipative term to the Euler-Lagrange equations. This seriously complicates the process of deriving them, but physicists were always handwaving their way through it anyway.
Hamiltonian mechanics faces more serious obstacles, though if you really want to you can engineer a noncanonical symplectic structure to capture the dissipation. Wouldn't recommend it though.
If you want to dive in to physics on your own, a textbook can be a decent starting point. Griffiths is pretty good, although if you’re starting from zero, it’ll be a challenge.
Step 3. Refer to the internet to give relief and contrast to your mental model.
Step 4. Take notes
Step 5. When you finish that textbook, use your knowledge as a foundation to select and consume the next.
Repeat from step 1.
There’s a self-learning approach absent of any dependence on ycombinator comments. And it’s plenty efficient—there aren’t even exams if you don’t want to take them!
Oof. I don’t see how you recover any of thermodynamics. S ≥ k is a really odd axiom to start with. It certainly doesn’t superficially imply the second law. It’s also rather at odds with the usual statistical formulation in which the entropy of a system in its unique ground state (or simply a system without meaningful microscopic states) is zero. Imagine trying to describe the behavior of a single (idealized) particle in a box using this law — you would get very confused very quickly.
For non-physicists: you can model a gas as a bunch of almost entirely non-interacting particles in a box. You can assume that there is some way for them to exchange energy, but you don’t need any details at all of how that works. With some care (and IMO a really fascinating series of arguments) you can derive the ideal gas law and a whole lot of useful thermodynamics. As a first hack at reducing it to one sentence, I might try “at equilibrium, each unit of phase space (or each discrete microstate in a discrete system) is equally likely.”
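A toy numerical illustration of that one-sentence version (my own sketch, not from any particular textbook): distribute indivisible energy quanta among weakly coupled oscillators, sample microstates uniformly at random, and the single-oscillator energy distribution comes out Boltzmann-like with no dynamics assumed.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "Einstein solid": N weakly coupled oscillators sharing Q indivisible quanta.
    # A microstate is the vector of occupation numbers. The exchange move below is
    # symmetric (move one quantum from a random donor to a random receiver), so at
    # equilibrium every allowed microstate is equally likely -- the "equal a priori
    # probability" idea is put in by hand, and that is the only physics here.
    N, Q, sweeps = 1000, 5000, 200
    n = np.full(N, Q // N)                   # start with the quanta spread evenly

    for _ in range(sweeps * N):
        donor, receiver = rng.integers(N), rng.integers(N)
        if n[donor] > 0:                     # reject moves from an empty oscillator
            n[donor] -= 1
            n[receiver] += 1

    # Marginal energy distribution of a single oscillator: approximately geometric,
    # i.e. P(E) ~ exp(-E / kT), a Boltzmann factor, with no dynamics assumed.
    values, counts = np.unique(n, return_counts=True)
    for v, c in zip(values, counts):
        print(f"E = {v:2d} quanta : fraction {c / N:.3f}")

The only input is "all microstates equally likely"; the exponential falloff in energy is what falls out.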
(I personally find statistical mechanics much more intuitive than classical thermodynamics. I have always felt like E, H, F, and G are mathematically correct quantities that come from manipulating partial derivatives but that they mostly lack intuitive value. The quality of the average thermodynamics textbook doesn’t help.)
> The quality of the average thermodynamics textbook doesn’t help.
The very beginning of Huang’s statistical mechanics text is a really good, really short summary of axiomatic thermodynamics. The bulk of the book is 60s-era stat mech models, but the first couple dozen pages contain a surprisingly lucid exposition of the classic story of Clausius, Thomson, and Carnot. Rumer and Ryvkin is also a good book that explores things purely from a perspective of phenomenological axiomatics before plunging into statistics, and includes for example a discussion of how the potentials determine the equilibrium state under various conditions.
Unfortunately, neither of these subsumes the other, and I wouldn’t really recommend either as a first-time introduction, so that mental spot is still empty for me. But as a way to demystify thermodynamics (which I find myself reaching for every few years with disturbing regularity) they serve well.
(I also have a stack of various people’s notes that finally made the Legendre transform click for me lying around here somewhere, if anybody wants them.)
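(For anyone following along: in this context the Legendre transform just trades an extensive variable for its conjugate intensive one, e.g.

    dE = T\,dS - p\,dV
    \qquad
    F \equiv E - TS
    \quad\Rightarrow\quad
    dF = -S\,dT - p\,dV

so F is the potential whose natural variables are T and V, which is why it governs equilibrium at fixed temperature and volume.)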
As a recovering theoretical physicist, I find things like this, which are primarily an attitude, tiresome. I'll even go further down the line: just because you can write down the "equations of motion" often doesn't give you anything but another starting point.
I will always be amused when people who study particle physics do everything after second quantization and thus assume that literally understanding plane wave interactions explains all physics, rather than recognizing that such methods are actually reductionist and have their own attendant assumptions that fail at certain points (like the UV limit for QED). That means you really don't "understand" all physics just by writing down the Lagrangian for the SM; it's just a starting point. "Understand" for a scientist shouldn't be abstract, it should be "given x equations/model I can predict y output", which is very hard for a lot of physics; otherwise there wouldn't be millions of scientists doing things other than string theory derivations every day.
Kind of the same as positing that a Lisp meta-circular evaluator or Smalltalk (syntax) fits on the postcard. There are so many more elaborations on semantics and further definitions missing
Indeed, the author's 2000-page online textbook, heavily promoted on the internet, is a classic trap for unwary students.
It looks alright at first: volume I is light on math, but full of neat examples. But it's full of intuitively plausible but slightly wrong statements which fall apart in more general situations, reflecting the author's lack of technical expertise. This problem steadily gets worse: volume IV is an oversimplified introduction to quantum mechanics which contains almost no math, and serious conceptual errors on almost every page. Volume V covers a bizarre mix of particle physics, consciousness, and sexual reproduction. And volume VI is the author's almost math-free personal theory of everything. Because the change is gradual, a student can get seriously misled without noticing, like the proverbial boiling frog.
On HN, people are always asking how to get started self-learning topics like physics. The tragedy is that this has been a completely solved problem for decades: the standard textbooks are excellent. But people don't hear that message because self-promoters pollute the discourse.
A caveat. Some years ago, at a first-tier university, some physicists and mathematicians were munching. A physics professor described how days earlier he thought he had found a case of a well-respected intro physics textbook saying something wrong. But, after some hours and days of thought, he realized the textbook was very carefully worded so as to not be incorrect. Yay. Most everyone smiled and agreed it was an excellent textbook.
A bit later, there was a quiet out-of-band question: So... if you're already an expert on the topic, and do a close read, after thinking about it for days, you will escape being misled... and this is a win??
There's an old physics education research joke: If you think your lectures are working, your assessment also isn't. I've found that to apply to much science education content as well.
Sorry, I have to disagree with this, at least with respect to quantum mechanics. The pedagogy of QM is atrocious because it generally focuses on the single-particle case and relegates entanglement to the sidelines while making a big deal out of the mystery of the measurement problem. This leaves students hopelessly confused. At least, it left me hopelessly confused for about ten years. Even today one hears physicists speak un-ironically of "quantum erasers changing the past" and other associated nonsense. If there's a standard text that inoculates against that, I have not seen it.
> Can you point to a standard textbook that does this?
That does what? Focus on the single-particle case and punt on measurement? My two poster children are the Feynman lectures and Griffiths.
> The ones I'm familiar with definitely don't shortchange multi-particle problems.
What does your reading list look like? Maybe things have changed since I last looked.
> the measurement problem not a mystery?
It might be a mystery, but it is not the mystery most commonly presented, namely, that particles change their behavior "when somebody looks." This is nonsense. Measurement has nothing to do with "somebody looking", it is just entanglement + decoherence. The only real mystery is the origin of the Born probabilities.
A) I don't consider the Feynman lectures a "standard textbook." I don't think there exists any university that uses them as the primary reference in their quantum course. They're fine, as far as they go, but I think modern pedagogy is better.
Concerning Griffiths, what do you feel it lacks? You've got the hydrogen atom, fermions, bosons, helium, and probably more stuff that I'm forgetting right now. What else would you stick in an intro course? Hartree-Fock?
B) Decoherence doesn't solve the measurement problem. Even the decoherence boosters admit this. See, for example, Adler's paper on this: https://arxiv.org/abs/quant-ph/0112095.
This isn't to say the decoherence program isn't important. I think it is. It just hasn't solved the measurement problem.
What Griffiths lacks is an explanation of what a measurement is. He, like many other authors, explicitly avoids this because he says that measurement is an ineffable mystery, but it isn't. A measurement is a macroscopic system of mutually entangled particles. The only real mystery is why the outcomes obey the Born rule.
Decoherence does not solve the whole measurement problem. Like I said, it does not explain the Born rule. But it does solve parts of the measurement problem. Decoherence explains why measurements are not reversible (they are reversible in principle but not in practice because you would have to reverse O(10^23) entanglements). It explains why only one outcome is experienced (because you are part of the mutually entangled system of particles that constitutes the measurement, and all of the particles in the system are in classical correlation with each other). I don't know of any standard text that discusses this at all.
Whether or not Feynman is a "standard text" is quibbling over terminology. A lot of people learn QM from it (or at least try to).
I'm sorry, but your description of how decoherence purportedly solves parts of the measurement problem is incorrect.
Even decoherence researchers agree that decoherence theory does not do this. You can find references and details in the Adler paper I linked, or in Schlosshauer's "Decoherence, the measurement problem, and interpretations of quantum mechanics." (Schlosshauer is the author of a main reference on decoherence: http://faculty.up.edu/schlosshauer/index.php?page=books.)
So, the reason that Griffiths avoids giving the explanation of measurement you prefer is that it is wrong. It's a virtue of the book, not a fault. He does discuss decoherence on page 462 of the third edition, though.
> Even decoherence researchers agree that decoherence theory does not do this
Yes, but they are wrong. And it's not hard to see that they are wrong.
The crux of the argument is that the state predicted by QM:
|S1>|A1>|O1>|E1> + |S2>|A2>|O2>|E2>
where S is the system being measured, A is the measurement apparatus, O is the observer, and E is the environment, is not what is observed. What is observed is either:
|S1>|A1>|O1>|E1>
or
|S2>|A2>|O2>|E2>
neither of which is the predicted state above. Except that it is because |S1>|A1>|O1>|E1> is what is predicted to be observed by an observer in state |O1> and |S2>|A2>|O2>|E2> is what is predicted to be observed by an observer in state |O2>. It is not that the prediction is wrong, it is that you, a classical observer, are not sufficiently omniscient to see both observations. You can only see one or the other. And this too can be explained, though by quantum information theory rather than decoherence theory. In order to be a classical observer it is necessary to be able to copy (classical) information. The only way to do that is to discard some of the (quantum) information contained in the wave function. Being non-omniscient (i.e. being unable to directly observe a superposition) is a necessary precondition of being a classical observer.
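A small numerical illustration of the "branches stop interfering" part (my own numpy sketch; it says nothing about the Born rule): build a state like the one above with a qubit system and a two-state environment, trace out the environment, and watch the off-diagonal (interference) terms of the system's density matrix disappear as the environment states become orthogonal.

    import numpy as np

    def reduced_density_matrix(overlap: float) -> np.ndarray:
        """System+environment state (|S1>|E1> + |S2>|E2>)/sqrt(2), with <E1|E2> = overlap.
        Returns the system's 2x2 density matrix after tracing out the environment."""
        s1, s2 = np.array([1, 0], complex), np.array([0, 1], complex)
        e1 = np.array([1, 0], complex)
        e2 = np.array([overlap, np.sqrt(1 - overlap ** 2)], complex)  # <E1|E2> = overlap
        psi = (np.kron(s1, e1) + np.kron(s2, e2)) / np.sqrt(2)        # 4-component vector
        rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)           # indices: s, e, s', e'
        return np.trace(rho, axis1=1, axis2=3)                        # partial trace over e

    for ov in (1.0, 0.5, 0.0):
        rho_s = reduced_density_matrix(ov)
        print(f"<E1|E2> = {ov:3.1f}  ->  interference term = {rho_s[0, 1].real:+.2f}")
    # Prints +0.50, +0.25, +0.00. Once the environment "records" which branch
    # happened (<E1|E2> -> 0), the off-diagonal terms vanish and the system is
    # indistinguishable from a classical 50/50 mixture -- but nothing here picks
    # which outcome you see, or why with Born-rule probabilities.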
After reading your comment I looked at the original post with a view to identifying its basic perspective.
https://www.motionmountain.net/9lines.html was clearly not written by a physicist, as it includes the line "The nine lines contain physics, chemistry, material science, biology, medicine, geology, astronomy, engineering and computer science. It appears that the nine lines contain all natural sciences!" which is not a physics type of statement.
Obviously, physics contains everything, just the way "bits contain all software". Observing that doesn't make someone a programmer, in my opinion.
Why stop at physics, chemistry, material science, biology, medicine, geology, astronomy, engineering and computer science (list provided).
However, it is kind of like saying text comprises all of the words of Shakespeare. Well, sure, but that doesn't mean that you get Shakespearean criticism from writing a text editor. It's two different fields.
However, the lines "Isn't this incredible?" and "Enjoy searching for answers." are sarcastic and disrespectful to the work actual physicists put into their discipline for decades and the author should be ashamed for this tone.
If they want to be a physicist they should observe and learn from physicists at every opportunity. (I'm not one, by the way.)
Ignoring the rude sarcasm, we can correctly summarize the document as "Physics governs everything. Period."
It's not a physics document. It is (or rather ought to be) a tribute to physicists and the unbreakable laws of physics.
Thank you for your candid feedback. I am among those who have been working in physics for decades. To avoid such misunderstandings in the future, I have improved a few lines.
One remark: the text is really meant to state that the specific 9 lines given do contain all of physics (and thus all other natural sciences).
The argument differs from "words and Shakespeare": the nine lines describe nature exactly, within measurement precision. That is the central content of that page. It is not that physics governs everything. It is that those specific 9 lines describe all measurements, all observations, and contain all equations. This is a much stronger statement.
The Physical Review is a gigantic set of journals, and like anything of its size, many of its published results are wrong. At this very moment I'm writing a rebuttal to a PRD paper that arrived at nonsensical conclusions due to some basic algebra mistakes.
"Several"? Google Scholar shows two, one of which is an approximately 1-page comment on a rebuttal to the other one. Neither mentions "strand theory," the focus of the criticism in that link.
1. Total compute capacity is limited and redistributes toward areas of high activity.
2. Processor speed is limited.
3. Mutations are sensibly constrained to prevent overflow.
4. Smallest addressable memory space.
5. No process is ever fully idle.
6. IPC is done using circles.
7. The kernel however operates at the level of spheres, and pointers to pointers to pointers to spheres.
8. There's roughly 18 niceness levels, but rumors are there are a couple more.
9. About 25 programmers worked on the project and at the end each one got to pick a nothing-up-my-sleeve number. Some say there are Easter eggs hidden there yet to be discovered.
It makes intuitive sense to me that computers are physically constrained in all of the ways (and more) that the universe itself is physically constrained. I don't know why the inclination is to take these fundamental constraints and then say "the universe is in a computer" instead of saying "the computer is in a universe".
> This pdf file is licensed under the Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Germany Licence, whose full text can be found on the website creativecommons.org/licenses/by-nc-nd/3.0/de, with the additional restriction that reproduction, distribution and use, in whole or in part, in any product or service, be it commercial or not, is not allowed without the written consent of the copyright owner. The pdf file was and remains free for everybody to read, store and print for personal use, and to distribute electronically, but only in unmodified form and only at no charge.
Doesn't that "additional restriction" negate the CC-BY-NC-ND license? What does it even mean?
The 9 equations aspect is pretty pointless but the chapters on the right hand side could be ok.
One of the things that makes physics different to most subjects is that we can make an explicit goal of being smug and rederiving the laws of physics from the simplest possible principles. I think the book seems to do this, but just summarizing like done on this webpage is a bit daft.
I think the way these are phrased does a better job of expressing the "vibe", for lack of a better word, of our current understanding. That the particles and constants are declared by fiat, with behavior constrained by simpler laws, is an understanding I didn't come to until I took a QFT course. The physicist's God is as concerned with forbidding seemingly pointless things as the biologist's God is with beetles.
And yet the vibe persists! Presumably it's due to low-level patterns you subconsciously pick up on, but much of the much-discussed physicist-intuition comes from knowing how the universe "likes" to do things. There's a similar thing in math, with many of the greats having just a few tricks they used over and over to great effect in any number of fields. I guess we're used to dealing with people, so the pattern is expressed as a preference (which does double duty encoding that it's not an absolute law).
If large chunks of our brainpower are actually hardwired to think in terms of other humans¹, anthropomorphism might just be a strategy to recruit those hardwired systems into helping out with other tasks.
I've read a few times that you can derive the Maxwell equations from knowing that U(1) exists, as this article implies, that every point in space has (among other things) the symmetry of a circle.
But I've never seen this derivation. I assume it's either trivial or too complicated. Anyone has a link or thoughts on this?
I wrote an ELI25 explanation in a comment a few months ago, and I'll copy it here:
In quantum mechanics the wave function Ψ has complex values. If you multiply everything in the universe by -1, nothing changes because all the physical results use ΨΨ* (where * is the complex conjugation). You can also multiply everything by i or -i. Moreover, by any other complex number of modulus 1, because ΨΨ* does not change. (The technical term for this is U(1) global gauge symmetry.)
But you can be more ambitious and want to multiply each point of the universe by a different complex number of modulus 1. ΨΨ* does not change, but the derivatives of Ψ change and they are also important. (When you use the same complex number everywhere, the derivative is just a multiple of the original derivative. When you use a different number at each point, it changes in a more complicated way.)
The only way to fix the problem with the derivative is to add a new field A. When you multiply each point of the universe by a different complex number of modulus 1, then A changes in a simple-to-calculate but not obvious way. The change in A fixes the problem with the derivatives of Ψ.
So now the equations of the universe with Ψ and A don't change when you make this change. (The technical term for this is U(1) local gauge symmetry.) When you carefully write down what a universe like this looks like, the new field A is electromagnetism. (Actually, you get the electric and magnetic fields from the derivatives of A.)
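In symbols, the fix described above is the standard covariant-derivative construction (textbook form, not anything specific to this comment):

    \Psi(x) \to e^{i\alpha(x)}\,\Psi(x),
    \qquad
    A_\mu(x) \to A_\mu(x) - \tfrac{1}{e}\,\partial_\mu\alpha(x)

    D_\mu\Psi \equiv (\partial_\mu + i e A_\mu)\,\Psi
    \quad\Rightarrow\quad
    D_\mu\Psi \to e^{i\alpha(x)}\, D_\mu\Psi

So every ordinary derivative gets replaced by D_μ, and the combination F_{μν} = ∂_μ A_ν − ∂_ν A_μ (the electric and magnetic fields) is left unchanged by the transformation.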
> I assume it's either trivial or too complicated.
It's trivial once you have 3 or 4 years of studying physics, but you will never understand how it is related to the magnets in your refrigerator. [There are like 5 simplification steps between U(1) and the magnets in the refrigerator. Each one makes sense, but I can't see all of them together in my head.]
Add to your theory/model. In order to make the symmetry work, you need something to "fix" the discrepancy you would otherwise have in the equations. So you add this field (variable) to make the whole thing consistent. In the end, you see you needed this "gauge field" (for historical reasons these are called "gauge symmetries") which is electromagnetism.
If A and B are zero then F is zero, and you can forget the second part of the right-hand side of the equation. Also, you must replace D with ∂.
Now you have equation of electrons and positrons that move in a universe that has no electromagnetism. The important part is that in that equation, the only variable is ψ(t,x,y,z) that appears twice, the rest of the things written there are just derivatives, constants or indices.
Now you can turn on B(t,x,y,z), which represents the external field, so the electrons and positrons move in more interesting patterns, but they don't "see" each other. Again, the only variable is ψ(t,x,y,z).
Now you do the trick with a local U(1) symmetry, and the only way to do the trick is to add a new variable A(t,x,y,z). But you must use the complete equation because as explained in Wikipedia F(t,x,y,z) is calculated using the derivatives of A(t,x,y,z). So now you have two variables ψ(t,x,y,z) and A(t,x,y,z).
So you "add a new field A" to the list of variables that appear in the right hand of the equation, or to be more precise, you get a new equation that has one additional variable A.
In short, if you impose a U(1) symmetry on your Lagrangian, you get electromagnetism. Basically you do the action of U(1) and enforce that the equation of motion does not change. That gives you the electromagnetic force (a photon).
Putting in some of those keywords will find plenty of detailed notes online, like
There are plenty of so-called "derivations of the Maxwell equations"; however, I would not claim that they are necessarily based on U(1). IMO ME are just the integrability conditions for any conserved quantity (continuity equation): take a generic three-form J in R^4, then dJ=0 => J=dF. F has six components corresponding to two vector fields in R^3 which depend on the fourth coordinate and satisfy the div and curl relations as in ME ;)
U(1) symmetry (R also works in a classical setting: it's the Lie algebra that matters) is exactly how you get a "generic three-form" to drop out of the action.
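Spelled out (a sketch in standard conventions, with signs and units glossed over), the differential-forms version being alluded to is:

    dF = 0
    \quad\Longleftrightarrow\quad
    \nabla\!\cdot\!\mathbf{B} = 0,
    \qquad
    \nabla\times\mathbf{E} + \partial_t\mathbf{B} = 0

    d{\star}F = J
    \quad\Longleftrightarrow\quad
    \nabla\!\cdot\!\mathbf{E} = \rho,
    \qquad
    \nabla\times\mathbf{B} - \partial_t\mathbf{E} = \mathbf{j}

and since d² = 0, the second equation automatically gives dJ = 0, i.e. charge conservation, which is the integrability/continuity statement being pointed at above.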
Can somebody with knowledge in biology and/or physics take a look at (one of) his book and comment if it's decent as in if the science is solid? I browsed https://www.motionmountain.net/motionmountain-volume5.pdf and I'm intrigued (there are no reviews in Amazon).
I skimmed the first ~150 pages of that and I'd say it's exactly as cranky as you'd think for a 400-page-long self-published book about science.
Real equations are introduced without defining any terms or notation and then quickly abandoned for discussion of other things. For instance, the QED Lagrangian is brought up on page 126 without defining any of the highly-specialized notation involved (the slashed partial derivatives) and then the discussion moves on from it without _doing_ anything with it.
The QED Lagrangian gets only slightly more words than "the three lightbulb scams" (p114). The term "spinor" is used several times but defined exactly zero times.
The truly "wat"-inducing parts are the "Challenges" sections, for instance the first one on p29. An example: "Challenge 12: Do birds have a navel?"
The earlier ones aren't terrible, but yeah, they get real bad real fast. There are lots of good introductory resources from people who know what they're talking about.
Obviously this won't help you find the particles but this seriously isn't far off capturing how shocking symmetry groups are as a tool for building theories in particle physics. The fact that you can reduce so much physics down to some invariance is very spooky.
the mass of the up quark
the mass of the down quark
the mass of the charmed quark
the mass of the strange quark
the mass of the top quark
the mass of the bottom quark
4 numbers for the Kobayashi-Maskawa matrix
the mass of the electron
the mass of the electron neutrino
the mass of the muon
the mass of the mu neutrino
the mass of the tau
the mass of the tau neutrino
4 numbers for the Pontecorvo-Maki-Nakagawa-Sakata matrix
the mass of the Higgs boson
the expectation value of the Higgs field
the U(1) coupling constant
the SU(2) coupling constant
the strong coupling constant
the cosmological constant
I opened a volume of the books linked in the page, and found this:
« Learning allows us to discover what kind of person we can be. Learning widens knowledge, improves intelligence and provides a sense of achievement. Therefore, learning from a book, especially one about nature, should be efficient and enjoyable. Avoid bad learning methods like the plague! Do not use a marker, a pen or a pencil to highlight or underline text on paper. It is a waste of time, provides false comfort and makes the text unreadable. And do not learn from a screen. In particular, never, ever, learn from the internet, from videos, from games or from a smartphone. Most of the internet, almost all videos and all games are poisons and drugs for the brain. Smartphones are dispensers of drugs that make people addicted and prevent learning. Nobody putting marks on paper or looking at a screen is learning efficiently or is enjoying doing so. »
I can't say what the author's intent was, but I see two ways to parse this: one I disagree with, one I agree with.
The disagreeable interpretation is that the only way to learn is from a pristine book. This is so obviously stupid that I see no reason to discuss it further.
The other interpretation, which I think is more useful but unclear from the text, is that _passive activities are subpar learning activities_. From what I've read of effective studying techniques, things like highlighting text don't help. Active recall techniques like spaced repetition are much more useful for cementing learned knowledge. Likewise, if you're "learning" from a screen by just staring at it, consuming it, then your knowledge-to-inputs ratio is going to be low. I've watched plenty of YouTube videos on how to do numerous things, but I couldn't do any of them on the spot because I've never tried. The knowledge went in one ear and out the other. The material must be engaged with somehow.
> The disagreeable interpretation is that the only way to learn is from a pristine book. This is so obviously stupid that I see no reason to discuss it further.
Agreed. I actually prefer to see markings in used books I'm reading. Not only does it provide possibly useful context into how another person is thinking about the topic, but it even makes it feel like there's a bit of friendship or community in the book.
Highlighting and underlining don't per se cause you to learn. However, they do have two useful properties.
1. The act of looking for and underlining or highlighting text helps you to focus on identifying core ideas or concepts in the text.
2. Underlined and highlighted text are useful as hints when reviewing a text and reviewing a text is a critical part of learning.
These properties are so helpful that many texts actively help you to absorb their content by bolding, underlining, or otherwise calling attention to core concepts and ideas for you.
> Do not use a marker, a pen or a pencil to highlight or underline text on paper. It is a waste of time, provides false comfort and makes the text unreadable.
What a load of elitist BS (pardon my French). When I was at University we were trained early on to cut the notion of a book being sacred. We should mark, note, create meaning on the pages with pens.
I would not have been able to learn as much in my life if I had not taken this advice seriously.
> In particular, never, ever, learn from the internet, from videos, from games or from a smartphone.
Yeah. Exactly. OK. So I have not learned new things recently about both World Wars through documentaries telling me about findings that I learned neither in school nor at university studying history (among other subjects).
I know. I am just anecdata. But wow. What a view on learning in the quotes book.
I think the author does have a point though it clearly doesn't apply to everyone. 'Elitist BS' is rather strong. You take a different view but perhaps you've never studied in a part of the world where books are in short supply and one expensive copy is shared. I underlined passages in books in the past and now regret imposing my then partial view via permanent marking. Just a thought but a cataclysmic extinction event on Earth might well either destroy most electronic media or put it beyond general access. Surviving physical books might then become generally sacred once more as they are in many people's present day collections. We're rather too ready to think that the current ease with which we can check out almost any topic will be forever a permanent fixture. There are days when I doubt it.
I have highlighted and marked books and I tended to buy the cheapest copies possible (or buy used books) as I was quite tight on money when I was at university.
What I discovered was that I could trace my understanding and my development towards concepts through the times when I returned to a book that I already had worked with before. My older notes and highlights were often the starting point but with my then current understanding I built on them, contradicted them in parts and sometimes laughed about my former ideas. It made me more humble towards knowledge generation.
To me it was a blessing to have different layers of marks, notes & highlights.
If a cataclysmic extinction event occurred, I doubt a little bit of highlighting or underlining is going to cause significant burden on mankind's ability to relearn things. I do agree though, that if you don't own the book you probably shouldn't mark in it.
I sort of agree with the book on highlighting and underlining. If you don't jot a note down in the margin, you'll forget your state of mind and have no idea what made some underlined section meaningful. A false comfort.
When I taught LaTeX classes, I would often see students highlighting whole pages of the class notes I passed out. I often said that I should just print them on yellow paper to save them the trouble.
Marking the essentials (or better, the essentials based on your current understanding) is different from marking whole sections or pages.
I remember one of my teachers saying he needs a fresh copy of some books, like Goethe's Faust, about every 10 years, because new understanding makes him mark different passages (in a different color) than on a previous reading. And after teaching and using a book for about 10 years he needs fresh pages to mark.
Yeah, this is clearly BS. Recently I changed the turn switch on my car. How did I learn how to do that? From the Internet, specifically from watching videos. It was easily the most efficient way to learn.
If you steelman the author's claim, the unsaid interpretation is, the optimal way of learning Physics is to solve questions. Lots of them. On the order of thousands for an undergrad Physics curriculum.
Reading the textbook and getting some concepts out is only a precursor to real understanding which is when you apply those concepts to scenarios in questions. Far too many people get stuck thinking reading the book and highlighting is sufficient, but it is only the foreplay not real sex.
I like it. The author is being real. This is what he sincerely thinks. You don't have to agree with everything an author says. I for one appreciate the candor.
It's also a proven fact that many smartphone applications are effectively dispensers of dopamine at regular intervals. People check their social media like smokers take a smoke break. The author is right...albeit a bit terse and dramatic.
I think he's just old fashioned and believes learning from physical paper books is superior. He even has a link at the bottom of his site that says "Paper book lovers."
But we have studied learning and how memories are formed. I hate when people make stuff up and assume that because it worked for them it must work for everyone. Things like spaced repetition and Khan Academy are completely valid ways to learn.
no there are a lot of old academics that think this way, it’s almost like a form of self-flagellation… if you’re having fun then you’re not learning hard enough
one professor I knew like this (decades ago now) was the type of person that would violently throw chalk at students with the gall to not hang on every word he was saying
Yeah, not sarcasm. But contradictory for a free online downloadable.
I do like the next advice, to be able to say the ideas in your own words. Maybe the point (which seems very overstated) is that highlighting and videos create the sensation of knowing, but not necessarily the knowing.
Why do the first chapters of all physics textbook ONLY talk about Greek philosophers and scientists and NEVER talk about contribution to Science from eastern civilization like India and China ? These civilizations have a very rich heritage and astounding contribution to many underlying scientific philosophies. Yet generation after generation of writers ignore them and it is sad to see that being true even today. It is time that this is changed.
Off the top of my head, I can name three things from either China or India that are science-related: the positional numbering system (and zero), the Chinese remainder theorem, and gunpowder. What am I missing that's very important yet rarely mentioned?
But I doubt that many conventional physics textbooks even mention these and others. The writers are not to blame, but no doubt the ignorance is an unfortunate reality.
“ bosons, quarks and leptons – with their charges and properties make up everything.”
The empty space between the bosons, quarks and leptons is not made up of any of these particles, yet the space ‘exists’ - without it, everything in the universe would just be a big clump.
I think the future of physics is in Energy Wave Theory. Maybe not that theory exactly, but a similar theory going back to the idea of some kind of ether in which waves propagate.
An impossible computer is different from a computer that humankind can not yet build. If our universe is a simulation, then some entity has already proven you wrong. If our universe isn't a simulation, our limited resources prevent us from running (detailed) universe-scale simulations. I'd say that we will eventually be able to simulate a drug interaction from these rules (it's the interaction between these rules that's the key) but I won't predict WHEN that might occur.
No, I’m saying no computer will ever simulate it with only those equations. Higher level equations are needed which cannot be derived from the given ones, they can only be discovered through additional science.
I would take this with a huge grain of salt. But generally it's quite astonishing with how few assumptions such complex theories can be derived. Also the mathematical apparatus really has all the batteries included, speaking of the Lagrangian formalism and its action (point 1).
I mean you have the groups (6. and 7.), and from there you get all the variables. E.g. from U(1) you get the generator and the group element. Then add all combinations of the polynomial terms to L (1.), but skip those that violate invariances like Lorentz (2.), U(1), SU(2), SU(3). You end up with something like L = -1/2 (d phi)^2 - m phi^2 (oversimplified; see the sketch below). Do the same for SU(2) and SU(3) and you end up with http://nuclear.ucdavis.edu/~tgutierr/files/sml2.pdf For convenience, some of the variables are then usually called e (electron), W+ (W+ boson), etc.
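Written a bit less telegraphically, the kind of term meant there is the free real scalar Lagrangian (standard normalization, my transcription):

    \mathcal{L} = \tfrac{1}{2}\,\partial_\mu\phi\,\partial^\mu\phi - \tfrac{1}{2} m^2 \phi^2

and the requirement that every term respect the Lorentz and gauge symmetries is what forbids most of the other polynomial terms you could in principle write down.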
No, it is fun! The scientists who have been looking for exceptions to the standard model and to general relativity for 50 years are having a crisis at present, as is often told.
The 9 lines imply all equations of physics. Every Lagrangian of physics is included. The lines are also coherent: none contradicts another. They are complementary: they cover all observations and all fields of physics. No field of physics is left out. The lines are also correct: every calculation fits with observations within measurement accuracy, since the standard model (with neutrino masses and PMNS mixing) and general relativity have existed.
Those 9-lines don't imply all of physics. They don't even mention most of physics. A lesser problem is that they're not correct, either.
For example, Line-5 suggests that entropy is never below the Boltzmann-constant. Which simply isn't true; there're notions of zero-entropy, where zero is less than the Boltzmann-constant.
For another example, Line-2 suggests that nature itself is local; this would contradict non-local effects, e.g. entanglement, and would seem to prohibit faster-than-light recession.
Or, maybe those lines were meant in a way that doesn't have those problems? But maybe they have different problems? Who knows! -- which is the bigger problem.
Of course the 9 lines imply all of physics. Just mention a part of physics that is missing, and I'll buy you a beer.
"Zero entropy" is indeed against the laws of physics - in this universe. It may be different in other universes.
Line 2 does not speak about locality, but about the speed of light. Entanglement does not violate the speed of light - in this universe. It may be different in other universes.
If you know what a Lagrangian is - in quantum theory, in quantum field theory, in the standard model and in general relativity - you also know that there is no random interpretation in the 9 lines.
There's no Lagrangian in those 9-lines. Nor is there Quantum-Theory, nor the Standard-Model, nor General-Relativity. Nor is science there, really.
Which is kinda my point -- those 9-lines aren't all of science.. unless, I guess, if you assume that all of science is a given. But then, why even have 9-lines when 0-lines could do?
Then it's hard to avoid critiques because there're obvious flaws. For example, yes, there're totally productive notions of zero-entropy -- even if not in the models you're used to. For another example, macroscale-constants haven't been demonstrated to emerge from the Standard-Model -- for example, it hasn't been demonstrated that astronomical-scale measurements aren't influenced by unknown factors, such as many-body forces, which might cause results that'd differ from those predicted by the Standard-Model. And since we can measure some of those things not known to emerge from the Standard-Model, the idea that the Standard-Model captures everything -- including those things not known to emerge from it -- doesn't follow.
But then that seems to be getting off-topic, because while there'd seem to be many things off about this, the one I'd really stress is that those 9-lines don't contain what they claim to.
I can see how this is almost offensive to the whole subject of Experimental Physics since it ignores all of it - and for that matter Solid State Physics as well. But for Theoretical Physics (minus Mathematical Physics) it seemed to me almost like a sport where professors tried to boil the theory down to a minimal set of assumptions while taking symmetry arguments to the extreme. Obviously this doesn't contain any QM postulates, but OTOH for practical purposes QFT is what many people use to compare theory with experiment. IMHO Theoretical Physics without Experimental Physics is useless and vice-versa. And there isn't much research going on with QM (despite plenty of open questions).
You're assuming free will isn't some complex-looking behavior driven by quantum uncertainty. In the multiverse, every social scientist has many clones whose studies refute their work!
Even if that is so, a quantum description of human behavior is fairly useless in terms of practical application. How does knowledge of the standard model cure the infirm, prevent crime, and so forth?
When we exhibit behaviors. Our context is derived from our visual/auditory field processing from birth. Those waves implant magnetic field signatures on our neurotransmitter morphologies. We are almost explicitly operational in quantum activity.
Many if not all disabilities or health complications are resonant to your mind body homeostasis. Challenge the quantum spectral development process of a person you solve the aforementioned issues.
The natural sciences, the so-called "hard sciences" are really the easy ones, because what they study doesn't have free will. The truly difficult sciences are of human behavior.
> The nine lines contain physics, chemistry, material science, biology, medicine, geology, astronomy, engineering and computer science.
I have doubts that these 9 lines describe computer science. As I interpret it, computer science is founded on pure mathematics and is not a physical or empirical science. Computer science deals with perfect abstract mathematical objects like numbers, sets, quantification, recursion, infinities, etc.
Considering that these lines specify our laws of physics only if you grant that math is a thing, you could argue that all of mathematics gets in for free ;). Sort of like how ZFC can't encode basic logic, but you still need it for axioms to make sense
It's the lowest level. Applying this to, say, medicine is like trying to understand a React app in terms of a Turing Machine. It's involved, but it doesn't help because of all of the extra rules (which are not fundamental properties of the Universe or of mathematics) that we add on top. CORS isn't a fundamental law of the Universe, but all major browsers enforce it, so you have to deal with it.
None of us can discount the possibility that the universe was fine tuned with a specific set of seed parameters which yields a specific trajectory including all of the narrative elements of one person or everyone’s lives.
Most likely the big narrative payoff is somebody winning the claw game for once at the local sports bar. But which of us will it be for?