We see four distinct forces in the universe: the strong and weak nuclear forces, electromagnetism, and gravity. Our Standard Model predicts that these forces become more unified at higher energy levels:
1. Electromagnetism merges with the weak nuclear force at 246 GeV
2. The electroweak force merges with the strong nuclear force at around 10^16 GeV ('grand unification')
3. Finally, all four forces, gravity included, are expected to become unified, which is to say indistinguishable, at the Planck energy, around 10^19 GeV.
As we get closer to the predicted unification energies, we see different mixes of particles; the Higgs boson is in fact the particle responsible for electroweak symmetry breaking, with a mass of around 125 GeV.
The problem is that the LHC produces collisions of around 10^4 GeV, so from our current energy scale up to the next unification, with the strong force, we're off by a factor of 10^12.
A back-of-the-envelope estimate is that a supercollider able to reach grand-unification energies with our current technology would be around the size of the solar system.
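For a rough sanity check on that estimate: at fixed magnetic field, a synchrotron's bending radius grows linearly with beam momentum, r = p / (0.3 B) with p in GeV/c and B in tesla. A minimal sketch, assuming (hypothetically) that you could line the whole ring with LHC-class ~8.3 T dipoles:

```python
def bending_radius_m(momentum_gev, field_tesla):
    # r = p / (0.3 * B) for a singly charged particle,
    # with p in GeV/c, B in tesla, r in meters
    return momentum_gev / (0.299792458 * field_tesla)

AU = 1.496e11  # one astronomical unit, in meters

# GUT-scale beam (~1e16 GeV) held by LHC-class 8.3 T magnets
r = bending_radius_m(1e16, 8.3)
print(r / AU)  # tens of thousands of AU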
Hence the article: particle smashing is a brute-force approach to investigating new physics, but now there is an extremely wide gulf between what we have discovered and what we think lies next, so we need to be more clever than brute force.
Which is to say: Those unification scales come from speculative models, not from tested physical law. We do not know with any certainty what lies beyond the energy scales we've tested. You should take with a generous grain of salt any claims that we have physical models that work across an additional 14 orders of magnitude. Consider the range of phenomena that we've discovered in moving from the ~1 cm scale of a glass of water to the 10^-14 cm scale where quantum chromodynamics kicks in.
I was at a seminar by Alexander Polyakov, one of the founders of string theory, who compared string theory to Democritus's knowledge of atoms. He said that even though the atomic theory of Democritus was essentially correct, it was about 2,000 years ahead of any possible experimental verification. In his opinion, that is how long it will take us to verify our grand unified theories experimentally.
I didn't understand your statement about the Einstein-Hilbert action: the cosmological term there has a dimensionless coefficient of order 1. Are you talking about some effective action arising from a string theory model?
At the Planck energy scale this is no longer the case: the relativistic mass of particles becomes so big that the gravitational interaction is too strong to be ignored, and the SM loses its ability to predict how particles interact, decay, combine, etc.
So you're guaranteed to then see 'new physics'.
It's not relativistic mass that's the key factor: relativistic mass is frame-dependent, and it's not the source of gravity. The relevant factor is stress-energy: energy density, momentum density, pressure, and other stresses. The key point at the Planck energy scale is that the density of stress-energy is high enough that we can no longer have confidence that classical General Relativity is an accurate description of gravity; we expect to see quantum gravity phenomena at that stress-energy density.
The stress-energy tensor does have a momentum component, but remember that all components of a tensor are frame-dependent. So is "gravitational pull". Obviously the trajectories of two particles passing each other at relativistic speeds will be different from the trajectories of two particles that are at rest relative to each other at some instant; but the difference is not quite as simple as "more gravitational pull", although the two particles having relativistic velocities does mean that the center-of-mass energy of the system is larger than it would be if both particles started out at rest.
(Actually, the electromagnetic interaction between electrons is so much stronger than the gravitational that the gravitational effects are negligible in the scenario as you state it; but we could eliminate that issue by considering, say, two neutrons instead. My comments above assume that the scenario has been modified accordingly, which is why I said "particles" instead of "electrons".)
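For concreteness, here's that electron-electron comparison with standard constants; since both forces fall off as 1/r², the ratio is independent of separation (constants below are rounded CODATA values):

```python
# Coulomb force vs Newtonian gravity between two electrons;
# both scale as 1/r^2, so the ratio is distance-independent.
k_e = 8.9875517923e9   # Coulomb constant, N m^2 / C^2
G   = 6.67430e-11      # gravitational constant, N m^2 / kg^2
e   = 1.602176634e-19  # elementary charge, C
m_e = 9.1093837015e-31 # electron mass, kg

ratio = (k_e * e**2) / (G * m_e**2)
print(f"{ratio:.2e}")  # roughly 4e42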
Still way too hard for us but also way easier than building a tube going around the solar system.
There's another issue though.
The reason that accelerators are circular is so that you can keep applying an electric field to accelerate the particles. The catch is that when you accelerate a charged particle, it emits a bit of radiation, and this radiation increases with the amount of acceleration it's undergoing. The problem is that while you can keep nudging the particles faster and faster tangentially to the track, the circular track requires you to increasingly pull the particles back toward the center (i.e. apply a greater centripetal force).
This means that you increasingly lose energy to what is called 'synchrotron radiation', which sets a maximum energy you can accelerate particles to at a given radius. Hence if you want more energetic particles in your collisions, you need bigger colliders.
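To put numbers on it: the classical formula for energy radiated per revolution is U0 = e² γ⁴ / (3 ε0 r) for β ≈ 1, so at fixed radius the loss grows with the fourth power of energy, and light particles lose (m_p/m_e)⁴ ≈ 10¹³ times more than protons at the same energy. A rough sketch with approximately LHC-like numbers (2804 m is roughly the LHC bending radius):

```python
def loss_per_turn_ev(energy_gev, mass_gev, radius_m):
    # Classical synchrotron radiation: energy lost per revolution,
    # U0 = e^2 * gamma^4 / (3 * epsilon0 * r), valid for beta ~ 1.
    e = 1.602176634e-19      # elementary charge, C
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    gamma = energy_gev / mass_gev
    return e**2 * gamma**4 / (3 * eps0 * radius_m) / e  # in eV

# Protons at ~7 TeV on a ~2.8 km bending radius:
# a few keV lost per turn -- manageable.
print(loss_per_turn_ev(7000, 0.938, 2804))

# Electrons at the same energy: ~1e13 times worse.
print(loss_per_turn_ev(7000, 0.000511, 2804))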
So if you had just a bunch of point-to-point satellites, you'd lose the benefit of having a greater radius (which doesn't actually apply anymore, since you're not doing continuous acceleration anyhow), because you'd have to increase the amount of bending you apply at each satellite to pass the beam to the next one.
Presumably, you want to actually see the results of your collision.
No. Not at all. You can expect particle densities anywhere from 5 to 80 per cm^3 if you stay within 1 astronomical unit of the Sun. Besides, all that plasma is accompanied by a frozen-in magnetic field, plus you have energetic particle events, recurrent fast solar wind emerging from coronal holes, and a plethora of other effects and structures which are terribly interesting to study but which will ruin your high-energy physics experiment even if you somehow acquire god-like powers and build a huge non-contiguous accelerator.
It's possible that forward progress in the post-LHC era will come from better sensing technology, rather than larger accelerators.
The difference is that space is filled with lots of other stuff that makes running experiments less than ideal.
Had to look this up to see what you mean. I think you mean 5 g/cm^3 to 80 g/cm^3.
I'm not sure why the downvotes were necessary, though, when I made it clear I was trying to figure out what he's saying.
To bend the particle beam you need magnetic fields. Particles with more energy need larger fields. Larger radius means less field strength for given energy or larger energy for given field strength.
So a circular accelerator needs strong magnets nearly everywhere to force the beam onto a circular path. If you put it in outer space, it's not like you could just bounce the beam between some small number of spacecraft; that wouldn't gain much, if anything. You would need something very similar to the structure of the LHC.
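The field requirement is easy to estimate: B = p / (0.3 r) for a singly charged particle, with p in GeV/c and r in meters. As a sanity check with roughly LHC-like numbers:

```python
def dipole_field_tesla(momentum_gev, bend_radius_m):
    # B = p / (0.3 * r): field needed to hold a singly charged
    # particle of momentum p (GeV/c) on a circle of radius r (m)
    return momentum_gev / (0.299792458 * bend_radius_m)

# ~7 TeV protons on the LHC's ~2.8 km bending radius:
# close to the actual ~8.33 T of the LHC dipoles
print(dipole_field_tesla(7000, 2804))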
Also maybe the nodes could have some kind of magnetic ram scoop design to cover a larger area and funnel the particles through.
I think it's more like GPS orbits are precisely known, not precisely controlled. If you have fixed ground stations, the GPS sats can accurately figure out where they are (like GPS in reverse). If the GPS sats can transmit their location and time accurately, your receiver can figure out your position accurately. The only thing that needs to be well controlled in this is the ground stations' locations, which are not moving.
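The receiver-side math can be sketched in 2D: given known anchor positions and measured ranges, subtracting one range equation from the rest leaves a linear system for the unknown position. (Real GPS also solves for the receiver clock offset and works in 3D; this toy with made-up coordinates ignores both.)

```python
import math

def trilaterate_2d(anchors, dists):
    # Subtract the first range equation from the others:
    # 2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + xi^2 + yi^2 - x0^2 - y0^2
    (x0, y0), d0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 + yi**2 - x0**2 - y0**2)
    # Exactly three anchors -> a 2x2 system; solve by Cramer's rule
    (a, b), (c, d) = rows
    e, f = rhs
    det = a * d - b * c
    return ((e * d - b * f) / det, (a * f - e * c) / det)

anchors = [(0, 0), (10, 0), (0, 10)]
true_pos = (3, 4)
dists = [math.dist(p, true_pos) for p in anchors]
print(trilaterate_2d(anchors, dists))  # -> (3.0, 4.0)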
One of the research projects to help with this, in effect to have carefully controlled orbits, was a "drag-free" satellite. Basically, just put the satellite in a ball. The ball still has drag, but the satellite inside does not. Of course, as soon as the drag-free satellite moves with respect to the ball, you have to tweak the orbit of the ball to keep the satellite at its center.
Source for this? It was my understanding they weren't, which is why the almanac data is so important, but I'm not positive that assumption is valid, so I'd love a source to back up your claim.
Sounds like a possibly invalid analogy, but in fact there's substantial overlap in the theories. If you want to read more, try the Wikipedia articles on "Ising model", "spontaneous symmetry breaking", and "Higgs mechanism".
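The overlap is concrete: the spontaneous symmetry breaking behind the Higgs mechanism shows up in miniature in the 2D Ising model, where below the critical temperature the spins collectively pick a direction even though the energy function treats up and down symmetrically. A minimal Metropolis sketch (lattice size, temperature, and sweep count chosen arbitrarily):

```python
import math
import random

def ising_magnetization(n=16, temp=1.5, sweeps=200, seed=1):
    # 2D Ising model with periodic boundaries, Metropolis updates.
    # Below the critical temperature (~2.27 in these units) the
    # up/down symmetry breaks spontaneously and |m| stays near 1.
    random.seed(seed)
    spin = [[1] * n for _ in range(n)]  # start in an ordered state
    for _ in range(sweeps * n * n):
        i, j = random.randrange(n), random.randrange(n)
        nb = (spin[(i + 1) % n][j] + spin[(i - 1) % n][j]
              + spin[i][(j + 1) % n] + spin[i][(j - 1) % n])
        dE = 2 * spin[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or random.random() < math.exp(-dE / temp):
            spin[i][j] *= -1
    return sum(map(sum, spin)) / (n * n)

print(ising_magnetization())  # magnetization well away from zero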
Before Faraday's experiments, we understood static electricity (charged objects attract/repel charged objects of the opposite/like sign, proportionally to the product of the charges, and falling off with the square of the distance) and magnetism (like magnetic poles repel, opposite poles attract).
Once we saw that moving electric charges generate magnetic fields and vice versa, we opened the door to realizing that these are both just aspects of the same electromagnetic field.
Do experimental (!) particle physicists have any other way of doing research on this than colliders?
The major issue is that you have nearly no idea when/where these events will occur, and they are really rare to begin with. Add the distances that these things occur at and your error bars go through the roof. Still, over a VERY long time and with a LOT of these detectors spread about the Sol system (and likely a bit further out), you can get the results you'd need.
It's not cheap or easy or timely. But with 12 orders of magnitude to go, it's a lot cheaper, easier, and more timely than trying to build an accelerator with Jupiter in the way :)
 as in, not next to the Sun, Earth, Moon, near gravitational perturbers, etc.
But not sure if that qualifies as "experimental" particle physics.
Apologies if I am way off base
Kind of like how we create magnetic force with electric.
Particle accelerators have been the mainstay and principal tool of particle physics, and have produced amazing results. Further, there are obvious questions in fundamental physics that the Standard Model revealed by these experiments does not answer. But it does not follow that these further questions can be answered via accelerators that we humans can construct. The universe has made no promise to us that its mysteries are accessible, least of all through some particular method. We had a particular reason to believe the LHC would reveal new physics (the Higgs boson prediction), and we have no such reason for the proposed future accelerator; our evaluation should change as a result.
My guess would be that astrophysics is a better route to understanding the mysteries of fundamental physics; astronomers can measure distant phenomena with surprising accuracy already, and objects such as black holes are one of the few cases where exotic phenomena are currently widely expected to occur. Perhaps the money is better spent on a truly gigantic space telescope?
And the JWST is going to give us insights about galaxy formation, exoplanets, and more!
"In the next ten years, the most important discovery in high-energy physics is that `the party's over'."
Frank Yang, 1980
Michelson (of Michelson-Morley fame), apparently, IIRC. I'd warrant that despite his credentials Yang (of Yang-Mills fame) is making a similar mistake.
But I do think we're likely to see a slowing of the pace of change in fundamental physics (I think we're onto a reflective phase, where science consolidates and social change progresses, leading to more focus on higher-order sciences, biology and such).
Hell, invest a few billion in building a true quantum computer or working towards true AI (just to name two areas where there are many more puzzles and obvious directions to explore) and see how much more impact that has and how much faster you can solve those same physics puzzles.
And while I do agree that much more money should be spent on Science, I don't think particle physics is the right area in which to invest. We can make much much more progress in other fields.
”New largest number factored on a quantum device is 56,153”
Still something you can do in your head in less time than reading the paper takes, but more than 15.
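For reference, 56153 = 233 × 241, two primes whose binary representations differ in just two bits, which is exactly the special structure those "record" quantum factorizations exploited. Trivial to check classically:

```python
def smallest_factor(n):
    # Naive trial division; plenty fast at this size
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

p = smallest_factor(56153)
print(p, 56153 // p)  # -> 233 241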
The point of this is that 'something' is very wrong, and nobody really knows what it is. And this something could be something incredibly fundamental. If it turns out dark matter simply does not exist then we'll 'set physics back' (it would be progress, but the effect would be the same) many decades, but also open the door to an explosion of new knowledge and exploration.
Then there are questions like: why is the expansion of the universe accelerating? As you mention, the bandaid here is dark energy, but again A cannot be separated from B. There are even open questions about things such as the Big Bang. Why is the cosmic microwave background radiation nearly homogeneous, even in areas of space that should not be causally connected (light has not had time to go from A to B since the birth of the universe)? One explanation for this is cosmic inflation, which in turn created its own dark matter analog, the 'inflaton', which is even more bizarre than dark matter; but once again it's just a bandaid to try to explain something that goes against everything we'd otherwise expect.
There's an immense amount of room for progress in physics by removing these bandaids, either with testable physical explanations or with better bandaids. And this is just scratching the surface. There's so much we have no clue about. Even basic observations remain inexplicable: fast radio bursts, the rapid dimming of stars such as KIC 8462852, and so on.
I would speculate that there are probably more 'known unknowns' in physics today than ever before. Of course all of the questions seem incredibly difficult to answer, but this is what it always looks like when you're at any given state in science. We only see the answers as obvious or easy once we already know what they are. And then the next questions are the ones that suddenly take the place of seeming incredibly difficult to answer, only for this pattern to repeat to seemingly no end.
 - https://phys.org/news/2016-09-spiral-irregular-galaxies-curr...
This view probably developed because the realist models of quantum mechanics were too difficult for people to accept. It was easier to say that intuitive models are outside the scope of science than to change their set of inconsistent preconceptions.
When you say "intuitive models" do you mean ones that echo somewhat Newtonian/Classical physics?
Perhaps part of the problem is that schools follow a method of teaching physics where they teach what is easy to teach, knowing it's wrong, and then teach something a little more complex, knowing that's wrong too, etc.. Perhaps we need to start off teaching that the world is non-Newtonian, that it has Quantum weirdness and relativistic effects (according to our best models), and allow that to underlie our intuitions.
The situation has certainly improved a lot since the days of Everett, who was basically dismissed as a crack-pot.
My statement was based on an IoP survey result I saw some years ago (I'm a theoretical physics graduate); I'll try and dig it up as my recollection could be wrong.
There have been a few surveys over the years, but mostly of conference participants. The results depend heavily on which conference you attend.
Yes, much research goes into string theory, but its predictions are "not even wrong": they cannot be proven false by experiment.
Disclaimer: I'm a layman on the topic, this is just repeating what I've heard.
Please try to be at least a little bit skeptical when a handful of naysayers claim that an entire field's worth of very smart people are all spending their careers on pointless work. Those naysayers could be right, obviously! (Even as experts we have to recognize the possibility, and to the many laymen looking on from outside it's even harder to judge.)
But if you read Peter Woit's dismissive rants and accept them as gospel truth, just remember that in doing that you're choosing to believe that hundreds (thousands?) of dedicated, passionate, deeply knowledgeable physicists are either painfully deluded or deliberately running some sort of collective scam. You're choosing to believe that either none of us have bothered to read Woit's objections or that we've all failed to accept them even though they're right.
I'm not sure what our motivation for that is supposed to be. I promise you: people who go into fundamental physics aren't in it because it's an easy road to a steady paycheck! We do this because we're driven to understand the building blocks of the universe. If I were to become convinced that my current approach to trying to understand the universe was fundamentally a dead end, I'd be wasting my life if I continued to study it! I'd be throwing away everything I've trained for, everything in science that I care about. And I have no idea what Woit thinks I'm getting in return.
So your call. Either Woit's right and we're all knowingly wasting our lives, or the situation isn't as cut and dried as Woit claims it is.
One, the claim that string theory makes no testable predictions is just plain false. The predictions are just out of reach of current technology.
Two, we should probably be more suspicious of explanations that rely on folk (interpretations of) psychological effects. How strong is the sunk cost fallacy? Is it sufficient to overcome the motivation to shift subfields? At what rate should we expect theorists to transition out of string theory, under the assumption that Woit is right?
Three, string theory has been very useful already. Its development has provided us with novel mathematical tools for studying a plethora of applied quantum phenomena.
Disclaimer: I'm not a String Theorist.
String theorists may not be vocal in countering the objections raised, or they may simply consider getting into a debate a waste of time. But the fact that hundreds of smart people are choosing to spend time on something is _some_ evidence that the project is not completely misguided.
Of course, the positive evidence has to be balanced with the evidence of group thinking and sunk cost fallacy.
Actually, we are now going from a boring phase of particle physics, where the theory was able to predict everything we were able to measure afterwards, to a phase where no theorist has a clue what might happen.
This phase is called "exploration". It consists of many shots in the dark, of which many will probably have zero results but which have to be taken in order to find out what is really going on. Theorists had a good run with prediction; now it's the experimentalists' turn to lead the way with exploration by producing new data.
When the Muon (basically a heavy version of the electron) was first discovered it 'seemed so incongruous and surprising at the time, that Nobel laureate I. I. Rabi famously quipped, "Who ordered that?"'[Wikipedia]
The new CERN collider has got to be built. To risk missing the next great "Who ordered that?" moment would be grossly irresponsible.
There are many, many, MANY papers that fly out of these particle physics experiments. Most of the papers are about how they do not find stuff.
These papers take the form "Search for the pair production of X in proton-proton collisions at √s = 13 TeV", or other more generic papers where they just look for anomalies against Standard Model predictions. Fun fact: nothing found yet! This was the case at the Tevatron, where countless experimentalists pored over the data trying to find the slightest hint of something new. It also is the case today (so far) with the LHC.
To suggest that we're in a new phase where we're going from a "boring" phase to a new one where theory doesn't know kinda misses the point that experimentalists have already been doing this.
Indeed. The question is: why should they stop doing this? Because it is not satisfying enough? Because it won't get you a Nobel Prize?
It is the only method we have to bring light into the dark. It needs to be done even though the work might appear tedious.
While I am a theoretical physicist by training, I have no doubt that those same billions could be much better used by many other fields such as biology, chemistry, computer science, ai, etc...
For my PhD thesis I had to partly rebuild an experiment that was done 35 years ago. (We were not able to improve the systematic errors; the statistics were vastly improved due to modern electronics, though.) All the people who were involved in the old experiment were either retired or dead. The practical problems we encountered were "theoretically trivial" but plenty and time-consuming (e.g. finding the right glue which has the right optical density and which does not dissolve the radiation-hardened wavelength shifters). If we had had just one person to talk to who had done the old experiment, we could have done it in a quarter of the time or better.
If we're talking about that being the best argument in the real world for building accelerators at the moment, let's spend $200 million on a project to archive every conceivable piece of information related to building an accelerator and save 99% of our budget.
But even if you document everything, many essential things will be gone, especially knowledge about things that have been tried but don't work, because failures are usually not published or documented.
It is a little bit like having a document with runes of an ancient language, where you can maybe work out the meaning of the words and sentences. But you will never find out how the language actually sounded.
It's not just about bolting, gluing, and connecting things you can buy to a few things you have to make. It's cutting-edge applied engineering with some of the closest tolerances you'll find anywhere, supported with post-doc math and often with similarly cutting-edge software development.
It gets a lot less attention than theoretical physics, which is a shame, because there wouldn't be any theoretical physics without it.
> I'm all for investing in cutting edge engineering to build something new.
If you had asked oven engineers 70 years ago to build the most efficient oven they could, they probably would have built something 2 or 3 percent better than what was then on the market. But none of those oven engineers would have built a microwave.
Hence, in order to solve the engineering problems of tomorrow, we have to do shot-in-the-dark fundamental research today, because the solution to many problems is not always obvious.
That expertise can be preserved to a large degree. I'm reminded of a story I read about the F-22. Lockheed shut down the production line, but as they did so, they extensively documented every procedure -- to the point of making DVDs of someone performing each production step -- in case they ever needed to restart it again.
I'd rather attempt something like that than fund even more massive make work programs just to keep the engineering knowledge alive.
There was a paper a while back about Google's software engineering practices, and one of the interesting points that I didn't see discussed much was that "most software at Google gets rewritten every few years". The paper includes some justifications, but I think one of the biggest reasons is that it's very hard to institutionalize the sort of knowledge that you get by building something yourself. For the same reasons, I think Lockheed's documentation effort might be sufficient to build an F-22 identical to what they do now, but wouldn't be sufficient to design a new plane inspired by the F-22; and similarly, knowing exactly how to build an LHC wouldn't be sufficient to build the next big accelerator.
https://arxiv.org/ftp/arxiv/papers/1702/1702.01715.pdf (see section 2.11)
I think that's true, but that fact doesn't really lend much support to the idea of building another large accelerator now rather than in a few generations' time. Maybe this generation has the time to build another big accelerator using its existing experience, but why build one at all if there aren't clear goals for what it will accomplish and the experience will be lost regardless? All you'd be doing is spending money to spread out the re-learning process.
Process and operational knowledge is the most valuable form of knowledge we possess. It is the only form of knowledge essentially obtainable only through practice, and we're always only one generation away from losing it.
We can witness the huge cost of rediscovering this knowledge in space exploration right now.
Has there been any speculative fiction exploring what a fiat economic system would look like if, instead of bankers placing bets on various business ventures, we had scientists and engineers describing what we need to rule out (or in) to move the boundaries of human knowledge forward, and the entities with first crack at fiat issuance, eventually flowing outward into the "real" economy, were these science and engineering efforts?
Future Circular Collider Study, Volume 1 - Physics Opportunities, Conceptual Design Report, PREPRINT submitted to Eur. Phys. J. C, 20 December 2018
It's almost 200 pages about the "Physics Opportunities" of the next collider.
Yes, a larger collider is unlikely to find much, but it's cheap enough that we should do it anyway. As bets go, it's a good one.
What's your point anyways? We should only consider the cost and environmental impact of something if it's comparable to the most costly and impactful endeavors out there?
In 2018 the NSF requested $6.653 billion. (More than the stupid wall--don't tell Trump.)
That's some real money, and it should be well-spent. A new collider is probably not well-spent.
I remember ... early 90s ... when the Superconducting Super Collider was being proposed/started in Waxahachie Texas. The original claim to the broader scientific community funded by NSF, DOE, and DOD was that building this wouldn't impact other science funding.
Then my thesis advisor's grant was reduced as part of cost savings to move money around for the SSC.
So ... far from being a good expenditure, real science was cut to make room for a project that ultimately was shut down. Hit me directly, as I couldn't take a research assistant position with my advisor, I had to take a teaching position to provide me income and tuition support.
These were not fun times.
Sabine's article is quite good, and she asked a meaningful set of questions. A new collider is probably not the best use of funds ... though ... honestly ... I'd like to see a helluva lot more money pushed to real science, so we can build the infrastructure (non-retired) scientists need, fund the software they need to develop.
Doubling or trebling NSF budget would help. Similarly for NIH, CDC, and others.
Not that I think we should repeat the funding mistakes of the past (Ph.D. in physics in 90's, think 1000 applicants for each open tenure track position, and 100's of applicants for each national lab position). We should make sure we are doing quality work, and enable researchers to take risks. Current grant process doesn't really allow this.
Still, it is a bit weird to get upset about the Future Circular Collider when the International Linear Collider is the next big high-energy physics project.
Speaking as someone who has been on the receiving end of lots of research money throughout my career: still, $6 billion is a lot of money, and it should not be wasted. I agree with the original article that a new particle collider probably is not what we should do with this money.
 (and yes I agree it's too much--go fight that battle if you like, but you probably won't win a single cent back)
The military's budget for Research, Development, Testing, and Evaluation in 2016 was 69B, or 11.9% of the total budget.
sure there will be research, and some of it will be open, but all the research behind closed doors does not enjoy the scrutiny of normal research, so there is plenty of opportunity to waste money on nonsense "research" ...
Just like I'm sure 20 years ago a market researcher was convinced he was doing "pure" research...
It seems the field has advanced so far that EVEN IF new discoveries emerge, they would be of no practical value.
I'm not saying it's not interesting (I like reading about it FWIW), and I'm not saying it will never ever prove useful. But from resource allocation perspective, tens of billions of dollars required for high energy experiments seem to be much better spent on other areas of physics. That is, until the civilization advances far enough that understanding the depths of particle physics or cosmology becomes relevant.
Particle physicists would disagree. A complete understanding of quantum mechanics, squared with general relativity, is very likely to have practical applications.
Cliche analogy: if you thought the world was flat and ships were falling off the end of the ocean, you might be investing your money in world-edge-detection; and you might say there's no point in studying the edge of the world itself because no practical value can come of it. You wouldn't have any idea that circumnavigation was (relatively) easy once you understood more about the nature of the world.
But anything observable in the world of reasonable energies (up to whatever modern colliders achieve) can be predicted with existing theories.
If we discover something that only happens at astronomically high energies, would it really be that useful?
This is classic "world is flat" / geocentric argument. Breakthroughs are impossible to predict but can't happen if we give up.
> If we discover something that only happens at astronomically high energies, would it really be that useful?
We don't know. World being round turned out to be pretty useful.
We're not even sure if there's anything in those higher energy ranges, there are certainly knowledge gaps, but maybe a bigger collider is not the answer. At least not until we have some other hints and new theoretical ideas.
If we're hitting a wall with respect to increasing energy in a system as much as possible, maybe we should make sure we've picked all of the low-hanging fruit from decreasing energy in a system as much as possible.
A few examples: QFT and GR are not integrated, dark matter, dark energy, the vacuum catastrophe, and so on...
One could make the argument that a larger collider is not the best way to attack these (I have no idea), but that is a different statement.
If the funding for your field is predicated on building a larger collider to discover new particles; then you're in trouble. That is the sense 'Particle Physics may be done' is meant in this context.
He also didn't include quarks, so by the same logic, the "standard model" is inconsistent with those too.
The supersolid dark matter displaced by a galaxy pushes back, causing the stars in the outer arms of the galaxy to orbit the galactic center at the rate in which they do.
Displaced supersolid dark matter is curved spacetime.
1) What exactly do you mean by a supersolid?
2) How would this be described mathematically? What predictions would your theory make?
3) How does this account for some galaxies seeming to have large amounts of dark matter, and others seeming to have less?
4) What predictions does your theory make with regards to the expansion of the universe?
5) What predictions does this theory have with regards to places with extreme gravity, such as neutron stars and black holes?
yeah but at what cost? how many more billions of taxpayer money need to be poured into this only to further push the theoretical constraints of our parameter space?
It's all about opportunity cost.
However the next-gen gravitational wave observatories should be able to start poking at quantum gravity, from what I understand.
I believe little additional monies should be directed to more powerful particle accelerators until compelling evidence of something worth pursuing comes from theoretical physics. Meanwhile there are many problems where the investment will provide a predictable and useful return.
This is what experiments do.
At this time I oppose further pursuit of ever-bigger accelerators which draw interest and money away from other more productive areas of science.
A priceless sense of wonder.
> ...it seems probable that most of the grand underlying principles have been firmly established ... the future truths of physical science are to be looked for in the sixth place of decimals.
--Albert Michelson, 1894
In other words, having already determined in broad strokes how the mechanics of the world work (conservation of mass, conservation of energy, thermodynamics, etc.), it now falls to us to make increasingly precise measurements of physical systems. Assuming that these laws always apply, we then need to figure out why our precise measurements do not match up with the predictions of our science.
He's not talking about the end of science at all, and it's pretty uncharitable to think Michelson would ever say such a thing.
2. Gravitational Waves
You probably mean Narendra Modi Waves :-)
4. Accelerated expansion of the Universe (Nobel Prize in Physics, 2011)
- Kelvin, 1900
In 'Brief Answers to the Big Questions', Prof. Stephen Hawking mentions in a general context that physics was thought to be a closed subject in the early 20th century, and goes on to describe how our picture of the universe unfolded through theoretical physics.
As far as the article is concerned, I didn't quote it to suggest they say it is the end of particle physics. I just didn't agree with:
> But with the Higgs found, the next larger collider has no good motivation
Theoretical physicists were in consensus that to find precise answers about what happened during the Big Bang we would need a high-energy collider the size of our solar system; that didn't deter them from building the LHC.
http://scienceworld.wolfram.com/biography/Kelvin.html
But they also stretch out an elementary particle to gargantuan proportions in order to write out a circuit board on the surface, so I don't know if it's really a bastion of hard science fiction. It's closer to fantasy about sciencey stuff.
Only if they are willing to believe that entanglement enables superluminal communication and that protons can be "unfolded" to macroscopic branes.
Greg Egan, for instance, has several stories that can only be described as quite hard, but are based around entirely different physics, not merely tweaks to ours, or are based around particular models of black holes that can't be proved, etc.
I've dipped back into reading SF after a long time of only reading the classics (Heinlein et al), and the Three-Body Problem, which I read a few days ago, did seem to me to fit into that mould.
But what are some other modern "hard SF" books you've enjoyed?
All great, but I'm more old-school, William Gibson, James P. Hogan, Greg Bear (Eon series), Mary Doria Russell (The Sparrow).
Which is why direct fusion of hydrogen/deuterium will never work at expected temperatures and controllable speed; it can only work in explosions, if at all.
First of all, we can do, and have done, fusion on Earth (from H-bombs, which are quite calculable, to inertial confinement fusion, also quite calculable).
Second, proton capture for heavy nuclei is endothermic (so you cool down), and there is no way "cycling" can produce energy, since that would amount to a perpetual energy source. That's why fusion stops in iron-rich stars (fusion is exothermic until you reach Fe-56). We have A LOT of astro observations to back this up.
The heavy elements that do get produced are generally produced by neutron capture (the r-process), not proton capture. This is an endothermic process and happens during large energy releases such as neutron star mergers and potentially core-collapse supernovae (although more experimental evidence is still needed).
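The "exothermic until Fe-56" point is the standard binding-energy-per-nucleon argument, and it can be sanity-checked with the semi-empirical (Bethe–Weizsäcker) mass formula. A minimal sketch; the coefficients are one common textbook fit in MeV, not authoritative values:

```python
import math

# Semi-empirical mass formula coefficients in MeV (typical textbook fit)
A_V, A_S, A_C, A_A, A_P = 15.8, 18.3, 0.714, 23.2, 12.0

def binding_energy(A, Z):
    """Approximate total nuclear binding energy B(A, Z) in MeV."""
    N = A - Z
    if Z % 2 == 0 and N % 2 == 0:       # even-even: extra binding
        pairing = A_P / math.sqrt(A)
    elif Z % 2 == 1 and N % 2 == 1:     # odd-odd: less binding
        pairing = -A_P / math.sqrt(A)
    else:                               # odd A: no pairing correction
        pairing = 0.0
    return (A_V * A                               # volume term
            - A_S * A ** (2 / 3)                  # surface term
            - A_C * Z * (Z - 1) / A ** (1 / 3)    # Coulomb repulsion
            - A_A * (A - 2 * Z) ** 2 / A          # n/p asymmetry term
            + pairing)

# Binding energy per nucleon rises toward iron, then falls off
for name, A, Z in [("C-12", 12, 6), ("Fe-56", 56, 26), ("U-238", 238, 92)]:
    print(f"{name}: B/A = {binding_energy(A, Z) / A:.2f} MeV")
```

The per-nucleon curve peaking near A ≈ 56–62 is why fusing nuclei lighter than iron releases net energy while building much heavier ones does not.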
It doesn't seem to be working, so it's reasonable to expect it's wrong.
>Second, proton capture for heavy nuclei is endothermic (so you cool down)
No, it isn't. Whoever calculated that iron is where it ends forgot to add the binding energy of the added proton. The binding energy per nucleon decreases, but there is one more nucleon, so it's still a net gain.
> there is no way "cycling" can produce energy since that would be a perpetual energy source.
It wouldn't; you're using up hydrogen.
>We have A LOT of astro observations to back this up.
There are also observations of lead-rich stars.
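The disputed sign can be estimated with the same semi-empirical (Bethe–Weizsäcker) mass formula: the proton separation energy of the capture product comes out positive for a heavy nucleus, i.e. a single capture does release binding energy; a positive Q-value by itself says nothing about whether a net-energy "cycle" is achievable, since the Coulomb barrier and the hydrogen budget still apply. A rough liquid-drop sketch, using Pb-208 + p → Bi-209 purely as an illustrative example:

```python
import math

# Semi-empirical mass formula, common textbook coefficients in MeV
A_V, A_S, A_C, A_A, A_P = 15.8, 18.3, 0.714, 23.2, 12.0

def binding_energy(A, Z):
    """Approximate total nuclear binding energy B(A, Z) in MeV."""
    N = A - Z
    if Z % 2 == 0 and N % 2 == 0:
        pairing = A_P / math.sqrt(A)    # even-even
    elif Z % 2 == 1 and N % 2 == 1:
        pairing = -A_P / math.sqrt(A)   # odd-odd
    else:
        pairing = 0.0                   # odd A
    return (A_V * A - A_S * A ** (2 / 3)
            - A_C * Z * (Z - 1) / A ** (1 / 3)
            - A_A * (A - 2 * Z) ** 2 / A + pairing)

# Proton separation energy of Bi-209: binding gained if Pb-208 captures a proton
s_p = binding_energy(209, 83) - binding_energy(208, 82)
print(f"S_p(Bi-209) ~ {s_p:.1f} MeV")  # positive in this model
```

In this crude model the total binding energy grows even though the per-nucleon value drops slightly, which is the arithmetic point being argued above; it does not address reaction rates.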