Then the supercollider was canceled, and later we called on CERN in Geneva to brief them on storage systems software and hardware, like hierarchical mass storage and linear tape robots. CERN had a nice mockup of the proposed LHC, where you climbed down a ladder into a fake tube like the one bored underground. While we were down there, our hosts shared a shocking factoid with us. The US government had canceled the proposed $12 billion supercollider project, and exiting all the contracts and filling in the holes that had already been dug was costing $650 million. CERN told us that for $650 million, they could build the LHC. I didn't verify their numbers, but their capital efficiency was stunning.
CERN is funded by all the European countries with a steady budget, and they are allowed to spend it however they wish. When CERN needs to build something big, they put some part of their steady funding into the bank, and it just sits there for as long as it takes until they need it. By having steady, stable funding, they can make much more efficient use of their funds. As far as I can tell, there is little or no political heat about CERN's budget like "what have you done for us lately?"
The US, on the other hand, funds large projects through special legislation in Congress, and everyone has to get their constituents a piece of the pie. This creates gross inefficiencies for large science projects, where hype becomes the motive force.
You are exactly right on this. I was a physics grad student (not particles) when the Supercollider was being pitched. We had a colloquium on it, and the presenter kept repeating the slogan "5 years, 5 billion dollars." Most of us were savvy enough to know that such a claim was pure bullshit. But hey, if you convinced them to start it, they would never cancel it, right? And if you needed more money, they would have to give it to you, right?
The marginal cost of one additional shuttle launch isn't the same as the per-launch average across the whole program; the first launch was incredibly expensive (presumably in the tens of billions), and subsequent launches were less expensive. NASA claims $450M per launch at the end of the program.
I got to see part of their warehousing up near Lansing, MI. Imagine a building the size of a jumbo jet hangar, packed to the rafters with NOTHING but boxes on boxes of J Cups. Now imagine another. I'm pretty sure the Lansing facility had at least two, and that still wasn't enough for the millions of cups they move each year.
We use a lot of cups.
The U.S.'s greatest military victory--the last time we can cleanly call ourselves heroes--was World War II. The Nazis were an awful regime who did horrific things that no one can defend. And Japan directly attacked us. We had good reasons for fighting and we (with our allies) conclusively won.
And there is broad public sentiment that we won because of physics. High-end theoretical physics gave us futuristic tools like radio, radar, and of course the nuclear bomb. Lower-end physics gave us the tools for engineering the incredible machines we fought with, like airplanes, bombs, tanks, and ships.
So, in the minds of U.S. citizens, and more importantly in the halls of U.S. government, discussions of high-end physics come with an implicit promise of military applications. Maybe we could figure out anti-gravity, people think, or ray guns, or teleportation, or force fields--if we only understood the particles and fields a bit better.
Physicists do not promise any of this, of course. But at least IMO it is a real phenomenon. I went to see Interstellar with someone who had worked in and with Congress for a long time. After we left, she said "do you think it's true that once we understand gravity, we'll be able to manage gravity and create antigravity?" I had to explain that just understanding a phenomenon does not grant magical powers over it.
But that has been the experience of the U.S. government! They gave money so physicists could better understand particles, and in return the scientists gave the government seemingly magical powers, like seeing in the dark (radar) and city-destroying explosions (nuclear fission and fusion).
So what happens when physics stops delivering military leaps forward? Or when the physics is superseded by another discipline that delivers military applications?
It looked like chemistry and biology might do that, but then the world managed to collectively decide that those should be illegal tools of war. But it seems totally obvious that the current top priority for military application is information technology.
So, I think the author is correct that particle physics faces a looming crisis, at least in funding and public confidence.
Particle physics winter, just like AI in the '80s. (It's a weird coincidence/irony that this is happening just as AI is ascendant and delivering significant military applications.)
Where do you think they are going to?
Is that an American thing?
> And there is broad public sentiment that we won because of physics. High-end theoretical physics gave us futuristic tools like radio, radar, and of course the nuclear bomb. Lower-end physics gave us the tools for engineering the incredible machines we fought with, like airplanes, bombs, tanks, and ships.
> So, in minds of U.S. citizens, and more importantly in the halls of U.S. government, discussions of high-end physics come with an implicit promise of military applications. Maybe we could figure out anti-gravity, people think, or ray guns, or teleportation, or force fields--if we only understood the particles and fields a bit better.
What is your opinion on the counter-thesis that the reason rather was the Sputnik crisis?
There are many citations in the article about the US/UK strategy.
The US oil embargo on the Japanese homeland had put them on the back foot before the war even started, and it is often cited as a reason for the undelivered Japanese declaration of war prior to Pearl Harbor. Once we started island-hopping and pushing them off the Asian mainland, they could only delay defeat and try to make progress as painful as possible for America, to discourage an invasion of Japan. Part of Germany's impetus for attacking on the Eastern Front was the need for more fuel; the Allies had more of it in the homelands of the USA and the Soviet Union than was available to Germany.
The somewhat disputed Yamamoto quote says it best (the movie version is probably what sticks in people's minds):
The Soviets suffered more casualties than anybody to stop the Germans on the Eastern Front, and tying the Germans up there helped reduce the troops available for their Middle East push and for beach and support defenses in France and Italy. The Soviets were able to build much of their own military hardware even after the German invasion, but they still got our help, mainly for transporting materiel. We helped almost all the Allies, but most support went to the UK and the Soviet Union.
Khrushchev quotes Stalin here:
"He stated bluntly that if the United States had not helped us, we would not have won the war. If we had had to fight Nazi Germany one on one, we could not have stood up against Germany's pressure, and we would have lost the war."
But I'm wondering if there isn't a more menacing threat looming for these advanced sciences. We could actually be hitting a hard wall in our understanding of nature. We are using instruments created for us, at our own scale, to probe worlds that might be much too small (or large) for us to make any serious sense of, at least for the next millennium. Maybe it's time we accept that we can't understand everything, only a little more at a time, converging maybe (probably) only in infinity.
Meanwhile, there are a vast number of insights pertinent to the human scale waiting to be found.
It just means "the dimensionless numbers involved in physics are modest and simple". From Wikipedia:
> In physics, naturalness is the property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.
And the reason physicists tend to expect their theories to be natural is that, in the history of physics, most newly uncovered things have been natural, to the point where significant amounts of knowledge have been derived from the expectation of naturalness. That is, when something could be natural or not, and we couldn't properly test it yet, we just assumed it was natural and went on to research other things. And then, possibly decades later, when it finally became possible to test the theory, the test just confirmed that yes, it was natural.
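To make the "of order 1" idea concrete, here's a toy check (a sketch with made-up ratios and an arbitrary cutoff, not real Standard Model constants):

```python
import math

def is_natural(ratio, tolerance=3.0):
    """A dimensionless ratio counts as 'natural' here if its order of
    magnitude is within `tolerance` decades of 1, i.e. roughly between
    10**-3 and 10**3. The cutoff is arbitrary, chosen for illustration."""
    return abs(math.log10(abs(ratio))) <= tolerance

# Made-up ratios for illustration only (not physical constants):
print(is_natural(2.34))      # order 1: looks natural
print(is_natural(2.34e-17))  # wildly small: suggests fine-tuning or a missing mechanism
```

The point of the comment above is exactly this kind of heuristic: a ratio like 2.34 raises no eyebrows, while one like 2.34e-17 makes physicists suspect an undiscovered mechanism is responsible.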
There is good reason to believe that biochem weapons research would only have hurt Western interests. Countries with nukes don't need weapons like this to wage war.
This isn't to say that modern philosophers aren't susceptible to alluring desert landscapes like "naturalness", but at least philosophers are trained to think about and critique these kinds of things. Physics needs to be capable of having this debate itself and recognising assumptions with wobbly metaphysical underpinnings.
In particular, there's a long-accumulating need to revise the current dogma on the philosophy of science and its operationalization: Popperian falsificationism, peer review, publication-oriented research, the freaking null hypothesis...
Most of our current science happened before Popper and falsificationism, so there is nothing "essentially true" about the current M.O. of science. And heck, was it falsificationism that brought us vaccines, nuclear energy, and computers? There's a replication crisis going on in many scientific fields, and particle physics doesn't fit the mold of falsificationism at all.
I'm not coming forward with a solution in an HN comment, but I think we need to stop equating science, the civilizational project, with this specific philosophy of science and social system for organizing science.
Operationally? You have to pick a null hypothesis. You have to carve reality in two, specify two possible universes, and ask an experiment to tell you which universe you're in.
Darwinism (the one in Darwin's works) isn't like this. Adherence to falsificationism would have nipped that one in the bud.
Stuff like this is the way forward: https://en.wikipedia.org/wiki/Confirmation_holism
This is antithetical to science. If you're promising a breakthrough discovery, you're approaching the experiment with bias.
The fact that more new particles have not emerged at energy levels the LHC can produce is a discovery--if I'm understanding the blog post correctly, it's the beginnings of a disproof of naturalness in supersymmetry. It's not as exciting as if they had discovered hundreds of new things to study, but it's equally important.
And that's exactly why I agree with the author: science is about finding what's true, not about finding what's exciting. As a taxpayer, I think one of the most valuable things particle physics could do here is to educate people on that bias and lead by example. I get that they fear losing their funding to do science, but if you let that fear push you into pursuing exciting results over the truth, then you're not doing science anyway.
 I'm not a particle physicist--my post is about the social problem that physicists are facing, not about the physics.
I think all experiments begin with some premonition of what to expect. For example, Michelson and Morley very much expected to measure the speed at which the Earth passes through the aether ... except they couldn't. And the results pushed physics toward the theory of relativity.
I disagree that it's an entirely bad process to bankroll experiments based on unproven promises. This is exactly how the Manhattan Project happened. The physicists promised that it was very likely they could create a very large explosion, but they did not know if it would bang or fizzle. So the US government began a huge industrial-scale operation to enrich uranium and assemble the bomb. The first atomic bomb explosion was very much empirical science bankrolled by "unsound" promises.
In this sense, going beyond the LHC would be kinda groundbreaking: big-budget science with absolutely no clue what to expect. It's how discoveries are made, yes, but I'm not sure any large-scale scientific project has been funded without at least some clue of what to expect.
> I disagree that it's an entirely bad process to bankroll experiments based on unproven promises. This is exactly how the Manhattan Project happened. The physicists promised that it was very likely they could create a very large explosion, but they did not know if it would bang or fizzle. So the US government began a huge industrial-scale operation to enrich uranium and assemble the bomb. The first atomic bomb explosion was very much empirical science bankrolled by "unsound" promises.
What you're describing is the "hypothesis" step in the scientific process. Properly done, a hypothesis isn't a promise--it's simply a statement of the possibility you're testing, without any commitment to the possibility being the reality or not.
A good hypothesis results in the same experiment as its negative: "There are more supersymmetric particles at higher energies" is the same hypothesis as "There are not more supersymmetric particles at higher energies" because you test both hypotheses in the same way. Contrast this with a promise: you can't promise something and its opposite.
> In this sense, going beyond the LHC would be kinda groundbreaking: big-budget science with absolutely no clue what to expect. It's how discoveries are made, yes, but I'm not sure any large-scale scientific project has been funded without at least some clue of what to expect.
I don't think we have absolutely no clue what to expect--the article goes into some of the possibilities.
Actually, no, they are not the same. You are excluding the middle, as it is necessary for an experiment to disprove the null hypothesis before any conclusion can be made. Just because you don't prove your hypothesis doesn't mean you prove its negation. Typically, an experiment will find no result at all.
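To make the "no result at all" case concrete, here's a toy version of the significance calculation behind a claimed bump (invented event counts, stdlib only; real analyses also handle systematics, the look-elsewhere effect, and much more):

```python
import math
from statistics import NormalDist

def excess_p_value(observed, background):
    """P(N >= observed) for N ~ Poisson(background): the chance that the
    background alone fluctuates up to at least the observed count."""
    # Sum the Poisson pmf from 0 to observed-1 in log space for stability.
    cdf = sum(
        math.exp(k * math.log(background) - background - math.lgamma(k + 1))
        for k in range(observed)
    )
    return 1.0 - cdf

def z_score(p):
    """Convert a one-sided p-value into the equivalent Gaussian sigmas."""
    return NormalDist().inv_cdf(1.0 - p)

# Invented numbers: we expect 100 background events in some mass window.
print(excess_p_value(105, 100.0))  # mild excess: large p, no conclusion either way
print(excess_p_value(160, 100.0))  # huge excess: tiny p, a discovery-grade bump
```

The middle ground the parent describes is the first case: the data neither disprove the null hypothesis nor establish its negation, so the experiment simply returns "no result."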
The word "promise" doesn't just mean that something will definitely happen, it has a secondary meaning of something being promising, having "the quality of potential excellence".
It's about articulating where this experiment slots into the context, articulating why it's interesting to look at this thing, and not the fifteen other things that won't be funded if your thing does.
Promise and Promising do not have the same meaning.
promise = statement that something will happen
promising = showing signs of future success
And no, promise can mean exactly what I said it can mean. https://en.oxforddictionaries.com/definition/promise
From the article:
> To justify substantial investments, I am told, an experiment needs a clear goal and at least a promise of breakthrough discoveries
That sentence is meaningless if "promise" means "A declaration or assurance that one will do something or that a particular thing will happen" -- the "at least" is totally redundant in that interpretation. If it means "the quality of potential excellence", then "at least" makes perfect sense.
Again, use the word promise here with its other meaning and it makes perfect sense. Both framed questions have the promise of revealing something big. They are not a guarantee of a big result but there is the possibility of a big result.
Once there is money on the line, the concept of "without any commitment" goes out the window. You are committing money to testing that hypothesis and there is an opportunity cost for other more promising hypotheses you could instead test with that same money.
Saying, "I think the particle collider X will demonstrate the existence of the Higgs boson" (or whatever) is a simple hypothesis.
Saying, "I think you should give me $9 billion to build particle collider X that will demonstrate the existence of the Higgs boson" is a much different statement that requires more sophisticated analysis before smart action can be taken.
That's not to say that there isn't also a place for funding specific research that shows promise for solving specific problems or that would provide specific benefits - the Manhattan Project is certainly an example of this.
> At the end of 2018, the LHC will have recorded a mere 3% of the intended research program.

That means there is 30x more data to come. I think you'd need to see the results of all of the data before you say that the LHC was a bust. It may be. But your claim is hasty.
The difference is that they predicted a large explosion then; now they predict a bump on a graph representing an event (actually events) that nobody would otherwise ever notice happened...
War often forces you to take risks that would be imprudent under normal circumstances.
You're mixing two uses of the word promise here.
One is like a guarantee. I promise to deliver a breakthrough.
Another is an expectation. There is the promise of a breakthrough.
The statement here means simply that there could be a breakthrough discovery and that the chances are at least relatively high.
Promising a breakthrough means I must either be lucky or force my results towards something that sounds good. That is bias.
Having an experiment where there is promise of a breakthrough simply means my experiment could deliver something huge.
I could fling Fabergé eggs at a wall and it'd be expensive but exceptionally unlikely to reveal anything big. Testing the warmth of fires lit with Rembrandts would be similarly unenlightening but expensive. Firing particles at each other at energies we've never tested before with newly designed detectors has a chance of a breakthrough (however you choose to define a breakthrough). Picking the latter over the former because it can give a breakthrough does not mean the experiments done with it are biased.
The point is though, these criticisms (that LHC might find nothing) had been making the rounds since before the LHC was built, while many promoters claimed we would for sure find evidence of supersymmetry. So while LHC may not have been a mistake, the right response now would arguably be to reassess fundamental theories in light of the new evidence accumulated at the cost of billions of dollars - not to go back and tweak the same old theories to suggest that many more billions and years need to be spent to make really really sure we were wrong.
If late-1800s physicists had spent decades building ever more accurate devices for trying to prove the existence of the aether that held together the universe, perhaps some useful engineering or data analysis work would have come out of it, but it could also have been a way for the field to go on an extremely expensive wild goose chase and stall out actual theoretical breakthroughs.
How many mathematicians can you let loose on long-standing physical problems (qualitative dynamics of the large-N body problem, the freaking turbulent motion of fluids, etc.) -- at some level of "big bet" that frees them from staccato publication pressure -- with the money spent trying to find gluinos or some such ill-developed theoretical construct?
It's hard to imagine the output of that wouldn't be amazing.
After that, how do you separate the promising mathematicians from the lazy and the crackpots?
This is the same problem that a "Manhattan Project" to cure cancer or what have you always runs into: It's easy to see where to get value from the first dollar, but the 30 billionth dollar likely costs more than a dollar just to figure out how to productively spend it!
"Fortunately", experimental particle physics doesn't have this problem, since you can always use that next dollar to build a bigger collider.
I say, start a program with the Erdős-2 people and, as it develops, let them hire Erdős-3 folks.
So we should probably allow the smaller Erdős numbers to be inherited through primogeniture, to make sure we still have an identifiable class of good mathematicians to give money to.
It's not like the choice about what to spend money on is between "true" and "exciting". The choice is between "true and exciting" and "true and not exciting". We have to use something to choose what to invest in, so why not choose based on how exciting the potential discoveries are?
I think of it like central planning vs free markets. It can sound good to plan things out and direct things towards the outcomes you want, but it's less efficient.
Or a massive case of "hmm, that's odd". Like Fleming noticing that bacteria were not growing near certain molds.
This is true, but as a great philosopher said, you can't always get what you want. If you search for the truth, a lot of it won't be "cool". Some of it will be cool of course: but if you prioritize coolness over truth, it might prevent you from discovering anything at all. A cool lie isn't a discovery.
> If you look for truth, you may find comfort in the end; if you look for comfort you will not get either comfort or truth, only soft soap and wishful thinking to begin with, and, in the end, despair.
-- C. S. Lewis
> A man may imagine things that are false, but he can only understand things that are true, for if the things be false, the apprehension of them is not understanding.
-- Isaac Newton
That's not true. In fact it's probably the opposite of true!
You do experiments specifically because you have some a priori reason to think that this experiment will tell you something interesting. In fact, one of the major ways scientists are trying to deal with the replication crisis is pre-registering their methodologies and expectations of experiments.
Which isn't to say it's not a good idea to do fundamental research, but it's absolutely valid to try and consider where funding should go based on what we expect to get from experiments.
Of course, I think the current process is pretty bad, since it relies so much on theatrics, as the OP mentioned. But I agree with OP here, at least in what should happen - physicists shouldn't hype or over-promise what an experiment can deliver. I just wish we lived in a world which valued these kinds of fundamental results enough to agree to support them financially!
There's a big difference between saying, "This experiment will tell me something interesting" and "This is the interesting thing that this experiment will tell me". The former is what you're describing, the latter is what I'm objecting to.
Hence the inclusion of "at least" in the source quote, which wouldn't make sense alongside the other definition. See the dictionary links that have been posted multiple times.
The issue of the article is a resource allocation problem, and there is something unethical about bending scientific prognostications (they cannot be dignified with the label 'hypotheses') to that end.
"Pursuing exciting results over the truth" doesn't come into it - no-one is accused of falsifying anything here (though it has happened elsewhere.) At worst, the truth will be delayed, though that might be the outcome of pouring resources into a bigger machine, rather than of not doing so.
What did we get out of it? The transistor, the laser, cellular technology, solar cells, and tons of other things that nobody would have bothered to research -- without the simple curiosity of our scientists.
But yeah, experiments, especially nowadays, are meant to prove theories, not to push breakthroughs. In fact, given that the Higgs boson was already theorized, science could already use it. I'm not sure what the LHC added to that besides proving it exists.
Theory verification is a very important part of physics. For example, lots of people who work in string theory write down lots of crazy theories of how physics looks beyond the standard model. Perhaps they are all wrong, but we currently have nothing better. So we would really love to have some experimentally viable way to check these theories. Thanks to the LHC we could do this at least for the Higgs boson. Before this experiment, the Higgs boson was also "just a crazy theory that was able to resolve a hole in the standard model".
On the contrary, not knowing what to expect is exactly why you should spend money on it. If you know what to expect, there's no reason to spend billions of dollars testing what you already know.
You want to find out the distance to the Moon. You build a 100-meter-high tower, but you still cannot reach the Moon. So is building a 200-meter-high tower now a good idea? Maybe if you build your tower a little higher, you could finally reach the Moon.
Or maybe you should go back to the drawing board and re-think your whole approach.
No, but you have to have some sort of hypothesis to justify the experiment. You don't throw effort and money at the wall either, you make a guess about what you'll find, and use that to drive the decision as to what to investigate.
And new particles, at this point, don't qualify. The LHC was probably "worth it" for the Higgs result alone, but absent a new target (like the Higgs) that we really think will be there, no one sane would build another bigger collider at these budgets.
Whether to build something expensive isn't a scientific question, it's a political one.
To take an example to the extreme, if it were just about science, science might decide to convert the entire mass of the Earth into a particle collider and kill us all in the process.
Well, if you're building something expensive with the intent to perform science, one would hope that the political answer to this would be informed by science.
> To take an example to the extreme, if it were just about science, science might decide to convert the entire mass of the Earth into a particle collider and kill us all in the process.
I really don't see this. As a non-specialist, I assumed that the LHC was pottering along making useful if non-spectacular discoveries. The fact that naturalness is in doubt due to its data sounds exactly like the work it should be doing. Blame physics for not having a clutch of new particles ready for discovery, not physicists.
“If you can't measure it, you can't improve it” (Lord Kelvin)
So imagine we have some physics setup where certain laws should hold. There is a magical formula called the Lagrangian, L = stuff. When you construct this L from scratch, you add everything you have: some e for the electron field function, some phi for the Higgs, some m for its mass, etc. At first this sounds like an insanely long and random equation, but because of all the constraints in your setup it's 'only' one page long in the Standard Model case. Oh yes, and then there are some stupid terms which must be really small so that they only _softly_ violate the constraints, e.g. CP symmetry for strong interactions; violation of this symmetry hasn't been observed in nature (the "strong CP problem"). That's where fine-tuning has to happen at the moment...
However terms don't really cancel each other out. (I'm sure you could construct a scenario where that happens but that doesn't generalize.) It's part of the construction manual if you will to have only independent constants and terms.
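For reference, the standard textbook illustration of the fine-tuning at stake is the Higgs hierarchy problem (this is the usual presentation, not something specific to the comment above):

```latex
% Quantum corrections push the Higgs mass toward the cutoff scale \Lambda:
m_{H,\text{phys}}^2 \;=\; m_{H,\text{bare}}^2 \;+\; \delta m^2,
\qquad \delta m^2 \sim \frac{\lambda}{16\pi^2}\,\Lambda^2 .
% With \Lambda \sim M_\text{Planck} \approx 10^{19}\,\text{GeV} and
% m_{H,\text{phys}} \approx 125\,\text{GeV}, the bare mass and the
% correction must cancel to roughly one part in 10^{32}:
% the opposite of "natural".
```

Supersymmetry was attractive precisely because superpartner loops would cancel these corrections automatically, with no tuning required; that is the expectation the LHC results have been squeezing.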
That's why they're postulating that maybe, if we had a slightly larger collider, we could find sexy supersymmetry. The author thinks that is disingenuous, because the argument that a slightly larger collider would find supersymmetry is speculation, not based on real science.
Anyway, even if there were a solid argument for supersymmetry being just around the corner, I don't think a larger collider would be funded. The LHC offered many, many sexy things, so the stars aligned and it got funded. But would the stars align again for such a huge amount of funding, just for supersymmetry? As a layman, I feel the idea of supersymmetry is not captivating enough, not in the way the Higgs boson was.
Anyway, there's so much applied physics research just waiting to be done right now, maybe it's time for theoretical physics to chew on it for a bit.
I'm glad to learn that I was mistaken! As a layman myself, this is really all I want from the LHC: for it to continue to be a useful piece of equipment for scientific experiments. From the article, it sounded like the attitude was, "we didn't find anything sexy, so we're done here," which would be a huge waste.
They're still running experiments on the Relativistic Heavy Ion Collider at Brookhaven National Lab in New York, for example, even though it started operation 8 years before the LHC and runs at a fraction of the energy.
What we really need is China to build a massive new supercollider as a national prestige project, to show their parity with the West. That might even spur some competitive spirit and get the West back into the game.
Feels like saying SpaceX has failed because it hasn't put people on Mars yet.
Another takeaway was the introduction (for me) to the concept of "naturalness", with which the author has some issues. It is however not possible to do away with it (if I'm not mistaken about its meaning), except in cases where the assumptions of naturalness turn out to be wrong, as they seem to have been in this case.
It seems to me that some concept of "naturalness" is what we use to interpret empirical facts, without which we could not make sense of them at all. Examples of what I would consider "naturalness": that the past precedes the present, that large things contain smaller things (perhaps in infinity), and that small things are contained in larger things (perhaps in infinity), etc.
Granted, our sense of naturalness could be completely wrong, and empirical data constantly challenge what we consider natural, which is how it should be.
"In physics, naturalness is the property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory"
I have no idea what that means atm
What the naturalness property seems to be saying is that these dimensionless parameters (which you have to stick into the equations to make the math work out) should all be around the same order of magnitude and not require too much precision. If there is such a parameter that is super huge or super small (in relation to the others) or requires a lot of precision, it's an indication that the theory is incomplete. There should be an observable reason for that difference/precision.
If I've gotten that right, then I can see why people would be sceptical of naturalness. If naturalness were correct, I would actually be curious why it was correct.
It just means that theorists are suspicious of theories where the parameters have to be adjusted to high precision to match reality. That's reasonable grounds for suspicion, but it's not natural law.
Worth reading just for this quote.
Everyone "knew" that the speed of light was constant, everyone "knew" earth had to be only couple of thousands of years old (e.g. Lord Kelvin was a firm believer in only a thousands of years based on his physically based estimates), atoms where quite a hard bargain to sell as anything as computational tools until you got some computations and Jean Perrin to do some experiments, https://en.wikipedia.org/wiki/Jean_Baptiste_Perrin, continental plates where supposed to be solid fixations, until they weren't, etc.
It's nice of Kuhn to point this out. Maybe it's convenient for administrators or something to realize that accepted facts tend to change, but it gives absolutely no clue on how exactly to move science forward.
I don't really understand the huge uproar about Kuhn; his ideas should be "obvious" to anyone familiar with the history of science. But maybe, as professional management and pathological "professionalism" made their headway in his time, it was nice to point out that Gantt-chartable progress was not all there was to it.
That is the reason why the address in your browser's URL line starts with "http(s):" followed by "www." It's not an exaggeration to say that the work of Tim Berners-Lee at CERN led directly to the creation of trillions of dollars of economic value.
The World Wide Web was eight years old when work on the LHC began. (I didn't need the history lesson, btw. I remember very well when that protocol was introduced.) Further, while I'm not trying to take away from Berners-Lee's invention of a new protocol for it, CERN did not invent the Internet. Lots of us were sending email and downloading files and chatting on IRC before http was created.
And it probably is, in fact, an exaggeration to say that http created trillions of dollars of economic value. The stuff that gets sold on Amazon and eBay and through Google ads mostly existed before the invention of http, and some of it may even have existed before the internet. Had http not been created, it is not hard to imagine a world where people still got their advertising through tv and magazines. And isn't that the main funding source for internet companies, and the main economic value that has been created -- advertising, I mean? I don't think it amounts to trillions of dollars yet.
Without a time machine, it's impossible to answer the question. As the old cliche goes, if they knew what they were doing, it wouldn't be called "research."
The World Wide Web was eight years old when work on the LHC began
As with any large-scale project, the planning and design processes began long before the first shovel hit the earth.
Lots of us were sending email and downloading files and chatting on IRC before http was created. ... And it probably is, in fact, an exaggeration to say that http created trillions of dollars of economic value.
Sure. Because Facebook and eBay and Google and Amazon and Wikipedia could have been built on IRC and Gopher.
CERN did not invent the Internet.
No one said they did. You might as well argue that CERN didn't invent fiber optics or copper wire. The Internet is not the WWW... but you knew that.
And isn't that the main funding source for internet companies, and the main economic value that has been created -- advertising, I mean?
The movement of information from one mind to another is not a zero-sum game. Reducing the economic value of the WWW to its present value as an advertising vehicle is misguided, if not downright fallacious, but you probably knew that as well.
We won't go into the irony of using machines built with ICs fabricated on nanometer-scale processes to argue about the potential future value of fundamental physics research.
I'm not saying CERN wasn't useful. It was. And the budget wasn't that big. But this kind of logic is not very sound. If we have a lot of money and a series of concrete problems, we should spend the money to solve them.
Most of the money should be spent directly on the problem, less on developing and building new tools to tackle the problem, and even less on discovering new mechanisms that might or might not allow us to improve our tools in the future.
There's a lot of uncertainty in the future, and it's best not to bet a lot of money on it.
This is not an option; if we had any project candidates like this, they would have already been funded. Everything that is not already done by the private sector exists as a big step into the unknown. (The private sector is very good at allocating resources to projects that give immediate results, but it will never do anything that can't.)
Because that's not how science works
Isn't it possible that current theories in particle physics are simply inaccurate models of the world? They're just hypothetical low-level explanations of observed high-level effects, and could have been empirically confirmed by the large colliders, which doesn't seem to have happened.
So maybe we don't need new experiments, but new models. A negative result is a result too.
On a related note, the assumption in quantum physics that particles have a probability distribution rather than an exact location has always bugged me. Why can't there be low-level mechanisms going on that are too quick/small to be measured (today)?
>On a related note, the assumption in quantum physics that particles have a probability distribution rather than an exact location has always bugged me. Why can't there be low-level mechanisms going on that are too quick/small to be measured (today)?
It bugged a lot of other people too (Einstein's famous "God does not play dice" comes from the same corner). Experiments so far are only consistent with a probability distribution, unless you permit signals to go back in time. Of course, how you interpret the model is an entirely different problem: do things only happen once observed, are we in a multiverse where all possibilities happen in some universe, are we simulated with those effects caused by optimisations (both not calculating things until needed, and inaccuracies akin to floating-point errors)? The possibilities are endless.
One is the "Many Worlds" interpretation (AKA the Everett interpretation). In this interpretation, QM wave functions never collapse, and the only reason we perceive there to be probability distributions is that the lack of collapse results in many different apparent worlds. In reality there is just one very big, complicated world, but the different parts of it stop affecting each other via a property called "decoherence".
Another deterministic interpretation is the Bohm interpretation, in which the particles are pushed around by a "pilot wave", which is the same wave function that never collapses in the Many Worlds interpretation. Since the pilot wave never collapses in the Bohm interpretation either, one might wonder why you don't also end up with many worlds here, but it is taken that the reality we perceive is always determined by the particles being pushed around by the pilot wave.
One point of interest is that the Many Worlds interpretation and the Bohm interpretation are experimentally indistinguishable from each other.
Regarding the postulation of a multiverse, this is almost certainly true, if you ask me. It would seem to be the only way to explain the apparent "fine tuning" of the universe. Unless you believe that it was tuned by God, that is. But if that's the case, I have a few nits to pick with some of the choices that he or she made.
There are (non local, contextual) “hidden variables” theories that can “complete” QM and restore determinism. But there is nothing to detect, the predictions are identical to “standard” QM (at least in equilibrium).
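The Bell-test point raised elsewhere in this thread can be made concrete with a few lines of arithmetic. This sketch (my own illustration, not from any comment) compares the quantum prediction for spin-singlet correlations against the CHSH bound that any local hidden-variable theory must obey:

```python
import math

# Quantum-mechanical correlation for spin measurements at analyzer
# angles a and b on a singlet pair: E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices (a, a_p for Alice; b, b_p for Bob).
a, a_p = 0.0, math.pi / 2
b, b_p = math.pi / 4, -math.pi / 4

# CHSH combination; local hidden-variable theories satisfy |S| <= 2.
S = E(a, b) + E(a, b_p) + E(a_p, b) - E(a_p, b_p)

print(abs(S))  # 2*sqrt(2) ~= 2.828, the quantum (Tsirelson) maximum
```

The quantum value 2√2 exceeds the classical limit of 2, which is why the "loophole-free" experiments mentioned below rule out local realism rather than merely failing to find hidden variables.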
Interesting point. Money drives research.
It took over 400 years for gunpowder to go from discovery to use in projectiles, while it took only 40 years from the mass-energy equation to the atomic bomb.
They are about the only major country increasing government R&D funds.
The only exception would be a breakthrough technology 100x more cost-effective, i.e. one that let you build a 10x-power LHC for a tenth of the cost. I see press releases of breakthroughs using lasers or EMF, but it's very unclear whether they'd scale up to a petavolt.
Probably could build a pretty powerful accelerator using the world's electricity devoted to bitcoin mining :-)
This is the main point.
She totally nailed it.
If we could promise a breakthrough discovery, we wouldn't need to build the machine.
The author seems almost heartbroken at the absence of life-altering findings. No new discoveries means we at least know what we're doing a little bit, right?
That's not my takeaway. I think they were talking about how scientists shouldn't lie to get grants that probably won't achieve much.
This is what would happen if you set your null model to be something that was false regardless of whether these particles exist. The way it works is: more belief -> more effort put towards detection. It takes a certain amount of time and funding to cross the "discovery" threshold, so this will only happen if there is enough prior belief.
Plenty of obscure and unintended findings have come long after experiments concluded. If history is anything to go by, LHC data will be useful far into the future.
I don't think politics is influenced by this at all. And as for building something like an even bigger LHC: come on, every physicist would love something like that anyway.
Other physicists have said similar; while still others, like Feynman, have said what sounds like the opposite.
Two recent planned breakthroughs: Einstein's theory of general relativity predicted the existence of gravitational waves, and LIGO detected them in 2016. Bell predicted that entangled quantum particles would exhibit fundamentally nonlocal properties, and in 2015 the first "loophole-free" test demonstrating violation of local realism occurred.
These experiments were done with expectations of a result. That is not to say that they had foregone conclusions, just that there was some phenomenon that the scientists hoped to see, and confirmation one way or the other would be of interest to the community. Most experiments are like this -- of course scientists should keep their eyes open for unexpected discoveries, but in general pursuing expected results is more fruitful.