
LHC physicists embrace brute-force approach to particle hunt - lainon
https://www.nature.com/articles/d41586-018-05972-7
======
yk
So the people behind the last-hope collider are getting desperate?

To explain a bit: the well-known trouble with today's physical theories is that, on the one hand, we know general relativity and quantum field theory do not fit well together, while on the other hand we don't have a good idea of the way forward. So the hope was that the LHC would show something interesting, like supersymmetry, or at least deviations from the Standard Model. So far it shows precisely what was expected beforehand, so the theoretical problems are precisely the same as before.

~~~
andrepd
Not really. We have several ideas for a way forward; it's just that no experiment
so far has produced evidence for them. On the other hand, a null result is
still a result. Of course it would be cooler to witness supersymmetric
partners, but the fact that we searched and didn't find them at those energies
is a result in itself.

~~~
vectorEQ
null result is still a result. that's cute. but looking at the budget of the LHC i
would hope for a bit more actual result, instead of using backward logic to
promote failure as success.

ah, i have this wonderful theory... i didn't prove it yet, but that doesn't
disprove it... >.> what happened to solving real-world problems, with
practical solutions, and doing science along the way? we used to discover many
interesting things about nature and the universe in just that way; then this
attitude changed, and the 'null result is also a result' cult started to eat
our brains and pollute our thought.

~~~
empath75
To demonstrate the importance of null results:

[https://en.wikipedia.org/wiki/Michelson%E2%80%93Morley_exper...](https://en.wikipedia.org/wiki/Michelson%E2%80%93Morley_experiment)

> i didn't prove it yet. but that doesn't disprove it

A null result actually disproves a theory.

~~~
amelius
That's not a null result, but a negative result. A null result is a result
that tells you nothing new (by definition). At least, according to the
parlance in this thread.

------
hinkley
Not a physicist, so maybe I'm misunderstanding...

How is smashing protons into each other at relativistic speeds billions of
times _not_ a 'brute force approach'?

~~~
trentlott
It is physically, but not scientifically.

Before, they outlined what they were looking for; now they're just sifting.

Most science doesn't allow you to propose "We will expose A to X and see what
happens", but rather expects a hypothesis: "We expect Z following X>A because
of [rationale]."

~~~
jerf
> Most science doesn't allow you to propose "We will expose A to X and see what happens", but rather expects a hypothesis: "We expect Z following X>A because of [rationale]."

Which I consider one of the major flaws of current scientific practice.
There's _absolutely nothing wrong_ with trawling over data looking for
interesting things; you just have to be more statistically careful. Huge
amounts of science have been done by trawling over data or by rapid-fire
throwing theories at the wall and seeing what sticks. Having to always call
your shots means you can only slightly push the frontier back. There's a place
for that, which is probably "the vast majority of experiments", but we need
the more exploratory stuff too.
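The "statistically careful" part is the crux: every extra place you look is an extra chance for a fluke. A toy simulation (all numbers invented for illustration) of why trawling needs a trials-factor correction:

```python
# Toy illustration: run many independent "searches" on pure noise and
# count how often at least one clears a naive significance threshold.
import random

random.seed(0)

TRIALS = 1000   # simulated analyses
TESTS = 100     # hypotheses examined per analysis
ALPHA = 0.01    # per-test false-alarm rate

flukes = 0
for _ in range(TRIALS):
    # Each test on pure noise "fires" with probability ALPHA.
    if any(random.random() < ALPHA for _ in range(TESTS)):
        flukes += 1

# With 100 looks at noise, roughly 1 - 0.99**100 ~ 63% of analyses
# report at least one spurious "discovery".
print(f"analyses with a fluke: {flukes}/{TRIALS}")
```

Shrinking the per-test threshold (e.g. a Bonferroni-style alpha / TESTS), or demanding confirmation on independent data, is how the family-wise error rate is kept under control.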

But somehow we got entrenched in a terrible oversimplification of science as
"The Way Science Is Done", which hyper-specifies the ways we are allowed to
modify our confidence in various theories. There are more valid choices than
are currently accepted, and we're missing out.

------
Blackbeard_
[https://arxiv.org/abs/1710.07663](https://arxiv.org/abs/1710.07663)

Those interested in a (somewhat) accessible view of where particle physics
stands, in the big picture, could read this article from the head of our
theory group at CERN.

------
noobermin
Correction, the group leaders have decided that their unwilling graduate
students should embrace the brute-force approach.

~~~
iainmerrick
It’s the machine. The machine has to be configured. Yes, there are group
leaders who have the final say on the configuration, and no, they aren’t grad
students.

~~~
Blackbeard_
The article is just discussing a type of data analysis. The machine runs the
same way regardless.

------
abvr
But a brute-force approach is what has always been used at CERN for detecting
and analyzing the particles that may arise from the various accelerator
collisions. It would, however, be wiser to apply current hardware capabilities
to crunch the vast amounts of data through statistical analysis or algorithms,
sifting through it without much human intervention and obtaining results with
fewer false positives, since those are one of the drawbacks of the brute-force
approach. Further research in these areas could improve the accuracy of such
models, which could then be fed back into the chain so that only the relevant
data is retained for manual inspection.

------
edw
The article never outlines the alluded-to downsides (p-hacking, spurious
correlations), which makes me wonder if its sourcing leaned too heavily on
advocates of this theory-free approach. (It was the subject of a Wired cover
story a few years ago, so it must be true, amirite?)

Reading the article makes me wonder what Woit (author of the Not Even Wrong
book and blog) thinks about this, and how it dovetails with the
epistemological morass the string-theory people have gotten themselves into.

------
supernovae
I wish we had finished the SSC :(

~~~
pasbesoin
[https://en.wikipedia.org/wiki/Superconducting_Super_Collider](https://en.wikipedia.org/wiki/Superconducting_Super_Collider)

A somewhat early harbinger of the U.S.'s decline in public funding of
fundamental science -- and all the practical discoveries, engineering, and
knowledge that derive from same.

From random news stories I encounter, it appears that "they" keep trying to
shut down Fermilab. However, despite being "older" and "smaller", Fermilab
continues to make important contributions; among other things, it contributed
significantly to the Higgs boson work.

Not mentioned in the Wikipedia article was the debate over whether the SSC
should even have been located in Texas; apparently, there were significant
political factors in the decision. IIRC, there was a counter-argument that it
should be located as close as possible to Fermilab, for synergy and efficiency
in continuing its work.

There are the cited "costs". However, these projects seem invariably to
produce a lot more practical value than they consume: cryogenics,
superconducting technologies, data processing and communication, and so on.

As I've said before, if you appreciate your medical MRI...

Not to mention NMR contributions to chemistry and biology...

Etc. It all hangs together.

And killing fundamental research is kind of like killing the goose that lays
those golden eggs.

~~~
bonesss
> apparently, there were significant political factors in the decision

As I recall the history, this project is a near-perfect example of how
pork-oriented spending is not congruent with R&D projects. Everything from
site selection to subcontracting was divvied up to appeal to the relevant
congressmen. These unnatural constraints caused delays, logistical issues, and
massive cost overruns, and the ability to solve those issues was hamstrung by
the project's organization. Thanks to the dependencies, they were a long way
in before this all blew up and the project had to be stopped.

Building something like the SSC or LHC is fundamentally _hard_. Making someone
do something that is _hard_ while twirling a baton for political pleasure,
much less getting critical components from two suppliers who can't integrate
until construction, almost guarantees failure.

The correct approach, ironically, is perfectly embodied by the post-WW2
machinations of the US government: make the goals clear, dump appropriate
money into engineering and technology organizations, let the nerds figure out
the devilish details, and enjoy your A-bomb blowing up on time and on schedule.

~~~
smueller1234
Just to add to that, maybe the most egregious example of this "division of
work by politics" is actually ITER?

------
brianberns
The article never clearly describes the brute-force approach. Is it just a
matter of looking for statistical outliers along all the different dimensions
they can think of? How would they then distinguish the "interesting" outliers
from random flukes?

~~~
Blackbeard_
The gist is that you roughly look for deviations from theory in one (probably
small) dataset, identify any regions of interest, and then run a proper
analysis of those regions on a different dataset to see if the effect is real.
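A rough sketch of that two-step procedure, with every number (bin counts, thresholds, the injected excess) invented for illustration; a real analysis would model backgrounds and systematics far more carefully:

```python
# Hypothetical two-dataset "bump hunt": scan one dataset for deviations,
# then confirm candidate regions on an independent dataset.
import math
import random

random.seed(42)

N_BINS = 100
EXPECTED = 1000.0              # background prediction per bin (made up)
SIGNAL_BIN, SIGNAL = 37, 300   # injected excess, for illustration

def simulate():
    """One dataset: counts fluctuating around the background prediction."""
    counts = []
    for i in range(N_BINS):
        mean = EXPECTED + (SIGNAL if i == SIGNAL_BIN else 0.0)
        # Gaussian approximation to Poisson fluctuations
        counts.append(random.gauss(mean, math.sqrt(mean)))
    return counts

def z_score(observed):
    """Deviation from the predicted background, in standard deviations."""
    return (observed - EXPECTED) / math.sqrt(EXPECTED)

# Step 1: trawl the exploration dataset for any bins deviating from theory.
exploration = simulate()
candidates = [i for i, c in enumerate(exploration) if abs(z_score(c)) > 3.0]

# Step 2: re-test only those regions on independent data, so the
# many-bins trials factor (look-elsewhere effect) no longer applies.
confirmation = simulate()
confirmed = [i for i in candidates if abs(z_score(confirmation[i])) > 3.0]

print("candidate bins:", candidates)
print("confirmed bins:", confirmed)
```

Because the confirmation step tests only a handful of pre-selected regions, a fluke that survives both datasets is far less likely than one found by scanning alone.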

~~~
smueller1234
Caveat: it's been a while since I studied this stuff.

The problem I see with this approach is that it's much more likely to pick up
lingering detector issues than the regular test-a-theory approach (the
difference between looking for a specific thing in a specific place vs.
picking up anything unexpected). I wouldn't worry so much about purely
statistical artifacts, because those can often be worked out with
prescriptions and re-measuring. The systematic but not-understood biases are
the ones that would plague this approach, since in the extreme they would need
an independent experiment. I wonder whether CMS and ATLAS are sufficient for
that.

~~~
ThaJay
Testing for detector issues and those sorts of things seems like very
important groundwork for future experiments.

------
est
so the LHC is basically in 大力出奇迹 ("brute force works miracles") mode?

------
vectorEQ
time to stop learning to break things into even smaller pieces and do
something useful with the trillions of investments.

~~~
russdill
The CERN cost is just a few billion. The only things we take apart with
budgets measured in the trillions are Middle Eastern countries.

