
Ask HN: Is physics moving forward? - Murkin
Looking back over the last 20 years, technology (esp. related to computers) has made extraordinary leaps forward, and the pace is accelerating.

As a total layman in the physics world, it appears (looking from the benches) that things are crawling at the same speed they did 20 years ago.

Is that true? Is the speed of physics advances accelerating as well? What has been happening that we might have been missing?

EDIT: Especially in the applied & practical worlds
======
ISL
As someone who's been in physics for much of the last twenty years, I'd guess
that things are going at about the same rate. I don't think that the rate is
"crawling", though.

If you look at log plots of parametrized experimental progress, progress
remains linear, so it's a Moore's-law-like improvement on many fronts.

The emergence of precision cosmology has really transformed astrophysics in
the last twenty years. The solar neutrino problem is now solved entirely (and
even \theta_{13} has been measured!). The lynchpin of the Standard Model (the
Higgs) has been found. LIGO is likely to make a first detection in the next
couple of years. Graphene and topological insulators have the solid-state
community buzzing. Fluorescence microscopy and nanopore techniques are making
waves in the biophysics community. And more, of course. Heck, this week, the
most compelling evidence yet for the long-sought pentaquark appeared.

For the probes of the dark sector and of gravity, though we haven't found
anything, huge swaths of parameter space (i.e. possible theories) have been
ruled out. EDM experiments are relentless in their search for new physics.
Someday, someone will find a reliable anomalous signal, but we can't predict
when.

And, if you're looking for fun hints of new physics, check out "muon g-2". The
next 5-10 years will be exciting there, too, to see if the existing
discrepancy between measurement and the Standard Model will survive closer
experimental scrutiny.

We go slow because we will go far.

~~~
sqrt17
What are the assumptions behind "log plots of parametrized experimental
progress"? What do these mean in the real world?

To give an example of why "pouring money into it and getting slower and slower
progress" could translate into linear progress on a log plot, imagine you have
to pour in x² amount of (money, people, time) to get to level x of progress.

In that case "we go slow" means, "we'll gobble up larger and larger amounts of
money, but every time people get impatient we'll have some bone to throw at
them".
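
The quadratic-cost toy model above can be sketched in a few lines (a hypothetical illustration of the commenter's x² model; the doubling schedule and all numbers are made up, not data from any real field):

```python
import math

# Toy model: reaching progress level x costs x**2 units of
# (money, people, time); cumulative investment doubles each "year".
for year in range(8):
    invested = 2.0 ** year          # exponentially growing spend
    progress = math.sqrt(invested)  # the level x with x**2 == invested
    # log(progress) = year * ln(2) / 2, i.e. linear in time
    print(year, round(math.log(progress), 3))
```

So an exponentially growing budget turns a quadratic cost curve into a straight line on a log plot, which is exactly the ambiguity being pointed at: linear log-progress by itself doesn't tell you whether each step is getting cheaper or more expensive.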

~~~
rn222
Add to that the fact that there is no global dependency tree of "what we need
to discover to make this practical technology possible" - how can we be sure
that existing/future research funds are focused on the areas needed to make
the most valuable technologies available?

~~~
CountSessine
Is that why we're doing physics research? Practical technology?

~~~
marcosdumay
Isn't it?

Understanding things is certainly great, but you can't even claim you
understand something new if you cannot use that new understanding to achieve
something you couldn't before.

Or, more clearly: pure theory is good for mathematicians, but scientists
cannot even claim a theory is new if it has no application.

~~~
CountSessine
_Isn't it?_

It's certainly not what _I'm_ expecting. Even if it hadn't gone on to be
central to our engineering, I would have liked to have known about general
relativity or QCD.

_Understanding things is certainly great, but you can't even claim you
understand something new if you cannot use that new understanding to achieve
something you couldn't before it._

What do you mean by _understand something new_? This week we started to
_understand something new_ - to understand many new things - about Pluto. Are
you trying to claim that this isn't true unless we can _use_ it somehow? What
is it you're trying to say?

_Or, more clearly, pure theory is good for mathematicians, but scientists
cannot even claim a theory is new if it has no application._

Is astronomy not a science? Or palaeontology? Or archaeology? Those are all
observation-oriented sciences that don't conduct experiments but do gather
information and develop knowledge.

I have no idea what it is you're trying to say about _claim a theory is new_
- do you want to rephrase that? What would you say about the recent discovery
of pentaquarks? Pentaquarks are useless, but our knowledge of them is quite
_new_.

------
stared
As a PhD in physics: progress in pure physics is slowing down (or at least
progress per scientist is).

The delay between a discovery in physics and a Nobel prize is getting bigger
and bigger [1]. It's true for all fields, but the effect is particularly
strong for physics.

'''It is safe to say that late 1920s and early 1930s were the “Golden Age” of
20th century physics, when the progress was lightning-fast and new discoveries
lay like low-hanging fruits. In the 1940s Dirac commented bitterly, in view of
problems quantum field theory was having at the time: “Then, a second-rate
physicist could do first-rate work – now, it takes a first-rate physicist to
do second-rate work”. Every physicist would love to live in such “interesting
times”, when a new unexplored scientific territory opens up.''' [2]

So, it started slowing down quite some time ago. Also, as there are more and
more people involved, the brainpower gets diluted. As of now there is no
chance to get a photo of a concentration of luminaries like the ones from the
famous Solvay conferences [3].

So, if you (like me) read the Feynman Lectures on Physics and know physics
from the 1920s-1950s, you are likely to be disappointed by the current pace.

...and just compare it to recent progress in machine learning, where we can
play on our computers with things that a few years ago were thought to be out
of our reach.

(But it shouldn't be surprising; technologies and sciences have their growth
periods, and those periods are usually finite.)

[1] [http://priceonomics.com/why-nobel-winning-scientists-are-get...](http://priceonomics.com/why-nobel-winning-scientists-are-getting-older/)

[2] [https://woodtickquarterly.wordpress.com/2011/11/17/graham-fa...](https://woodtickquarterly.wordpress.com/2011/11/17/graham-farmelo-the-strangest-man-the-hidden-life-of-paul-dirac-quantum-genius/) (BTW: I recommend this book a lot)

[3]
[https://en.wikipedia.org/wiki/Solvay_Conference](https://en.wikipedia.org/wiki/Solvay_Conference)

~~~
pwnna
Do you think there's a limit on how much we can understand in physics/any
field?

My reasoning on this is that since everyone has a finite lifespan,
understanding/work will be lost as researchers retire, move to a different
field, or die, unless the information is transmitted to someone else during
their lifetime. However, as we understand more, we have to teach people more.
This brings in two problems that I see:

\- It takes longer for people to learn up to the frontier of science, which
means they might be in their 30s or 40s, and that number may keep getting
larger.

\- We might be able to teach these things in the same amount of time, but the
younger generation's understanding of these topics might be shallower
overall, which makes it difficult for them to make certain connections, as
they might not have a deep enough understanding to see a connection.

My worry is that eventually we will reach a stage where the amount of
knowledge we have is so great that no one can feasibly master a field. We can
subdivide the fields, but then no one will know enough general knowledge to
make insightful connections that simplify our models, or something to that
effect.

As an example: my fluid dynamics professor said that the fluid mechanics I
course we take in second-year undergrad is what people worked on as their
masters/PhD theses 100 years ago, whereas now we don't even derive the
formulas anymore, simply because we do not have the time to do so.

~~~
mdup
The point of your comment is disproved by your last paragraph! The more
science is discovered, the more it gets compacted into its essentials. If
100-year-old PhD-level fluid dynamics can be taught to undergrads, that means
it got digested into the most useful part of the work.

Similarly, it's considered basic undergrad work to understand Fourier
decomposition; but the notation and clarity we draw that insight from are the
result of two centuries of refinement since Fourier's 1807 paper. It was
certainly PhD-level at the time.
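
As a concrete illustration of that compaction (a toy sketch added here, not from the thread; the signal and frequency bins are made up): picking the dominant frequencies out of a signal, once a research-level exercise, is now a few lines of a naive discrete Fourier transform.

```python
import cmath
import math

# Build a test signal with components at frequency bins 5 and 12.
N = 64
signal = [math.sin(2 * math.pi * 5 * n / N)
          + 0.5 * math.sin(2 * math.pi * 12 * n / N)
          for n in range(N)]

def dft_magnitude(x, k):
    """Magnitude of the k-th naive DFT coefficient of x."""
    return abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / len(x))
                   for n in range(len(x))))

mags = [dft_magnitude(signal, k) for k in range(N // 2)]
# The two strongest bins recover the two frequencies we put in.
peaks = sorted(sorted(range(N // 2), key=lambda k: mags[k])[-2:])
print(peaks)  # [5, 12]
```

The point is not the algorithm (a real user would call an FFT library) but that two centuries of digestion have turned the idea into something an undergrad can apply without re-deriving it.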

Certainly science is compacted and that's how it gets transmitted to younger
generations. And remember that our brains get bigger, so maybe we'll never hit
any "limit" to understanding? :)

~~~
pwnna
My worry is that as we split knowledge up, no one will understand the big
picture, as everyone is focused on the specialty of their specialty.

Since we teach people less in-depth material so they can get far into science
quickly, it seems like building knowledge on an unstable foundation. The lack
of branching out may also inhibit the discovery of certain connections
between fields that may prove to be enlightening.

------
sampo
If you have time, Lee Smolin's book The Trouble with Physics is a captivating
read.

[https://en.wikipedia.org/wiki/The_Trouble_with_Physics](https://en.wikipedia.org/wiki/The_Trouble_with_Physics)

Smolin makes a sociological-historical argument that from the 20s to the 70s,
development in theoretical particle physics was done by trusting
"mathematical intuition": if the math was beautiful and predicted the
existence of some particles, the experimental physicists usually found those
particles a bit later. And the rewards and Nobel prizes went to those who did
the math the fastest.

So the whole of theoretical physics adopted this style of work: when someone
proposes something new that looks interesting, everyone tries to do the math
as quickly as possible, to be the first to get the results.

But then from the 70s to the 00s, this flocking attitude was applied to
string theory, and people just developed string math furiously, and it went
unnoticed that the theory was totally unconnected to any experiments.

So, Smolin suggests, for 30 years the best of theoretical physics went in a
direction that may be totally separated from experiment. If this turns out to
be true, theoretical physics pretty much lost 30 years.

------
jammycakes
It seems to me that ground-breaking discoveries in physics are generally
getting more expensive.

Up till about the early 20th century, ground breaking physics was largely "two
guys in a garage" territory -- single individuals such as Newton or Cavendish
tinkering in their own private laboratories using fairly modest equipment.
Teenagers replicate their experiments in school physics lessons with equipment
costing no more than a few hundred pounds today.

Throughout the mid-twentieth century, ground-breaking discoveries were
increasingly made by teams of researchers, which seem to have grown larger
over time, using equipment that has become increasingly large and expensive,
sponsored by universities, companies, and governments.

Nowadays it seems that most ground-breaking discoveries are made by large,
national or multinational teams working with equipment costing billions of
dollars and processing petabytes of data. I couldn't see two guys in a garage
producing their own space telescope or particle accelerator any time soon.

~~~
johncolanduoni
Most physics isn't done with space telescopes or particle accelerators, and
very few groups require billions of dollars of equipment (considering the NSF
budget is less than 10 billion dollars, there wouldn't be much to go around).
Theoretical physics in experimentally relevant fields is about as cheap as
it's ever been; you really only need professors, a building, whiteboards, and
travel funds.

And believe me, there are a lot of experimental physicists doing excellent
work with "two guys in a garage" levels of equipment.

------
richmarr
It's hard to read your question without thinking about how String Theory has
dominated physics for that entire timeframe (and more)... and how ineffective
that family of models has been at advancing humanity's understanding of the
broader world.

If you believe Lee Smolin, then advocates of String Theory have also had an
oppressive effect on opposing ideas, making it hard to get tenure if you're
not working on it, effectively choking off competing theories (e.g. Doubly
Special Relativity, Loop Quantum Gravity, etc.).

I have no data or first-hand evidence, as I was only a mere physics
undergrad, but I found his arguments persuasive. String Theory (and its
oscillating pals) has certainly been dominant in the scientific press, and
doesn't seem to have come up with much. Could be a false negative though.
We'll only know when someone opens the next big door.

------
abdullahkhalids
I am surprised no one has mentioned quantum information yet. Quantum
information is the subfield of physics whose goal is to understand which new
information-processing tasks are possible or efficient in the quantum world
that are impossible or inefficient classically. Two such tasks are quantum
computing and quantum key distribution.

This field was birthed in the early 80s and has seen steady progress since
then. Various architectures for quantum computing have been proposed, and
experimental control over each of them has steadily increased over the last
few years. While we are not there yet, the sequence of results shows that we
are rapidly approaching the fault-tolerance thresholds, after which it will
be possible to build quantum computers.

Quantum key distribution is a simpler task and has already been achieved
commercially. Now we are trying to increase the rates of transfer. We are
also slowly relaxing the experimental requirements. For instance, there are
protocols where you don't have to trust that your devices were not tampered
with by an adversary.

These are exciting times in quantum information. While important theoretical
results were found in the 80s and 90s, the experimental momentum today far
outpaces the theoretical.

~~~
stared
Quantum information is indeed a new field. Yet I wouldn't compare it with the
20s or 70s. In the last 20 years there has hardly been any paradigm shift in
the field (don't be fooled by poor science reporting/popularizing or by
scientists overadvertising their results in search of fame and grants).

And for example, when I was attending conferences in quantum information (my
PhD field), people were constantly lowering their expectations and making
predictions more humble; more and more, "10 years ago we said it would be in
10 years; now we can say the same".

I don't claim that there is no progress, just that it is depressingly slow
compared to the frontier fields of science and engineering. (If it were fast,
believe me, I would have stayed in physics.) Of course, the future may be
different, but who knows...

Compare it with e.g. DNA sequencing, where costs went down by quite a few
orders of magnitude in the last decade. Or image recognition, where in the
last decade things thought to be extremely hard (because of problems in the
70s), like face recognition and image recognition, have become standard
techniques.

~~~
abdullahkhalids
I myself am getting a PhD in quantum information. I completely agree that the
timeline for quantum computing was extremely optimistic 20 years ago.
However, I think current estimates are much closer to the truth. We now have
a much better understanding of the systems on which we are building QCs.
Estimates of fault-tolerance thresholds have become much better. And we have
20 years of experimental progress from which to extrapolate when we expect to
hit these thresholds.

I agree that progress in the 20s and 70s was much faster. But they were
grabbing low-hanging fruit. Building even a 'bad' quantum computer is a task
several orders of magnitude more difficult than building, say, a 486
processor. The degree of experimental control required is much, much greater.
What I see looking back is steady progress towards greater experimental
control in multiple systems, and very frequent and steady achievement of
milestones.

What you forget when you compare quantum computing with DNA sequencing or
image recognition is that quantum computing sits at the bottom of the current
theoretical paradigm. When you do DNA sequencing, the science of your
instruments (e.g. centrifuges) is not suspect. In QC everything is suspect:
your system, your detection mechanisms, your control systems. If you
normalize every field by its fundamental difficulty, you will find that
quantum information is keeping up with other fields.

~~~
stared
I would argue that this difficulty is no more fundamental than the difficulty
Edison faced in producing light bulbs.

The thing is, I do not want to normalize a field by its difficulty - then
progress would just be proportional to the effective number of smart people
working on it.

I was raised in a scientific culture (Central/Eastern Europe) where
difficulty, not impact, was the virtue. But that way produces a --lot-- small
number (because they are difficult!) of difficult results, with little
impact.

Likewise, most progress happens where there is low-hanging fruit.

------
PaulHoule
It is the best of times and the worst of times.

In fundamental physics, the LHC is online, neutrino physics is hot, and lots
is going on. There are now a huge number of quantum theories of black holes,
but no way to prove anything about them in sight.

The dark matter problem is a huge "anomaly" left to solve so there are still
mountains to climb.

In terms of practical stuff there is lots of physics in how you build a 7nm
microchip. Physicists collaborate a lot with "nanotechnology" people and
biologists. For instance my thesis advisor worked with experimentalists who
were stretching DNA with tweezers and figured out how the AIDS virus self-
assembles.

Even the "dead" area of chaos theory is looking much better now that people
at NASA have made a map of the Earth-Moon phase space, which can give a km/s
or so of free propulsion.

~~~
JohnBooty

       Even the "dead" area of chaos theory is looking much
       better now that people at NASA have made a map of the
       Earth-Moon phase space, which can give a km/s or so of
       free propulsion.
    

This is fascinating - how can I learn more?

------
dannypgh
This all depends on your frame of reference.

------
vezzy-fnord
_Looking back over the last 20 years, technology (esp. related to computers)
has made extraordinary leaps forward and the pace is accelerating._

Largely hardware advances that are heavily interrelated with physics. Even
then, these have been incremental (though still impressive) production
advances rather than radical architectural redesigns. Intel only got bounds
checking in hardware (MPX extensions) a couple of years ago, even though this
was first done over half a century ago.

Software hasn't.

~~~
ghaff
Yeah, as per some of the discussion of another recent post, one of the big
things that has happened over the past 20-30 years in computing is that we
discovered a technology for use in digital logic circuits that has been
amenable to incredible process shrinks. Certainly there's been extraordinary
engineering that's gone into that shrinkage coupled with no small amount of
semiconductor physics. But, for whatever other advances have been made in
computing since 1980 or so, an awful lot comes back to CMOS.

------
spott
This depends on your definition of "advances in physics".

"New physics", physics that we don't already have an explanation for, is
becoming increasingly rare. Our definition of "new physics" is also expanding
to include things that aren't fundamentally new, just small holes in our
understanding of the equations.

The flip side of this is that while we already know the broad strokes, there
is still a lot of work to be done in filling in the details. This work is just
less glamorous and doesn't make the headlines.

~~~
sampo
> _physics that we don't already have an explanation for, is becoming
> increasingly rare_

We don't understand why galaxies rotate the way they do, as the visible mass
does not correspond to the rotation according to known laws of gravity. The
solution is to postulate dark matter and dark energy, until things match
again.

------
cozzyd
(Young) experimental particle physicist here. While our progress in particle
physics has slowed down since the 70s, we've still learned quite a bit in the
last 20 years. For example, we've observed the top quark, the tau neutrino,
and the Higgs boson. We have also learned a lot about neutrinos, like the
fact that they have mass and oscillate. There has also been amazing progress
in detectors, although that's mostly behind the scenes.

Other fields of physics have had much more interesting discoveries though
(e.g. graphene).

------
98Windows
If you look at quantum computing [1], which is very much a part of physics, I
would claim that progress is accelerating there. It's a field that is asking
a lot of deep questions about reality and also spurring a lot of
technological innovation.

[1]
[https://en.wikipedia.org/wiki/Timeline_of_quantum_computing](https://en.wikipedia.org/wiki/Timeline_of_quantum_computing)

------
irremediable
Well, to be fair, lots of computer technology advances were associated with
advances in applied physics. Die sizes, MEMS, battery technology, etc.

Maybe you mean more theoretical physics? I don't really have enough knowledge
to be useful there.

------
FiatLuxDave
As a physicist for the last 20 years, I suppose I am partly to blame... ;)

I would say that it depends on what you call physics. The areas of physics
that were symbolic of advancement in the previous 20 years will not
necessarily be the same areas where advances will be in the next 20 years.
During the 1800s, advances in physics in the first half of the century led to
technological advances in steam power and electricity in the second half. If
you thought of physics as meaning electromagnetism and thermodynamics, you
might think that there were few advances in physics in the first half of the
20th century. And there were people who felt that way! But I think that
nowadays we would think that Einstein's heyday was an era of major physics
advances. So, maybe you shouldn't look at those areas that are reaching the
plateau of their sigmoid curve, but newer areas.

Classic particle physics, exemplified by the likes of Chadwick and Lawrence
way back in the 1930s, leading to the explosion of the particle zoo in the
1950s, and then the diminishing returns of the LHC era, would be a good
example of sigmoid-curve development in physics. If physics only means this
stuff to you, then yes, it is going slower than it was.

Areas of physics that have been closer to the high slope region of their
development in the last 20 years:

Quantum Information Theory (as noted by @abdullahkhalids)

Dark Matter/Energy/whatevertheheckitis

Medical Physics (from lead-block linacs and x-rays to IMRT/VMAT & modern
imaging)

Materials Science/'Condensed Matter' physics

Black Hole science, esp. thermodynamics

Gravity waves, or the lack thereof (interesting either way)

techniques for signal/data processing and analysis (such as superresolution or
single detector imaging)

I'm sure there's more that I'm not aware of. Anything that is really very new
is too small to get much press right now. Pretty much by definition, the new
small stuff won't have the big press budget of CERN or NASA.

------
batbomb
Physics has scaled out. There are a bunch of people working on a bunch of
projects with, for the most part, narrowly defined goals. It's really hard to
understand the implications of many of these projects when they succeed, let
alone report on them.

Government spending in the sciences has been extremely erratic over the last
5 years. The NSF is a little more stable, but the NSF doesn't fund much of
anything over $100M, which is relatively small for an experiment when spread
out over 4-5 years.

------
artimaeis
I'm not sure there's been any acceleration of discovery in the hard sciences.

The last 30-40 years have seen tech advancements roughly in line with Moore's
law, which is fantastic. But that's the result of engineering advancements.
There's no precedent I'm aware of for expecting the same results out of the
hard sciences.

~~~
3JPLW
I'd disagree. New tools for biologists in the last 20 years have strongly
accelerated the field.

The trouble in Physics is that the new tools look like the LHC.

------
johncolanduoni
String theory often gets ragged on for not having any direct practical
applications, which is certainly true. However, some of the mathematics
developed by string theory is key in theoretical work in fields where theory
and experiment have a much tighter bond. For example, topological quantum
field theory[1] has found widespread and important applications in quantum
information, and it was pioneered by none other than Witten himself. This
isn't just on the theory side; experimentalists are looking at topological
states of matter for a variety of applications, including quantum computing.

[1]:
[https://en.wikipedia.org/wiki/Topological_quantum_field_theo...](https://en.wikipedia.org/wiki/Topological_quantum_field_theory)

------
nathan_f77
I really like these diagrams, as an answer to your question:
[http://matt.might.net/articles/phd-school-in-pictures/](http://matt.might.net/articles/phd-school-in-pictures/)

You can't predict breakthroughs. A scientist might spend their entire career
following their research to a dead end. That's not a failure, in my opinion.
It just shows how much we already know, and I think we should honour those
scientists just as much as the lucky ones.

~~~
hn9780470248775
The trouble with that diagram is that it implies that everything after that
initial boundary traversal is a completely new and original contribution to
human knowledge. Sadly this is not the case.

------
paulpauper
While progress may seem slower, we have a lot more people working on these
unsolved problems. Also, there is the fusion of abstract math and physics,
which creates thousands of 'physicists' out of unwitting mathematicians. Due
to the physical limitations of experiments, most forthcoming progress in
physics will be purely abstract. But just because we can't test some of these
theories doesn't mean we should disregard them, provided there are as few
logical inconsistencies as possible.

------
euske
Has software technology advanced much in the last 20 years? It has become a
lot bigger, yes, and now we have GC or type inference if you're lucky. But
I'd still be as afraid of writing a mission-critical piece of software as I
would have been in the 80s. Also, things like Heartbleed happened. In terms
of quality/correctness, the progress of the software industry has been
disappointing to me.

~~~
jeff_marshall
I've got to say I'm rather amused by the notion that type inference and GC are
innovations of the last 20 years.

Regarding now vs. the 80s, we have much better tools for making assurance
cases for critical software. So much progress has been made in both
specification correctness and implementation correctness that I think the only
way you could compare the two and say there hasn't been improvement is if you
haven't tried to see what we can actually do now vs. then...

------
japhyr
I did a BS in physics in the early 1990s, but I haven't kept up with current
research.

Can someone recommend some good reading to catch up on what's been happening
since then? I'd love a decent book about QM or cosmology that's not quite a
textbook, but also takes more than a simplified approach aimed at people with
no background in the field.

------
mjfl
A good way to look at this objectively is to go to scholar.google.com, type in
"physics" in the search bar, and limit your results to the past 10-20 years.
You will have to filter through the books and survey articles (get a couple
pages in), but you should get a picture of the more important articles in the
past 20 years.

------
empath75
Not a physicist or a scientist, but I do follow the news, and I think there's
a lot happening 'under the surface' with people working on mathematical
foundations (category theory, complexity theory, homotopy type theory) that
are going to surface into physics in unexpected ways in the fairly near
future.

------
saulwiggin
I just finished a PhD in Transformation Optics and Metamaterials.
Metamaterials are a genuinely novel breakthrough in physics with wide-ranging
applications for antennas and electromagnetic materials. Not to mention the
recent confirmation of the Higgs boson. Physics is moving along at a similar
pace.

------
batou
As someone dipping into physics very late after a long journey on mathematics,
this thread is both depressing and terrifying yet at the same time strangely
motivating.

I think there are some big questions without answers still. I want to have a
bash at them.

------
brudgers
_Lex III: Actioni contrariam semper et æqualem esse reactionem: sive corporum
duorum actiones in se mutuo semper esse æquales et in partes contrarias
dirigi._

Only if something is moving backward.

------
yk
That depends: how do you quantify progress, or the rate of progress, in
physics? Comparing directly to technology, Moore's law seems to indicate an
accelerating pace. On the other hand, looking back to the 90s and progress in
computer games, where you could expect a never-before-seen breakthrough each
year, it seems that computers have hit a point of diminishing returns, and
probably the important metric is something like the log of computing power.
So the rate of progress is not well defined, and it is possible to argue that
it has slowed down even for computing power. The counterexample would be
digital video, where there was very little progress for the average user
until DivX, and since the early 2000s we went from cut scenes at 320x240 to
YouTube and 4K video.

Putting the measurement problems aside, progress in physics seems to be a lot
less smooth, and the big jump occurred in the first three decades of the last
century with the discovery of special relativity and quantum mechanics, plus
the ongoing project of formalizing physics. That was a complete paradigm
shift towards mathematical models and towards an entirely different picture
of reality. Since then, the development of quantum field theories has
basically just reused the same trick as quantum mechanics. (Not trying to
belittle the development of QFT, which is one of the monumental achievements
of the human mind; it just pales in comparison to the development of QM.) So
in this view, the next big jump may be just around the corner, or it may not
be possible for a human brain, but we will only know the answer after the
progress has happened. (I should cite one of the famous philosophers of
science here; unfortunately I forgot which one.)

As an example, string theory is currently not even wrong, because we cannot
build any known experiment that would enable us to test it. However, a lot of
brain power and ink has been expended on its development over the last thirty
years, and we simply do not have a good idea whether it was worthwhile. If
someone suggests an experiment that can distinguish between string theory and
other models of quantum gravity, and if string theory passes this test, then
it was probably worthwhile to spend all that effort.

In conclusion, I would argue that the question is ill-defined and furthermore
runs into epistemological problems; that is, even if we found a good
definition, we could not really know the answer. However, I am actually quite
optimistic that a breakthrough is just around the corner. For example, I
think that the connection between information theory and physics is not
really understood, but concepts like entropy and information seem to crop up
everywhere one looks.

------
redwood
There are a number of unexplained phenomena, easy to forget about but
nevertheless essentially waiting on "new physics" to model.

------
ape4
Maybe physics is in a (relative) plateau now, just waiting for the big
breakthrough, e.g. the ability to manipulate dark matter.

------
naturalethic
Nobody here is going to like this comment but I recommend you look into
Electric Universe theory and Plasma Cosmology.

~~~
jboggan
It all sounds rather pretty, but I'm not aware of a single testable
hypothesis associated with it - or at least a testable hypothesis that anyone
is willing to put forward and have immediately demolished. The Rosetta
mission should have turned up some intense magnetic field readings if the
Electric Universe theory were true... but it didn't. Then again, the only
people I know who espouse the theory can't begin to explain what
electromagnetic fields are anyway.

