Superconductivity Theory Under Attack (phys.org)
95 points by dnetesn 4 days ago | 47 comments





One of my great surprises coming into my master's degree in condensed matter theory, working with superconductors, was just how poorly we understand them.

We have these incredibly seductive theoretical frameworks for thinking about superconductors which we know are deeply flawed, but it seems nobody (myself included) has the vision to see an alternate way of looking at them.

The tyranny of the BCS theory of superconductors is real.


I wouldn't so casually dismiss BCS theory. BCS contains one of only a few exactly solvable wave functions. It's used in everything from superconductivity to neutron stars. In fact, anytime there is a Fermi sea instability, BCS is a good starting framework. It's a triumph of modern physics.

There’s a reason I call the BCS theory “seductive”. It’s striking, beautiful and powerful. I also think it’s clear that BCS is far from the whole story when it comes to superconductivity and I also think the huge success of the BCS theory has cast a long shadow which makes developments in different directions more difficult.

> The tyranny of the BCS theory of superconductors is real.

Are you saying that the BCS theory is wrong? That it's limiting people from looking at alternate ways that could also superconduct? Or what?


The BCS theory does a reasonably good job of describing the properties of so-called 'conventional superconductors', which are actually a very small class, dwarfed by the 'unconventional superconductors', which include almost everything high-Tc.

However, BCS is built upon a pile of questionable assumptions and simplifications and there are signs that it’s not even a good theory for describing conventional superconductors. I think of BCS as something of a ‘local maximum’ in the space of theories describing superconductors and we’re all stuck trying to find ways of incrementally improving the theory when really we’re going to have to make a large leap to a quite different description before real progress is made.

It’s like climbing a tree to get to the moon. You get a little closer, but to get any further you’re actually going to have to climb down and do a lot of work on earth before getting any higher.


With the fractional quantum Hall effect, it should be clear at the latest that neither the common electron model nor the BCS superconductivity model can be true.

I follow Stoyan Sarg's "Basic Structures of Matter - Supergravitation Unified Theory" model, which has a very different understanding of SC and solves most problems in physics.

Due to the fact that the electron is also quite different, the FQHE becomes easily explained. The electron is an open, complexly formed, 3-body object with internal oscillations. In a superconductor environment, the overall structure disintegrates and builds a train of electron shells and positrons. The same happens under certain effects that "create positrons". According to BSM, you can't create positrons, only shoot them out of their electron shell.

Superconductivity is only a state of the vacuum in which the energy of the CL node is below a certain energy threshold that sits between the para- and diamagnetic domains. Chapters 2, 3, 4 and 6 explain this in great detail. They are surely 200 pages together, so explaining the details enough for this to make sense is unfortunately outside the scope of what a comment field can provide :)

If you're wondering: the reaction of electron + positron creates a compound particle with the electron + positron mass but neutral charge, as well as photons. It's relatively stable and quite hard to detect.


> With the fractional quantum Hall effect, it should be clear at the latest that neither the common electron model nor the BCS superconductivity model can be true.

I don’t think you’re interpreting the outcome of the FQHE correctly.

> I follow Stoyan Sarg's "Basic Structures of Matter - Supergravitation Unified Theory" model, which has a very different understanding of SC and solves most problems in physics.

I don’t mean to be rude here, but I went and looked up this book and found that it was full of crackpot nonsense, and nearly all the reviews on the Amazon page seemed to be planted fake reviews. So I’m going to be as clear as I can here, in case anyone not in the field gets curious about this POV:

This is nonsense.


I studied philosophy and physics in my youth, and I can distinguish between a genius and a "crackpot" very well. Details are what count; you can get every model to fit roughly if you don't care about the number of constants involved. If the model is correct, it also works for extreme situations. FQHE is the extreme situation that breaks the electron and SC models.

I have not heard any "crackpot" distinguish between classical logic and mathematical logic, because this is something only well-studied people with a background in logic and philosophy of science do. This was one of the hints that made me realize this guy is good, really good.

Science is always 80 years ahead of current discussion: https://www.pnas.org/content/112/24/7426

We are talking about interdisciplinary physics here, it's the top rank.

It took me a year to really understand Stoyan's model. Now I'm in a paradox-free, consistent world view with the fewest possible assumptions you can possibly make. For me, galaxies and the universe as a whole are the logical and deterministic consequence of the lowest number of fundamental particles you can have. The world became a very complex, deterministic machine of utter beauty.

I understand the fine structure constant, 137, relativity (I can only smile at your implementation of it), time, Newtonian mass, magnetic fields, planetary fields (origin), gravity, electron orbit conditions, the periodic table in its fullest (why), quasars, pulsars, globular clusters, periodicity of the redshift, the Lyman-alpha forest, black holes, super-massive black holes (a totally different object), photons, beta particles, ... the list goes on.

I understand the internal discrepancies inside the standard model; I know for a fact that most parts of it are in fact falsified.

If you bring up an error in the math or a logical error in his theory, I would be glad to discuss it in detail. So far, I have not found a problem, and in fact came to the conclusion that this is the first model I can remotely accept as true, because it is complex and not complicated.

Just some weeks ago, another confirmation from a different model, with quite close values, was presented at a physics conference I attended. His was ~10^27 N/m² and in BSM it's 1.3 × 10^26 N/m² of vacuum pressure, a hidden variable in the standard model.


> I studied philosophy and physics in my youth

> It took me a year to really understand Stoyan's model. Now I'm in a paradox-free, consistent world view

...and that Sarg guy published his brilliant solutions almost 20 years ago, but in all that time nobody but you understood how big a genius he is. He even contributed to cold fusion, because yes, that's where he'll finally be understood!

Using the "crackpot index", you and he (on his own page; I won't post a link intentionally) both earn so many points it's not even funny.

Some selections from the "crackpot index" follow:

http://math.ucr.edu/home/baez/crackpot.html

5 points for each mention of "Einstien", "Hawkins" or "Feynmann".

10 points for each claim that quantum mechanics is fundamentally misguided (without good evidence).

10 points for pointing out that you have gone to school, as if this were evidence of sanity.

10 points for beginning the description of your theory by saying how long you have been working on it. (10 more for emphasizing that you worked on your own.)

10 points for claiming that your work is on the cutting edge of a "paradigm shift".

40 points for claiming that the "scientific establishment" is engaged in a "conspiracy" to prevent your work from gaining its well-deserved fame, or suchlike.

40 points for claiming that when your theory is finally appreciated, present-day science will be seen for the sham it truly is. (30 more points for fantasizing about show trials in which scientists who mocked your theories will be forced to recant.)


Do you think it's likely that the Standard Model cannot sufficiently explain superconductivity? Or alternatively, that it can be perfectly explained, but the actual analysis is just super difficult?

I find this interesting because pretty much everything else in day to day life seems to have been explained by the standard model minus gravity.


I would be extremely surprised if we needed anything beyond non-relativistic quantum mechanics coupled to classical electrodynamics to describe the next theory of superconductivity. If there is anything beyond that required, I would be absolutely stunned if it was more advanced than relativistic quantum electrodynamics.

The trouble with condensed matter physics has never been that the underlying phenomena are complicated. We’ve had a more than sufficient grasp of the microscopic physics of materials since the '40s. I could write down the complete ‘theory of everything’ Hamiltonian for a condensed matter system in like two lines. The trouble is not the fundamental building blocks, but how you take a mathematical description of electrons bumping into ions and then generalize that to 10^23 electrons and ions.
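For concreteness, the "two lines" alluded to above are just the standard textbook Hamiltonian of electrons and ions interacting via Coulomb forces (written here for illustration, in Gaussian units):

```latex
H = -\sum_i \frac{\hbar^2}{2m_e}\nabla_i^2
    \;-\; \sum_I \frac{\hbar^2}{2M_I}\nabla_I^2
    \;+\; \frac{1}{2}\sum_{i\neq j}\frac{e^2}{|\mathbf{r}_i-\mathbf{r}_j|}
    \;+\; \frac{1}{2}\sum_{I\neq J}\frac{Z_I Z_J e^2}{|\mathbf{R}_I-\mathbf{R}_J|}
    \;-\; \sum_{i,I}\frac{Z_I e^2}{|\mathbf{r}_i-\mathbf{R}_I|}
```

The five terms are electron kinetic energy, ion kinetic energy, electron-electron repulsion, ion-ion repulsion, and electron-ion attraction. Everything else is the hard part: solving it.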

The game is to make an approximation that gets rid of the stunning complexity of the full theory while still preserving the features that are relevant to the problem you’re trying to solve.

Imagine trying to understand how a modern computer running a video game works, but all you understand is the basics of logic gates. Sure, in principle you could understand what’s going on in terms of bit flips, but it’s hopeless in practice. Interacting, strongly correlated quantum mechanical systems are very literally exponentially more complex than a classical system like a computer.
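To make "exponentially more complex" concrete: a brute-force description of N interacting spin-1/2 particles needs a state vector with 2^N complex amplitudes. A quick back-of-the-envelope sketch (the helper name and byte count are illustrative assumptions, not a real simulation):

```python
def state_vector_bytes(n_spins: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store the full quantum state of n_spins spin-1/2
    particles: one complex128 amplitude (16 bytes) per basis state."""
    return (2 ** n_spins) * bytes_per_amplitude

# 30 spins already need 16 GiB; each additional spin doubles the cost.
print(state_vector_bytes(30) / 2**30, "GiB for 30 spins")
print(state_vector_bytes(50) / 2**50, "PiB for 50 spins")
```

By 50 spins you are in the petabyte range, and a realistic chunk of material has ~10^23 particles; this is why naive classical simulation is hopeless.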


>Imagine trying to understand how a modern computer running a video game works, but all you understand is the basics of logic gates. Sure, in principle you could understand what’s going on in terms of bit flips, but it’s hopeless in practice.

See "Could a Neuroscientist Understand a Microprocessor?":

https://journals.plos.org/ploscompbiol/article?id=10.1371/jo...

(which is actually looking at transistor-level data, not gate-level, but the games examined are simple ones)


Yes, I was thinking of that article when I wrote the above comment. Thanks for the link!

Thank you. This is encouraging me to believe that huge materials science progress can be made in the next 50 years driven partially by computational advances.

That depends on what you mean by computational advances. If you mean advances in computer hardware, I wouldn't hold your breath on that helping much, since these are generally exponentially scaling problems, meaning that tiny incremental improvements in simulation fidelity or system size require a doubling of computational resources.

Quantum computers, if large enough ones ever materialize, could give us a revolution in computational many-body physics because they could actually circumvent this exponential scaling.

The most promising straightforward avenue though is actually just developing better algorithms that make more insightful, appropriate approximations to correctly capture the physics we care about. That may or may not be what you meant by computational advances.


> The trouble is not the fundamental building blocks, but how you take a mathematical description of electrons bumping into ions and then generalize that to 10^23 electrons and ions.

Has anyone tried to generalize it to, say, 10^2 electrons and ions? What is the smallest system that empirically exhibits superconductivity?


Most Quantum Monte Carlo work on Fermionic systems deals with a number of particles on the order of 10^2. The numerical work suggests even such a small number can superconduct, and we have seen experimentally that there are superconducting nano-particles that can be very small indeed. I don’t recall off the top of my head how many atoms are in the smallest ones we’ve seen though.

As far as pen-and-paper theoretical work goes though, something like 10^2 atoms is actually more difficult than 10^23 of them because 10^23 is basically infinite for our purposes so there are all sorts of useful limits one can take. When you consider something like 10^2 atoms, many useful approximations go out the window. You can no longer ignore the physics at the edge of the material since everything is very close to the edge, you can’t make assumptions about homogeneity, statistical arguments about the aggregate behaviour of electrons become unreliable, etc.


We know that machine learning is getting the attention of physicists these days, and although some of these applications look unrealistic or far-fetched, it seems to me from your description that condensed matter physics could be attacked from this direction, and it might be feasible.

You start with 1D problems, move to small 3D volumes, and build from there. ML is good at "interpolating" or guessing what the answer must be from a given input. Maybe, one will need many nested levels of ML models to realistically simulate a solid-state material, but this is not entirely impossible, I think.

If such an ML simulation can be done reasonably efficiently, then likely there could be a theory that can be formulated in terms of equations, approximating the "theory of everything" with sufficient detail.


A lot of people are trying to explore modern ML methods in physics, but they’ve so far had limited success at least in terms of interesting theoretical results. One glaring problem is that ML methods tend to be black boxes, so even if someone solved all of condensed matter physics with some super neural network, the question would remain “how and why was this able to work?” “What is going on physically?” which is what physicists tend to actually care about more than being able to make a prediction.

Believe me though, people are trying.


On the other hand, there is plenty of work on rule extraction from trained neural networks being done. Wouldn't it be amazing if we extracted actual laws of physics from a network trained by observations?

Yes, I know that.

What I'm saying is that if an ML model can approximate quickly the solution, then there could be a simple theory expressed in terms of equations, approximating the full problem. I.e. if there is an efficiently computable procedure, then it's a sign that there might be a good simple approximating theory.


The Standard Model is basically a description of the fundamental forces, and is usually talked about by particle physicists studying things like the Higgs, quarks, etc. In condensed matter (materials), the only relevant interactions are electromagnetism and quantum effects; the strong and weak interactions play no role because of the length scales at play (angstroms to nm) and the energy scales (meV to eV).

Superconductivity involves phonons (vibrations) and electrons, at least in BCS, so you will never really find anyone discussing the 'Standard Model' and superconductivity together, because the former is fundamental, while superconducting materials require consideration of emergent and many-particle effects which come about from having so many particles in a solid. Not sure I cleared anything up, but hope that helps.

Like most of quantum chemistry, I imagine the Standard Model works just fine for simulating superconductivity to whatever degree of accuracy we (practically) need. The issue is finding an algorithm that can efficiently perform the calculation itself. I don't know much about condensed matter physics, but I know that for my own research in the past, something called the fermion sign problem (NP-hard) made accurate quantum chemical calculations extremely difficult.

Thanks for mentioning the term that I was able to find:

https://en.wikipedia.org/wiki/Numerical_sign_problem

https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.94...

"Computational Complexity and Fundamental Limitations to Fermionic Quantum Monte Carlo Simulations, Matthias Troyer and Uwe-Jens Wiese, Phys. Rev. Lett. 94, 170201, 4 May 2005"

"Quantum Monte Carlo simulations, while being efficient for bosons, suffer from the “negative sign problem” when applied to fermions — causing an exponential increase of the computing time with the number of particles. A polynomial time solution to the sign problem is highly desired since it would provide an unbiased and numerically exact method to simulate correlated quantum systems. Here we show that such a solution is almost certainly unattainable by proving that the sign problem is nondeterministic polynomial (NP) hard, implying that a generic solution of the sign problem would also solve all problems in the complexity class NP in polynomial time."

https://arxiv.org/abs/cond-mat/0408370
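The practical consequence of the sign problem can be shown with a toy Monte Carlo experiment: each sampled configuration carries a sign, and as the system grows the fraction of negative signs approaches 1/2, so the average sign (which every estimator gets divided by) decays toward zero. The exponential form of `p_minus` below is an illustrative assumption, not a real fermionic model:

```python
import math
import random

def average_sign(n_particles: int, n_samples: int = 100_000, seed: int = 0) -> float:
    """Toy sign problem: sample n_samples configurations whose signs are
    negative with probability p_minus, which approaches 1/2 as the
    (fictional) particle number grows, and return the mean sign."""
    rng = random.Random(seed)
    p_minus = (1 - math.exp(-0.5 * n_particles)) / 2
    total = sum(-1 if rng.random() < p_minus else 1 for _ in range(n_samples))
    return total / n_samples

# <sign> decays like exp(-0.5 * N) here, so the statistical noise of any
# estimator, after dividing by <sign>, blows up exponentially with N.
for n in (1, 5, 10, 20):
    print(n, average_sign(n))
```

At N = 20 the mean sign is already buried in the sampling noise of 10^5 samples, which is exactly the exponential blow-up in computing time the abstract above describes.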


It should be noted as well that the sign problem shows up for many bosonic systems too (most of the interesting ones, anyway).

NP-hard means that a quantum computer will not solve this problem efficiently either; quantum computers are only known to give efficient solutions for problems in BQP, which is not believed to contain NP-hard problems.

(We do not know how to implement any MA or QMA efficiently.)


These quantum Monte Carlo simulations are only necessary because we don't have quantum simulators [1], so yes while a quantum computer would not necessarily solve the sign problem, it would make it irrelevant.

[1] https://en.wikipedia.org/wiki/Quantum_simulator


QMC computes multiple superimposed versions of classical MC at the same time, giving a polynomial speedup, not good enough for NP hard problems.

QMC cannot speed up simulation of a different quantum system - unless you're simulating a strict subset of your specific machine.

Quantum simulator is not a QMC system - it is designed to exhibit same behavior as a modelled system. That makes it somewhat useless for a system whose properties you do not understand. Any grid of fermions does not act like any other - if it did, BCS version would be accurate.


I don't think you understand what quantum Monte Carlo is or why it works. Nearly everything you've said is just plain incorrect.

Furthermore, I never said quantum simulation was QMC. I said that having quantum computers able to simulate many-body Hamiltonians would make QMC irrelevant, or at the very least would not suffer the same limitations that QMC suffers, specifically for strongly correlated fermion problems.

I suggest getting a stronger grasp on the literature before you go around making claims like this. Whether you know it or not, you're actively spreading misinformation and making the internet a worse place.


I only heard about high temperature superconductor theory through one seminar + lab, and I too got the impression that nobody really understood this stuff -- at least not well enough to make solid predictions, or to explain it to a physics undergrad.

I'm glad somebody else shares this sentiment, and it wasn't just me being dense.


I really wish physics articles would talk about the actual measurements. It sounds like they are using shrunken gnomes to observe electrons. Is it spectroscopy? Simulation? I think the truth would be a lot less exciting than what the article claims, but if you’re going to lie, might as well just write science fiction. Science journalism has never been great, but now it’s intentionally misleading.

The article says they use UV photons to kick electrons off the surface, and then measure the trajectories of the electrons.

That would be angle resolved photoemission spectroscopy (ARPES). Read about it here: https://arpes.stanford.edu/research/tool-development/angle-r...

The TL;DR is that you can kick electrons out of a material with light. You can study what comes off and work back to what the electrons were doing in the sample. At the most basic level it tells you "this many electrons were moving in this direction with this energy".
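The bookkeeping behind that statement is simple photoelectron kinematics: the photon energy minus the work function minus the measured kinetic energy gives the binding energy, and the kinetic energy plus the emission angle give the in-plane momentum. A minimal sketch (the 4.5 eV work function and the function name are placeholders for illustration):

```python
import math

HBAR = 1.054_571_8e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7e-31    # electron mass, kg
EV = 1.602_176_6e-19     # joules per eV

def arpes_inversion(photon_energy_ev: float, kinetic_energy_ev: float,
                    angle_deg: float, work_function_ev: float = 4.5):
    """Recover the binding energy (eV) and in-plane momentum (1/Angstrom)
    of an electron from an ARPES-style measurement."""
    binding_energy = photon_energy_ev - work_function_ev - kinetic_energy_ev
    k_par = (math.sqrt(2 * M_E * kinetic_energy_ev * EV) / HBAR
             * math.sin(math.radians(angle_deg))) * 1e-10  # 1/m -> 1/Angstrom
    return binding_energy, k_par

# A He-I photon (21.2 eV) ejecting a 16.5 eV electron at 30 degrees:
eb, k = arpes_inversion(21.2, 16.5, 30.0)
print(f"E_B = {eb:.2f} eV, k_par = {k:.3f} 1/A")
```

Repeating this over many angles and energies maps out the band structure of the occupied states, which is what the experiments in the article rely on.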


Interesting to see this alongside "Physicists Have Identified a Metal That Conducts Electricity But Not Heat" which is talking about another material where electrons exhibit collective behavior, "moving as a fluid". Perhaps there are a number of situations where electrons coordinate within a metallic lattice, but something additional is required to produce superconductivity. Cooper pairs could be degenerate (or at least very simple) forms of this collective behavior.

Or maybe vanadium oxide is a room temperature superconductor, it's just that any amount of current destroys the effect unless it's really cold, cold enough for everything to stay put.


> ... at 4.2 Kelvin (4.5 degrees above absolute zero, ...)

Is my understanding wrong (or out of date) or the article bad at this point?

I thought the Kelvin scale was designed such that its base is absolute zero so 4.2 Kelvin would be 4.2 degrees above not 4.5?

EDIT: Actually, looking at the num-pad on my keyboard, I'm now assuming that the "4.5" is simply a slip-of-the-finger typo rather than a mistake in understanding.


also,

4.2 K = -452.11 F = -268.95 C
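For anyone who wants to sanity-check those numbers, the conversions are a one-liner each (helper names are just for illustration):

```python
def kelvin_to_celsius(k: float) -> float:
    """Celsius and Kelvin share degree size; only the zero point shifts."""
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    """Fahrenheit degrees are 5/9 the size of Kelvin degrees."""
    return kelvin_to_celsius(k) * 9 / 5 + 32

print(round(kelvin_to_celsius(4.2), 2))     # -268.95
print(round(kelvin_to_fahrenheit(4.2), 2))  # -452.11
```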


Why would you need a quantum computer to simulate behavior that has to do with quantum mechanics?

That's... the whole point of quantum computing, and why Feynman [co-]invented the field. Classical computer simulations of quantum systems cannot scale. Quantum computer simulations of quantum systems can. That means that if you want to simulate a quantum system, you need a quantum computer. Or you can simply observe the quantum system and attempt to infer details of how it works from the observations. Or you could choose not to understand some quantum system, but that's not what scientists and engineers would choose.

My interpretation is slightly different. Classical computers are slow because they need to simulate the collapse of a wave function using probabilities, with something like Monte Carlo. Quantum computers don't simulate anything: you build a replica wave function and then measure it classically, forcing the collapse to happen in nature. But this has an obvious hard limit: you can't simulate more qubits than your quantum computer has.

Probably two things. On one hand, our theoretical understanding of quantum effects may not have come far enough to write a traditional simulation program. Or doing so would take years or even millions of years, or both.

Quantum computers can simulate things that are not possible with traditional computers. These problems are in the complexity class BQP. I want to point out though, that this is very likely a result of our limited understanding, and not something that is theoretically impossible.


How do you know the quantum simulation will match the actual phenomenon? There seems to be a leap of logic there.

How do we know that a classical numerical simulation will match the phenomena? By experimentation.

Experimentation in the quantum world has this nasty habit of changing what you are trying to observe.

Why use a strong violent word - "Attack"? This kind of journalism bothers me. "New research questions Superconductivity Theory" should be sufficient. "Attack" is a poor word choice IMHO.

"Attacking" a theory is a very common expression in the scientific literature.

So what scale of QC would be needed to do this work?

This looks like it could be a huge turning point for the understanding of matter at temperatures we love.


