"Vital" is completely subjective but I'd throw stuff around quantum information into the ring. Maybe you'd consider the loop-hole free Bell tests performed in 2015 and awarded the 2022 Nobel prize to count?
I think the 2022 prize was a nice one, but it could still be considered just tidying up the corners. In the end it just proved that things really work the way most of us have thought they worked for decades.
100% agreed, as I can't think of any one individual since (1) who has done as much for all of science and engineering as he ultimately did; alas, they are not awarded posthumously.
(1) Newton would be a strong contender on a "for all time" basis, but even he would've probably needed to share it with Leibniz, which would have driven him absolutely ~b o n k e r s~, like wet hornet in a hot outhouse mad, LOL.
I think this is (very) inaccurate. It feels more like them trying to jump on a "hot topic" bandwagon (machine learning/AI hype is huge).
Physics as a discipline hasn't really stalled at all. Fundamental physics arguably has, because no one really has any idea how to get close to making experimental tests that would distinguish the competing ideas. But even in fundamental physics there are cool developments like the stuff from Jonathan Oppenheim and collaborators in the last couple of years.
That said "physics" != "fundamental physics" and physics of composite systems ranging from correlated electron systems, and condensed matter through to galaxies and cosmology is very far from dead.
I don't know exactly what they hope to gain by jumping on that bandwagon though; neither the physicists nor the computer scientists are going to value this at all. And dare I say, the general populace associated with the two fields isn't going to either - case in point, this hn post.
If there weren't any Nobel-worthy nominations for physics, maybe skip it? (Although that hasn't happened since 1972 in any field.)
I kinda doubt it. The kind of people who end up nominating people for Nobels or even making the decisions on these aren't really struggling for grant funding.
But the system they have succeeded in optimises for people who can sell themselves well enough to get that funding. These people live and breathe selling themselves for funding. Every buzzword, sexy plot, and dynamic presentation has got them here and it's not like they plan to stop.
There's no need to skip it, there's probably a big backlog from previous shortlists :)
But yeah, they could have passed. That would have been cool.
Also, there's a ton of extremely amazing shit in astronomy, or even photolithography, or simulations of physics (though that's basically what the chemistry prize was this year).
I just briefly looked into what Jonathan Oppenheim is working on, and I’d say he’s part of the problem. More speculative work that might or might not be testable in a distant future.
It used to be that there was some experimental result or other phenomenon that required explanation, which led to a theoretical model that could be tested. That worked very well.
Now there are some theoretical considerations that lead to a theoretical model that can't be tested. It didn't work for Aristotle and it doesn't work for string theorists (and the like).
Why doesn't this experimental result count as requiring explanation?
We know (for example) silver atoms have mass, and that massive objects exert gravity (which we understand as warping of space-time according to GR).
We know that we can put silver atoms in quantum superpositions of being in different positions (for example in a sequential Stern-Gerlach type experiment).
We have (essentially) absolutely no theoretical understanding of what is going on to space-time when a thing with mass is in such a superposition. Quantum mechanics does not successfully model gravity, and general relativity contains no superpositions, so the situation is completely beyond our theoretical understanding. This isn't a theoretical consideration, this is something real that you can do in an undergrad physics lab experiment pretty easily.
Now the problem is that the models we have developed so far to deal with this situation turned out to be (wildly) too difficult for us to test. I think it is very far from clear that the Oppenheim & co model falls into this category; imo it's completely reasonable for them to spend theoretical effort working out what is needed to test their model.
Because it's not an experimental result. There are two disparate experimental results, one about superpositions and one about gravity. There's no experimental result about gravity being or not being in superpositions. What would happen to gravity (if anything) in a double-slit experiment is pure theoretical speculation.
And I readily admit that it would be interesting to know what would happen. But many decades of more or less convoluted hypotheses have proved unfruitful. We need a new way to do fundamental physics, or if possible to go back to the old way, because the current one clearly doesn't work.
It's probably mostly because you have an intuitive idea that there is some concept of "now" which is independent of the observer.
In special relativity this global "now" isn't a thing. It doesn't exist. There is no global now. Different observers who are in different places and/or moving at different speeds will describe different events as simultaneous.
In particular, say we have an observer who sees an event A happening at time 0 and a second event (call it B) at time t, where the distance between them is greater than ct. Then you can find observers who see A happening first, B happening first, or the two happening at the same time. However, all observers will agree that the distance between the events is greater than c times the time between them.
This seems like it would cause problems with causality, but it doesn't, because we required the distance to be greater than c times the time, which means no lightspeed signal could get from A to B. If you allow FTL communication then this escape no longer works, and causality can be explicitly broken.
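To see this numerically, here's a small sketch (not from the thread, just an illustration, in units where c = 1): take event A at the origin and a spacelike-separated event B, apply a Lorentz boost at a few velocities, and watch the time ordering flip while the interval x² − (ct)² stays invariant.

```python
import math

C = 1.0  # work in units where c = 1

def boost(t, x, v):
    """Lorentz boost of an event (t, x) into a frame moving at velocity v."""
    g = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return g * (t - v * x / C**2), g * (x - v * t)

# Event A sits at the origin; event B is spacelike-separated (distance > c*t)
tB, xB = 1.0, 2.0

for v in (0.25, 0.5, 0.75):
    t2, x2 = boost(tB, xB, v)
    interval = x2**2 - (C * t2)**2  # invariant: 3.0 in every frame here
    order = "B after A" if t2 > 0 else ("simultaneous" if t2 == 0 else "B before A")
    print(f"v={v}: t'={t2:+.3f}, x'^2 - (c t')^2 = {interval:.3f} -> {order}")
```

At v = 0.25 the boosted observer still sees B after A, at v = 0.5 the two events are simultaneous, and at v = 0.75 B happens before A; the invariant interval is the same in all three frames.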
Thanks. The reason still seems to be related to the uncertainty principle, although I am not sure.
The same way they explain no-cloning, but it seems to be analogous to identity within a system with interaction from neighboring data.
Ultimately there is no pure independent state. Data always exists within context. Hence causality and spatial preservation (no instant physical teleportation as far as is currently understood). (in very layman's terms)
While the second and third parts of your comment are completely true, the first part
> Entanglement isn't particularly useful for communication
I would say is false. Entanglement lets you do some fun and theoretically useful stuff for communication tasks. At the most basic level, sharing entanglement lets you upgrade a classical communication channel you have into a quantum one (sending 2 bits and burning an entangled pair lets you send a qubit). You can do increasingly fancy stuff if you wish; if you are sufficiently paranoid you might be interested in device-independent cryptography, which is only possible because of entanglement.
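The "2 bits plus one burned entangled pair = one qubit" bookkeeping is teleportation, and it's small enough to check with plain state vectors. A minimal numpy sketch (my own, not from the thread): Alice measures her half of a Bell pair together with the qubit to send, and each of the four outcomes tells Bob which Pauli correction recovers the state exactly.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A random qubit state |psi> that Alice wants to send to Bob
rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# Shared Bell pair |Phi+>; Alice holds its first qubit, Bob the second
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Full 3-qubit state, reshaped so axis 0 = Alice's two qubits, axis 1 = Bob's
state = np.kron(psi, bell).reshape(4, 2)

# Alice measures her two qubits in the Bell basis; the outcome (2 classical
# bits) tells Bob which Pauli correction to apply to his qubit
bell_basis = [
    np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),   # Phi+ -> I
    np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),  # Phi- -> Z
    np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),   # Psi+ -> X
    np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),  # Psi- -> XZ
]
corrections = [I2, Z, X, X @ Z]

probs, fidelities = [], []
for proj, corr in zip(bell_basis, corrections):
    bob = proj.conj() @ state          # Bob's unnormalised post-measurement state
    p = np.linalg.norm(bob) ** 2       # each Bell outcome occurs with probability 1/4
    bob = corr @ (bob / np.linalg.norm(bob))
    probs.append(p)
    fidelities.append(abs(np.vdot(psi, bob)))  # 1.0 (up to a global phase)

print(probs, fidelities)
```

All four outcomes are equally likely, and after the correction Bob holds |psi> with fidelity 1 in every branch; without the 2 classical bits he only has the maximally mixed state, which is why this doesn't beat the no-communication theorem.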
Isn't it true that in key exchange, entanglement isn't used in any way, shape, or form for sending data, but only for determining that there was no eavesdropping on the transmission, because any eavesdropper would collapse the wave function?
So, like you said, entanglement can't be used to send information, but it can be used to detect whether the transmission was secure (I think).
There are many protocols for quantum key distribution/exchange, so it's hard to answer fully without knowing which one you're talking about. That said, there are protocols, like the one invented by Artur Ekert in 1991, which use entanglement in an essential way to transmit the key. Even in the absence of an eavesdropper, the protocol will not work without entanglement. It escapes the no-communication theorem by also requiring some classical communication.
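The statistics behind an Ekert-style protocol can be mimicked with a purely classical Monte Carlo sketch (my own toy version, using the textbook E91 angles, not anything specific from this thread): when Alice's and Bob's measurement settings match, singlet outcomes are perfectly anticorrelated and yield key bits; the mismatched rounds are used to estimate a CHSH value, which should sit near 2√2 and would be degraded by an eavesdropper.

```python
import math
import random

random.seed(1)

A_ANGLES = [0.0, math.pi / 4, math.pi / 2]              # Alice's settings
B_ANGLES = [math.pi / 4, math.pi / 2, 3 * math.pi / 4]  # Bob's settings

def measure_singlet(theta_a, theta_b):
    """Sample one pair of +-1 outcomes with singlet-state statistics:
    P(same outcome) = sin^2((theta_a - theta_b) / 2)."""
    a = random.choice([+1, -1])
    same = random.random() < math.sin((theta_a - theta_b) / 2) ** 2
    return a, a if same else -a

key_a, key_b = [], []
counts = {}  # (a_angle, b_angle) -> (sum of products, n) for correlation estimates

for _ in range(200_000):
    ta, tb = random.choice(A_ANGLES), random.choice(B_ANGLES)
    a, b = measure_singlet(ta, tb)
    if ta == tb:              # matching settings: perfectly anticorrelated
        key_a.append(a)
        key_b.append(-b)      # Bob flips his bit so the keys agree
    else:
        s, n = counts.get((ta, tb), (0, 0))
        counts[(ta, tb)] = (s + a * b, n + 1)

def E(ta, tb):
    s, n = counts[(ta, tb)]
    return s / n

# CHSH combination from the mismatched rounds; |S| -> 2*sqrt(2) for a singlet
S = (E(0.0, math.pi / 4) - E(0.0, 3 * math.pi / 4)
     + E(math.pi / 2, math.pi / 4) + E(math.pi / 2, 3 * math.pi / 4))
print("keys agree:", key_a == key_b, " |S| =", abs(S))
```

Note the classical channel is doing real work here: the basis choices (and Bob's bit flip) have to be communicated classically, which is exactly how the scheme avoids faster-than-light signalling.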
Right, if you expand the scope of the discussion to other areas other than sending bits, there are various ways entanglement is used in various protocols. But none of them utilize entanglement to be able to get a bit from Alice to Bob faster than light can go.
This is a comment I made on hackernews, replying to someone who (IIRC) claimed to be German but (in my opinion) was clearly a Russian astroturfing account.
After I mentioned it they deleted their comments, and they have since all been flagged by the moderators here.
Ah, but have you considered the fact that he's undergone a sex change operation, and was actually originally a female, the birth mother? Elementary, really...
I wonder if this interpretation is a result of attempts to make the model more inclusive than the corpus text, resulting in a guess that's unlikely, but not strictly impossible.
I think it's more likely that this is just an easy way to trick the model. It's seen lots of riddles, so when it sees something that looks like a riddle but isn't one, it gets confused.