
LHCb discovers matter-antimatter asymmetry in charm quarks - rbanffy
https://www.symmetrymagazine.org/article/lhcb-discovers-matter-antimatter-asymmetry-in-charm-quarks
======
greesil
"The idea that matter and antimatter particles behave slightly differently is
not new and has been observed previously in studies of particles containing
strange quarks and bottom quarks. What makes this study unique is that it is
the first time this asymmetry has been observed in particles containing charm
quarks."

~~~
pytyper2
What is the difference and what was this information blocking us from
discovering?

~~~
crdrost
So we have these matter particles that are defined by three numbers, a weak
hypercharge Y in the set {1, 1/3, -1/3, -1}, a weak isospin T in the set {1/2,
-1/2}, and a ‘generation’ in the set {0, 1, 2}. The electric charge is a
derived quantity Q = T + Y/2 that does not depend on generation. For
generation 0 we have the particles,

    
    
          T     Y     Q    name
        -----|-----|-----|----------
        -1/2   -1    -1    electron
        -1/2  -1/3  -2/3   antiup quark
        -1/2   1/3  -1/3   down quark
        -1/2    1     0    antineutrino
         1/2   -1     0    neutrino
         1/2  -1/3   1/3   antidown quark
         1/2   1/3   2/3   up quark
         1/2    1     1    positron
    

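(Editor's aside, not part of the original comment: the Q column above is pure arithmetic, and it is easy to check mechanically. A quick sketch verifying Q = T + Y/2 against the table, with the particle list transcribed from it:)

```python
# Verify the derived charge Q = T + Y/2 for the generation-0 table.
from fractions import Fraction as F

particles = [
    # (T, Y, expected Q, name) -- transcribed from the table above
    (F(-1, 2), F(-1),    F(-1),    "electron"),
    (F(-1, 2), F(-1, 3), F(-2, 3), "antiup quark"),
    (F(-1, 2), F(1, 3),  F(-1, 3), "down quark"),
    (F(-1, 2), F(1),     F(0),     "antineutrino"),
    (F(1, 2),  F(-1),    F(0),     "neutrino"),
    (F(1, 2),  F(-1, 3), F(1, 3),  "antidown quark"),
    (F(1, 2),  F(1, 3),  F(2, 3),  "up quark"),
    (F(1, 2),  F(1),     F(1),     "positron"),
]

for T, Y, Q_expected, name in particles:
    Q = T + Y / 2
    assert Q == Q_expected, name
    print(f"{name:16s} T={T}  Y={Y}  Q={Q}")
```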
The weak interaction has to preserve these quantum numbers, so for example
when a free neutron (up-down-down) turns into a proton (up-up-down) and an
electron, as it will do if you leave it alone for about 15 minutes, then one
of the down quarks is turning into an up quark and an electron. This preserves
electric charge but it does not preserve the underlying quantum numbers, so it
requires emitting an antineutrino. [The fact that you need 4 particles total
is part of why it takes a long time on the order of minutes; in this case the
Feynman diagram vertices only have three lines going in/out and so creating a
4-particle state requires two of them, in the middle you have a W- boson,
(T=-1, Y=0).]

And then there are some things which are not present but fit the quantum
numbers, for example many grand unified theories predict something which is
spectacularly unobserved called “proton decay” where an up (1/2, 1/3) could
hypothetically annihilate with a down (-1/2, 1/3) to generate an antiup (-1/2,
-1/3) plus a positron (1/2, 1)—this would manifest as a proton decaying into a
neutral pion (up-antiup) plus a positron, which would presumably be hugely
energetically favored (protons have several times the mass of pions and
electron/positron masses are negligible)... this sort of decay does not have a
way to happen in the standard model because there is no interim (0, 2/3)
particle to sit between the two vertices.
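(Editor's aside, not part of the original comment: the bookkeeping for both processes can be checked mechanically with the (T, Y) pairs from the table. A small sketch:)

```python
# Check that both processes discussed conserve T and Y.
# Particles are (T, Y) pairs from the generation-0 table.
from fractions import Fraction as F

d        = (F(-1, 2), F(1, 3))
u        = (F(1, 2),  F(1, 3))
anti_u   = (F(-1, 2), F(-1, 3))
e        = (F(-1, 2), F(-1))
nu_bar   = (F(-1, 2), F(1))
positron = (F(1, 2),  F(1))

def totals(state):
    """Sum T and Y over a list of (T, Y) particles."""
    return (sum(p[0] for p in state), sum(p[1] for p in state))

# Neutron beta decay at the quark level: d -> u + e- + antineutrino.
assert totals([d]) == totals([u, e, nu_bar])

# Hypothetical proton decay: u + d -> antiup + positron.
# T and Y balance; only the missing (0, 2/3) mediator forbids it.
assert totals([u, d]) == totals([anti_u, positron])
print("both processes conserve T and Y")
```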

Anyway, the next generations up are basically copies of the same 8 matter
particles, with "electron" replaced by "muon" and then "tau", "neutrino"
replaced by "mu neutrino" and then "tau neutrino," "down" replaced by
"strange" and then "bottom", and "up" replaced by "charm" and then "top". The
down and up quarks typically cannot decay into anything without some antidown
or antiup quarks sitting around to annihilate with them, though again, this is
not 100% obvious from the table above, as the case of proton decay shows. So
that we have observed this with strange and bottom quarks is two out of our
four possibilities.

So what this makes very clear is that the CP-violations are _not_ something
specific to the (-1/2, 1/3) / (1/2, -1/3) antiparticle pairs that are called
(anti-)down, (anti-)strange, (anti-)bottom in the three generations. It is not
some sort of physics phenomenon that requires these two signs to be opposite;
it has now been observed in the (-1/2, -1/3) / (1/2, 1/3) antiparticle pairs,
too. Assuming that its presence in the bottom quark means that this asymmetry
crosses generation lines, we are all but assured that the much harder to
measure top quarks would also display the asymmetry, and it is something very
fundamental, rather than some as-yet-unappreciated aspect of the coupling of
isospin to hypercharge.

~~~
sprayk
Thank you so much for this. Not only did this description help me understand
what this discovery actually meant, it clarified and tied together a lot of
what I have been trying to learn through occasional reading and youtube videos
on the subject. The table of particles and associated values and the
description of the rules of decay bridged a huge gap I've had in understanding
all of this for a really long time.

When you say "The fact that you need 4 particles total is part of why it takes
a long time on the order of minutes...", does it take a long time because
there are significantly fewer decays (described by Feynman diagrams?) from a
lone neutron that result in a proton, anti-neutrino, and electron than there
are decays that end up back at a neutron?

~~~
crdrost
So like it wouldn't be a decay if it went neutron → neutron, if that makes
sense. There is one “main” diagram which goes neutron → neutron and it looks
like a straight line with no vertices and it is by far the most probable
thing, most neutrons just stay neutrons.

So there are two reasons that a free neutron outside of a nucleus takes so
long to become a proton, and you can kind of visualize it like pulling a
molecule of air through an air filter or so, the first reason that this
particular setup takes so long is that this particular air filter is really
thick, and the second reason is that the fan you're using is not very strong.

The “wall being thick” has to do with this intermediate particle, and that’s
what I was alluding to above. The wall is thick because you need to create
this W- boson. The problem is that this boson has roughly 85 times the mass of
the neutron itself, call it Bohb because it’s a Big Ol’ Honking Boson. There's
just nowhere near the energy in the system to create this thing directly. And
in quantum mechanics that is _okay_ because quantum systems can “tunnel”
through states that they cannot directly actually occupy: but it generally
takes longer and longer the more and more energy you need to borrow, and this
is a lot of energy to borrow.

The other thing is the weak blower, and that has to do with what “pressure” or
“energy difference” drives the decay. In this case the driver is the mass
difference: down-quarks are just intrinsically about 2 MeV heavier than up-
quarks and that is enough to cover the 0.5 MeV of an electron and a neutrino,
so you have something like 1.5 MeV left over to spread across the universe. By
itself that number doesn't mean anything, though—what means something is the
ratio of the initial to the final masses, which is something like 939.57 MeV :
938.78 MeV, so the final mass is only 0.08% lighter than the initial mass. The
reaction rate goes like some high power—a fifth or sixth power—of this ratio,
so when one side has like half the mass of the other side then the reaction
happens very very fast because there is so much pressure driving it. But in
this case the masses are so close to equal that the reaction takes something
like hundreds of times longer than you might otherwise expect from just the
thickness of the barrier alone.
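(Editor's aside, not part of the original comment: the fifth-power dependence on the energy released is known as Sargent's rule, and the numbers above make a quick back-of-the-envelope sketch. The "50%-mass-release" comparison is my own hypothetical illustration:)

```python
# Sketch of the "weak blower": beta-decay rates scale roughly as the
# fifth power of the energy released (Sargent's rule), so the tiny
# neutron-to-final-state mass gap suppresses the rate hard.
m_initial = 939.57   # neutron mass, MeV
m_final   = 938.78   # proton + electron mass, MeV (neutrino ~ 0)

q_value = m_initial - m_final    # ~0.79 MeV released
fraction = q_value / m_initial   # ~0.08% of the initial mass

# Compare with a hypothetical decay releasing half the initial mass:
suppression = (0.5 * m_initial / q_value) ** 5
print(f"energy released: {q_value:.2f} MeV ({fraction:.2%} of the mass)")
print(f"rate suppression vs. a 50%-mass-release decay: ~{suppression:.0e}x")
```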

------
Maro
Quote:

These observations have confirmed the pattern of CP violation described in the
Standard Model by the so-called Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix,
which characterises how quarks of different types transform into each other
via weak interactions. The deep origin of the CKM matrix, and the quest for
additional sources and manifestations of CP violation, are among the big open
questions of particle physics. The discovery of CP violation in the D0 meson
is the first evidence of this asymmetry for the charm quark, adding new
elements to the exploration of these questions.

------
saagarjha
Unrelated, but

> Precision studies of antihydrogen atoms, for example, have shown that their
> characteristics are identical to hydrogen atoms to beyond the billionth
> decimal place.

the wording for this is a bit ambiguous, since it’s not clear if the decimal
place is for the billionths or if it’s the actual billionth place to the right
of the decimal point.

~~~
lmilcin
I guess to one billionth, because nothing can be measured to the billionth
decimal place.

~~~
saagarjha
Yeah, I’m sure that’s what they meant, but I still feel like it could have
been worded better…

~~~
dhimes
Agree. Also, "billion" is ambiguous across countries. So, 10^{-9} would have
been great (or is it 10^{-12}?).

~~~
Navarr
In which countries is 10^12 still billion?

My immediate research only turns up that it used to be this in British English

~~~
lagadu
Every non-English speaking one, in my very anecdotal and limited experience.
I've never seen it used like that outside the US and sometimes UK.

No offense meant but it's usually shown as an example of the "Americans can't
count" stereotype: 1,000,000,000 is a thousand millions.

~~~
jerf
Both systems are broken. The British system is a bit more nominally
consistent, but the names are poor; million and milliard are stupidly
phonetically similar for what they are (generally we prefix those differences,
probably because it's the ends of words that tend to get slurred, hence we
have milli-meters and not meter-millis), and staggering the names out two
chunks of 3 at a time doesn't really make much sense. It isn't very user-
focused.

The American naming system uses roots more clearly and doesn't have the weird
pointless bundling together of two groups of 3, _but_ it has a bizarre off-by-
one issue... and not the usual 0->1 or 1->0 issue, but a 1->2 issue. For
consistency, the order ought to be "ones, millions, billions, trillions,
quadrillions", so that the prefix on the digit counter indicates the number of
factors of 1000 in question, from zero, one, bi=two, tri=three, etc. However,
"thousands" get stuck in there wrecking the whole thing up, so where the names
say you have 1 group of 1000, you in fact have two, and so on.
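(Editor's aside, not from the original comment: the off-by-one can be written down directly. A small sketch of the two naming rules as actually used, where the Latin prefix N names 10^(3N+3) in the short scale but 10^(6N) in the long scale:)

```python
# Short (American) scale: the prefix N ("bi" = 2) names 10**(3*N + 3),
# not 10**(3*N) -- the off-by-one described above.
# Long scale: the same prefix names 10**(6*N), with "-illiard" at
# 10**(6*N + 3).
prefixes = {1: "m", 2: "b", 3: "tr", 4: "quadr"}

def short_scale(n):
    return 3 * n + 3                     # billion (n=2) -> 10**9

def long_scale(n, illiard=False):
    return 6 * n + (3 if illiard else 0) # billion (n=2) -> 10**12

for n, p in prefixes.items():
    print(f"{p}illion: short 10^{short_scale(n)}, "
          f"long 10^{long_scale(n)}, "
          f"{p}illiard 10^{long_scale(n, True)}")
```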

It's not an imperial vs. metric sort of thing, it's more an arguing which is
the "real" temperature, Fahrenheit or Celsius, when in fact the answer is
basically neither because the "real" temperature scale ought to have its 0 at
absolute zero, like Kelvin [1] or the lesser-known Rankine [2], which is
basically "Kelvin, except the degree is 1 degree Fahrenheit". These are both
more "real" because now you can add and subtract temperatures meaningfully,
which you can't do with either of Fahrenheit or Celsius. And likewise, neither
number system is abstractly all that great. But then, that's part of why we
have scientific notation.
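(Editor's aside, not from the original comment: the "add and subtract meaningfully" point is easy to demonstrate with the standard conversions. The boiling-point comparison is my own illustration:)

```python
# Kelvin and Rankine are absolute scales, so ratios and differences of
# temperatures make physical sense. Standard conversions:
def c_to_k(c): return c + 273.15   # Celsius -> Kelvin
def f_to_r(f): return f + 459.67   # Fahrenheit -> Rankine
def k_to_r(k): return k * 9 / 5    # Kelvin -> Rankine

# 100 C is NOT "twice as hot" as 50 C on any absolute scale...
print(c_to_k(100) / c_to_k(50))   # ~1.15, not 2
# ...and the same physical ratio comes out in Rankine:
print(f_to_r(212) / f_to_r(122))  # ~1.15 as well
```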

[1]: Which I just learned is about to be redefined, as of May 20th, 2019:
[https://en.wikipedia.org/wiki/Kelvin#2019_redefinition](https://en.wikipedia.org/wiki/Kelvin#2019_redefinition)

[2]:
[https://en.wikipedia.org/wiki/Rankine_scale](https://en.wikipedia.org/wiki/Rankine_scale)

~~~
ajuc
What's wrong with the long scale? It's pretty consistent. Each name is the
last times 1000, each new prefix is 1 000 000 times the previous one, and each
-iliard is 1000 times -ilion with same prefix.

    
    
        million = 10^6, milliard = 10^9
        billion = 10^12, billiard = 10^15
        trillion = 10^18, trilliard = 10^21
        quadrillion = 10^24, quadrilliard = 10^27, ...
    

Compared to that American way is just insane (as always).

But I agree that writing 1.23 * 10^6 is preferable.

~~~
jerf
To put what I said another way, it emphasizes the wrong things. Nobody cares
how many collections of 6 digits something has. 3 digits are what people care
about. (Except where they care about 4, or actually vary the number of digit
groupings they care about depending on where they appear, but then, this
debate has no meaning to them anyhow.)

It's consistent, but it's consistent with something that doesn't match
people's usage.

Both systems are roughly equally broken, so either side mocking the other for
their number system is a display of parochialism above and beyond the usual
levels one would see.

~~~
toast0
> Nobody cares how many collections of 6 digits something has. 3 digits are
> what people care about.

I respectfully offer the Lakh [1] (1,00,000) and the Crore [2] (1,00,00,000)
for other things people care about. :)

[1] [https://en.wikipedia.org/wiki/Lakh](https://en.wikipedia.org/wiki/Lakh)
[2] [https://en.wikipedia.org/wiki/Crore](https://en.wikipedia.org/wiki/Crore)
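(Editor's aside, not from the original comment: the grouping behind lakh and crore is the last three digits first, then pairs. A sketch of a formatter, my own illustration:)

```python
# Format an integer with Indian digit grouping: last three digits,
# then groups of two (lakh = 1,00,000; crore = 1,00,00,000).
def indian_group(n):
    s = str(n)
    if len(s) <= 3:
        return s
    head, tail = s[:-3], s[-3:]
    parts = []
    while len(head) > 2:
        parts.append(head[-2:])
        head = head[:-2]
    if head:
        parts.append(head)
    return ",".join(reversed(parts)) + "," + tail

print(indian_group(100000))    # 1,00,000    (one lakh)
print(indian_group(10000000))  # 1,00,00,000 (one crore)
```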

------
ainar-g
99.9999% is almost 5σ, right? Have we finally discovered New Physics?

Edit: The CERN[1] article says that it is in fact 5.3σ.

[1]: [https://home.cern/news/press-release/physics/lhcb-sees-new-flavour-matter-antimatter-asymmetry](https://home.cern/news/press-release/physics/lhcb-sees-new-flavour-matter-antimatter-asymmetry)
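(Editor's aside, not from the thread: the sigma-to-confidence conversion is a one-liner with the standard library's error function:)

```python
# Convert a "sigma" level to a two-sided confidence level for a
# normal distribution, using math.erf from the standard library.
from math import erf, sqrt

def confidence(sigma):
    """Probability mass of a normal distribution within +/- sigma."""
    return erf(sigma / sqrt(2))

print(f"3 sigma:   {confidence(3):.5%}")   # ~99.73%
print(f"5 sigma:   {confidence(5):.7%}")   # the discovery threshold
print(f"5.3 sigma: {confidence(5.3):.7%}")
```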

~~~
ymolodtsov
C-symmetry is already broken in SM, so I guess we haven't.

~~~
pif
Not only C: even CP!

------
japhyr
I am unfamiliar with the concept of CP violation, and this description was
really helpful:

[https://www.nevis.columbia.edu/daedalus/motiv/cp.html](https://www.nevis.columbia.edu/daedalus/motiv/cp.html)

------
dukwon
This is a great result and a real milestone in particle physics. CERN is
really ablaze with the news (seriously, there was a fire:
[https://i.imgur.com/i8nkPMR.jpg](https://i.imgur.com/i8nkPMR.jpg))

Stay tuned for more high-profile LHCb results at Moriond tomorrow and on
Tuesday.

~~~
antonvs
> CERN is really ablaze with the news (seriously, there was a fire

Will you be here all week?

------
lelf
Live seminar
[https://webcast.web.cern.ch/event/i807176](https://webcast.web.cern.ch/event/i807176)

Paper [https://cds.cern.ch/record/2668357/files/LHCb-PAPER-2019-006.pdf](https://cds.cern.ch/record/2668357/files/LHCb-PAPER-2019-006.pdf)

------
westurner
So, does this disprove all of supersymmetry?
[https://en.wikipedia.org/wiki/Supersymmetry](https://en.wikipedia.org/wiki/Supersymmetry)

~~~
whatshisface
No, supersymmetry and charge-parity symmetry are different.

~~~
westurner
Ah, thanks.

"CPT Symmetry"
[https://en.wikipedia.org/wiki/CPT_symmetry](https://en.wikipedia.org/wiki/CPT_symmetry)

"CP Violations"
[https://en.wikipedia.org/wiki/CP_violation](https://en.wikipedia.org/wiki/CP_violation)

"Charm quark"
[https://en.wikipedia.org/wiki/Charm_quark](https://en.wikipedia.org/wiki/Charm_quark)
:

> _The antiparticle of the charm quark is the charm antiquark (sometimes
> called anticharm quark or simply anticharm), which differs from it only in
> that some of its properties have equal magnitude but opposite sign._

------
adrianN
It would be very exciting if the Standard Model can't explain this.

~~~
lelf
_In summary, this Letter reports the first observation of a nonzero CP
asymmetry in charm decays, using large samples of D⁰ → K⁻K⁺ and D⁰ → π⁻π⁺
decays collected with the LHCb detector. The result is consistent with,
although at the upper end of, SM expectations, which lie in the range
10⁻⁴–10⁻³ [8–13]. Beyond the SM, the rate of CP violation could be enhanced.
Unfortunately, present theoretical understanding does not allow very precise
predictions to be made, due to the presence of strong-interaction effects
which are difficult to compute. In the next decade, further measurements with
charmed particles, along with possible theoretical improvements, will help
clarify the physics picture, and establish whether this result is consistent
with the SM or indicates the presence of new dynamics in the up-quark sector._

from the paper (links in my other comment). SM = standard model.

~~~
gnulinux
I think a layman summary of this is "it seems consistent with Standard Model,
although at the upper end of it, but we're not sure yet."

