A brief mention today of a new measurement from the BABAR experimental collaboration, which tests lepton universality (to be explained below) and finds it wanting.

**The Basic Story (Slightly Oversimplified)**

Within the Standard Model of particle physics (the equations that describe and predict the behavior of all of the known particles and forces), the effects of the weak nuclear force on the three leptons — the electron, the muon and the tau — are all expected to be identical. This is called “lepton universality”. This expectation has been tested many times; for instance, the probability that a negatively charged W particle decays to an electron plus an electron anti-neutrino is the same as the probability for it to decay to a muon plus a muon anti-neutrino, and the same as for a tau plus a tau anti-neutrino, to very good precision.

Another place (see Figure 1) to test “lepton universality” is in the decay of a hadron containing a bottom quark to a hadron containing a charm quark plus a lepton plus a corresponding anti-neutrino. (Specifically, the focus here is on hadrons called “mesons” which contain a bottom quark or charm quark, an up or down anti-quark, and, as for all hadrons, lots of gluons and quark-antiquark pairs.)

These decays occur via the weak nuclear force (more precisely, through the effect of a W “virtual particle” [which is not really a particle at all, but a more generalized disturbance in the W field]) so the probability for such a decay in which a tau is produced should be related to the probability for the case where a muon or electron is produced. The meson with a bottom quark is called a B, the meson with a charm quark is called a D, so for shorthand the decays to the three leptons and their corresponding anti-neutrinos are called B → D e ν, B → D μ ν and B → D τ ν.

However, testing lepton universality in this particular context is a little tricky, because the masses of the three leptons are different — because their interactions with the Higgs field are not universal — and in this class of decays, the difference matters. Whereas a W particle has a mass of 80 GeV/c^{2}, so big that the tau lepton’s 1.8 GeV/c^{2} mass is too small to play any role in the W’s decay, the B and D mesons have masses around 5.3 and 1.9 GeV/c^{2} respectively, and the difference between those masses is not that much bigger than the tau’s mass. As a result, although the probabilities of B → D e ν and B → D μ ν are expected to be (and are measured to be) the same, the probability for B → D τ ν is not expected to be the same as for B → D e ν. But, with some difficulty, the difference in these probabilities can be predicted using the Standard Model’s equations. And recently the relative probabilities have been measured, by the BABAR experiment (which studied B mesons produced in the collisions of electrons and positrons at the Stanford Linear Accelerator Center in California; the experiment was shut down in 2008, but data analysis continues). **The measurement differs from the prediction, coming in too high by what can arguably be said to be 3.4 standard deviations (3.4 “σ”).**

That’s the short story. The real story is quite a bit longer, as always.

**Some Additional Important Details**

First, this is not a completely straightforward measurement. τ leptons can’t be measured directly, because they decay very quickly. In this measurement, they are detected only when they decay to electrons or to muons, plus a neutrino and anti-neutrino that go undetected. Wait, you say… how can we tell the difference between B → D e ν and B → D τ ν if the τ then decays to an e plus undetected neutrinos?! The reason is that on average the electron from the τ decay has lower energy (and there are other tricks too). Even better, an electron-positron collider is a precision machine, unlike a hadron collider like the LHC, so the total energy and total momentum of the undetected neutrinos can be inferred from the measurement of everything that *is* detected. If there’s only one neutrino or anti-neutrino, as in B → D e ν, then (because the neutrino mass is so tiny) you would expect the missing momentum and missing energy to satisfy E_{miss} = p_{miss} c to the available precision. If there are three neutrinos or anti-neutrinos, as in B → D τ ν, you expect E_{miss} > p_{miss} c. Technically, we define “missing mass” by m_{miss}^{2} c^{4} = E_{miss}^{2} – p_{miss}^{2}c^{2}; it is substantially non-zero only for the decays to taus. (The name is misleading; it isn’t mass that is missing, but rather the invariant mass of the undetected particles.)
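To make the missing-mass criterion concrete, here is a minimal sketch of that formula in code (with c = 1, so energies, momenta and masses are all in GeV-based units). The numerical inputs below are invented for illustration; they are not BABAR data.

```python
def missing_mass_sq(e_miss, p_miss):
    """Missing mass squared, m_miss^2 = E_miss^2 - p_miss^2 (units of GeV^2, c = 1),
    computed from the missing energy and the magnitude of the missing momentum."""
    return e_miss**2 - p_miss**2

# One undetected (nearly massless) neutrino, as in B -> D e nu:
# E_miss is essentially equal to |p_miss|, so m_miss^2 is consistent with zero.
print(missing_mass_sq(1.20, 1.20))   # 0.0

# Three undetected neutrinos, as in B -> D tau nu followed by tau -> e nu nu:
# E_miss exceeds |p_miss|, so m_miss^2 comes out substantially positive.
print(missing_mass_sq(2.50, 1.80))   # ≈ 3.01
```

In the real analysis m_miss^2 is of course a reconstructed quantity with resolution effects, so the separation between the two decay classes is statistical rather than event-by-event, as described below.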

So in the end it is possible to look for B → D + e + invisible particles, and separate (on average, statistically — see Figure 2) the ones that have a higher energy lepton and no “missing mass”, which come from B → D e ν, from the lower energy ones with “missing mass”, which come from B → D τ ν. Unfortunately, the measurement is made more complicated by the presence of some other unrelated processes that give a similar-looking signal. But these effects are believed to be known to sufficient precision so as not to interfere with the measurement. [Nevertheless, this appears to be one of the weakest points in the measurement — though I don’t yet know all the details of the measurement technique and so I might be wrong.]

Second, I’ve oversimplified; there are two common types of B and D mesons: those with spin 0 are called B and D, while those with spin 1 are called B* and D*. The experimenters have measured both B → D τ ν relative to B → D e ν and B → D* τ ν relative to B → D* e ν. Both come out high, by 2.0 and 2.7 σ respectively. Only when combined together does one obtain the claimed 3.4 σ.

R(D) = ratio of the rate for B → D τ ν to the rate for B → D e ν (equal, by e/μ universality, to the rate for B → D μ ν).

- Measured: 0.440 ± 0.058 ± 0.042;
- Predicted: 0.297 ± 0.017.

R(D*) = ratio of the rate for B → D* τ ν to the rate for B → D* e ν (equal, by e/μ universality, to the rate for B → D* μ ν).

- Measured: 0.332 ± 0.024 ± 0.018;
- Predicted: 0.252 ± 0.003.

Here the first and second error bars on the measurements are statistical and systematic uncertainties, and the error on the prediction indicates known uncertainties in the calculation.
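As a rough cross-check on the quoted significances, one can recompute the deviation in each channel from the numbers above, adding the statistical, systematic and theoretical uncertainties in quadrature. This is a naive treatment — the experiment’s actual fit accounts for correlations between the two channels — but it lands close to the quoted values:

```python
import math

def pull(measured, stat, syst, predicted, theory):
    """Deviation of the measurement from the prediction, in units of the
    combined uncertainty (all three errors naively added in quadrature)."""
    sigma = math.sqrt(stat**2 + syst**2 + theory**2)
    return (measured - predicted) / sigma

rd  = pull(0.440, 0.058, 0.042, 0.297, 0.017)   # near the quoted 2.0 sigma
rds = pull(0.332, 0.024, 0.018, 0.252, 0.003)   # near the quoted 2.7 sigma

# Naively combining the two channels in quadrature, as if independent,
# gives roughly the quoted combined significance of 3.4 sigma.
combined = math.sqrt(rd**2 + rds**2)
print(rd, rds, combined)
```

Again, this back-of-the-envelope combination ignores correlations, which is why it only approximately reproduces BABAR’s 3.4 σ.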

**Implications, If Any**

Is this excess a real effect that indicates a breakdown of the Standard Model? We’ve seen plenty of measurements with 3 σ deviations from the Standard Model prediction in the past year (see here and here for examples), but in some cases it looks as though the measurement may have been wrong, or simply a statistical fluctuation, while in others it looks as though the original theoretical prediction may have been wrong. We can hope this new measurement indicates something real, but **as usual, while it is worth taking note of, it is too early to get very excited**. We need to see confirmation by another experiment and some additional theoretical checks on the predictions.

If it is a real effect, what could cause it? One obvious possible source of a violation of universality could be a new electrically-charged Higgs particle, not predicted in the Standard Model but predicted by many theories that have multiple Higgs fields and particles. (Among them is supersymmetry.) In simple versions of this idea, a charged Higgs virtual particle (which isn’t really a particle, but a more general disturbance in the charged Higgs field) would contribute to these processes (Figure 3), and change the B → D τ ν rate more than the other two, because it interacts much more strongly with the heavier tau lepton than with the electron or muon. BABAR claims that the simplest version of this idea doesn’t fit their data, but there are a lot of subtleties with that remark (in particular the presence of a new particle changes not only the decay probabilities but also the energy distributions shown in Figure 2, so the data has to be reanalysed carefully for each possible type of charged Higgs.)

Are there any other possibilities? Good question. One thing that is worth noting is that this effect (at least as currently measured) is very large — I’m referring here not to its statistical significance of 3.4 σ, which is moderate at best, but to the size of the effect itself, something like a 50% excess over the expectation from the Standard Model prediction. It is not easy to get an effect that big from anything indirect, such as a complicated quantum mechanical process. In fact, even a direct effect from a single virtual particle, along the lines of the charged Higgs idea, has to be on the large side — especially since any simple charged (non-virtual!) particle of this type has to be quite heavy (certainly heavier than 100 GeV/c^{2}, from LEP collider measurements, and probably closer to 150, from LHC measurements). Making a theory with such a large effect consistent with all other existing measurements may be a tall order.
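The “50% excess” comes from simple arithmetic on the central values quoted earlier (uncertainties ignored): the excess is close to 50% for R(D) and somewhat smaller for R(D*).

```python
# Fractional excess of the measured central value over the SM prediction,
# using the R(D) and R(D*) numbers quoted above (uncertainties ignored).
excess_rd  = (0.440 - 0.297) / 0.297   # close to 50% for R(D)
excess_rds = (0.332 - 0.252) / 0.252   # a smaller excess for R(D*)
print(f"{excess_rd:.0%}, {excess_rds:.0%}")   # 48%, 32%
```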

*I thank Andrey Katz for bringing this paper to my attention.*

Thanks a lot for this explanation.

I think the word “calculated” is missing from the sentence “the difference in these probabilities can be …”, in the paragraph above “That’s the short story.”

“Are there any other possibilities?” Bryan Sanctuary, J. F. Geurdes, J. Christian, M. Nagasawa, and others have attacked the physical validity of Bell’s theorem.

http://en.wikipedia.org/wiki/Bell_theorem

If Bell’s theorem is not physically valid, then not only might the Higgs field concept fail but even the Copenhagen interpretation might not be 100% correct.

Irrelevant to this experiment.

wow. I wonder if Belle can squeeze something out by ICHEP 😉

Great post, as usual.

Just a little correction: you mention that lepton universality e/tau has been checked experimentally in W decays to a good precision, but there’s actually a long standing 2-3 sigma deviation between experiment (mainly LEP) and the lepton universality predicted by the standard model. See arXiv:1203.2092 for a recent review on this issue.

Thanks!

Bell’s theorem is related to the foundations of quantum theory, which might be relevant to the existence of the Higgs field and also to every aspect of the Standard Model of particle physics. Is the Higgs field necessary for explaining the masses of the leptons and quarks? According to Prof. (Emeritus) Gerald Rosen of Drexel U., “All lepton and quark masses appear directly related to the electron mass.”

http://home.comcast.net/~gerald-rosen/Pub%20280.pdf “Self-interaction mass formula that relates all leptons and quarks to the electron”, March 13, 2012

Bell’s theorem is just not relevant here.

And Mr. Rosen’s work is one of thousands of similar probably misguided attempts… at most one of which (and probably zero of which) is correct.

The truth about physics shall emerge but the truth about the history of physics might not emerge.

‘In retrospect, I wish we had added the true statement — “after this paper was completed, related work by EB and H was brought to our attention.” We were naïve enough to believe that these other articles offered no threat to our insights or to the crediting of our contribution.’ — Gerald Guralnik

http://arxiv.org/pdf/0907.3466v1.pdf “The History of the Guralnik, Hagen and Kibble development of the Theory of Spontaneous Symmetry Breaking and Gauge Particles”

what about instead Right-handed current?

[see: arXiv:1007.1993, arXiv:0907.2461]

Pingback: LHC Program Evolving: From Broad Searches To Precision Tests | Of Particular Significance
