UPDATE 10/26: In the original version of this post, I stupidly forgot to include an effect, causing an error of a factor of about 5 in one of my estimates below. I had originally suggested that a recent result using ALEPH data was probably more powerful than a recent CMS result. But once the error is corrected, the two experiments appear to have comparable sensitivity. However, I was very conservative in my analysis of ALEPH, and my guess concerning CMS has a big uncertainty band — so it might go either way. It’s up to ALEPH experts and CMS experts to show us who really wins the day. Added reasoning and discussion marked in green below.
In Friday’s post, I highlighted the importance of looking for low-mass particles whose interactions with known particles are very weak. I referred to a recent preprint in which an experimental physicist, Dr. Arno Heister, reanalyzed ALEPH data in such a search.
A few hours later, Harvard Professor Matt Reece pointed me to a paper that appeared just two weeks ago: a very interesting CMS analysis of 2011-2012 data that did a search of this type — although it appears that CMS [one of the two general purpose detectors at the Large Hadron Collider (LHC)] didn’t think of it that way.
The title of the paper is obscure: “Search for a light pseudoscalar Higgs boson produced in association with bottom quarks in pp collisions at 8 TeV”. Such spin-zero “pseudo-scalar” particles, which often arise in speculative models with more than one Higgs particle, usually decay to bottom quark/anti-quark pairs or tau/anti-tau pairs. But they can have a very rare decay to muon/anti-muon, which is much easier to measure. The title of the paper gives no indication that the muon/anti-muon channel is the target of the search; you have to read the abstract. Shouldn’t the words “in the dimuon channel” or “dimuon resonance” appear in the title? That would help researchers who are interested in dimuons, but not in pseudo-scalars, find the paper.
Here’s the main result of the paper:
At left is shown a plot of the number of events as a function of the invariant mass of the muon/anti-muon pairs. CMS data is in black dots; estimated background is shown in the upper curve (with top quark backgrounds in the lower curve); and the peak at bottom shows what a simulated particle decaying to muon/anti-muon with a mass of 30 GeV/c² would look like. (Imagine sticking the peak on top of the upper curve to see how a signal would affect the data points). At right are the resulting limits on the rate for such a resonance to be produced and then decay to muon/anti-muon, if it is radiated off of a bottom quark. [A limit of 100 femtobarns means that at most two thousand collisions of this type could have occurred during the year 2012. But note that only about 1 in 100 of these collisions would have been observed, due to the difficulty of triggering on these collisions and some other challenges.]
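The arithmetic behind that bracketed remark can be sketched in a few lines. This is a rough illustration only; the 2012 integrated luminosity of roughly 20 inverse femtobarns is my assumed number (the post quotes only the cross-section limit and the ~1% efficiency):

```python
# Back-of-envelope: convert a cross-section limit into an event count.
# Assumed: 2012 LHC integrated luminosity ~ 20 fb^-1 (my assumption, not stated in the post);
# overall efficiency (trigger + other challenges) ~ 1 in 100, as stated in the text.
sigma_limit_fb = 100.0   # upper limit on the production rate, in femtobarns
luminosity_ifb = 20.0    # assumed 2012 integrated luminosity, in inverse femtobarns
efficiency = 0.01        # fraction of such collisions actually observed

produced = sigma_limit_fb * luminosity_ifb   # collisions that could have occurred
observed = produced * efficiency             # collisions actually recorded

print(produced)  # -> 2000.0
print(observed)  # -> 20.0
```

So a 100-femtobarn limit corresponds to at most about two thousand produced events, of which only about twenty would show up in the data.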
[Note also the restriction of the mass of the dimuon pair to the range 25 GeV to 60 GeV. This may have been done purely for technical reasons, but if it was due to theoretical assumptions, that restriction should be lifted.]
While this plot places moderate limits on spin-zero particles produced with a bottom quark, it’s equally interesting, at least to me, in other contexts. Specifically, it puts limits on any light spin-one particle (call it V) that mixes (either via kinetic or mass mixing) with the photon and Z and often comes along with at least one bottom quark… because for such particles the rate to decay to muons is not rare. This is very interesting for hidden valley models specifically; as I mentioned on Friday, new spin-one and spin-zero particles often are produced together, giving a muon/anti-muon pair along with one or more bottom quark/anti-quark pairs.
But CMS interpreted its measurement only in terms of radiation of a new particle off a bottom quark. Now, what if a V particle decaying sometimes to muon/anti-muon were produced in a Z particle decay (a possibility alluded to already in 2006)? For a different production process, the angles and energies of the particles would be different, and since many events would be lost (due to triggering, transverse momentum cuts, and b-tagging inefficiencies at low transverse momentum), the limits would have to be fully recalculated by the experimenters. It would be great if CMS could add such an analysis before they publish this paper.
Still, we can make a rough back-of-the-envelope estimate, with big caveats. The LHC produced about 600 million Z particles at CMS in 2012. The plot at right tells us that if the V were radiated off a bottom quark, the maximum number of produced V’s decaying to muons would be about 2000 to 8000, depending on the V mass. Now if we could take those numbers directly, we’d conclude that the fraction of Z’s that could decay to muon/anti-muon plus bottom quarks in this way would be 3 to 12 per million. But the sensitivity of this search to a Z decay to V is probably much lower than for a V radiated off bottom quarks [because (depending on the V mass) either the bottom quarks in the Z decay would be less energetic and more difficult to tag, or the muons would be less energetic on average, or both.] So I’m guessing that the limits on Z decays to V are always worse than one per hundred thousand, for any V mass. (Thanks to Wei Xue for catching an error as I was finalizing my estimate.)
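The first step of that estimate — before the efficiency caveats — is just a ratio, which can be written out explicitly (the numbers come straight from the text; the small mismatch with “3 to 12 per million” is rounding):

```python
# Rough limit on the fraction of Z decays giving a V -> muon/anti-muon plus bottom quarks,
# taking the CMS numbers at face value (no correction yet for the lower sensitivity
# to Z -> V production discussed in the text).
n_z_produced = 600e6            # Z particles produced at CMS in 2012 (from the text)
n_v_min, n_v_max = 2000, 8000   # max produced V's decaying to muons, depending on V mass

frac_min = n_v_min / n_z_produced   # smallest limit, as a fraction of all Z's
frac_max = n_v_max / n_z_produced   # largest limit

print(round(frac_min * 1e6, 1), "to", round(frac_max * 1e6, 1), "per million")
```

The text’s guess that the true limit on Z decays to V is worse than one per hundred thousand then comes from degrading these numbers by the (unknown, but substantial) loss of sensitivity for the Z-decay production process.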
If that guess/estimate is correct, then the CMS search does not rule out the possibility of a hundred or so Z decays to V particles at each of the various LEP experiments. That said, old LEP searches might rule this possibility out; if anyone knows of such a search, please comment or contact me.
As for whether Heister’s analysis of the ALEPH experiment’s data shows signs of such a signal, I think it unlikely (though some people seemed to read my post as saying the opposite.) As I pointed out in Friday’s post, not only is the excess too small for excitement on its own, it also is somewhat too wide and its angular correlations look like the background (which comes, of course, from bottom quarks that decay to charm quarks plus a muon and neutrino.) The point of Friday’s post, and of today’s, is that we should be looking.
In fact, because of Heister’s work (which, by the way, is his own, not endorsed by the ALEPH collaboration), we can draw interesting if rough conclusions. Ignore for now the bump at 30 GeV/c²; that’s more controversial. What about the absence of a bump between 35 and 50 GeV/c²? Unless there are subtleties with his analysis that I don’t understand, we learn that at ALEPH there were fewer than ten Z decays to a V particle (plus a source of bottom quarks) for V in this mass range.
That limits such Z decays to about 2 to 3 per million. OOPS: Dumb mistake!! At this step, I forgot to include the fact that requiring bottom quarks in the ALEPH events only works about 20% of the time (thanks to Imperial College Professor Oliver Buchmuller for questioning my reasoning!) The real number is therefore about 5 times larger, more like 10 to 15 per million. If that rough estimate is correct, it would provide a constraint roughly comparable to the current CMS analysis.
[[BUT: In my original argument I was very conservative. When I said “fewer than 10”, I was trying to be brief; really, looking at the invariant mass plot, the allowed number of excess events for a V with mass above 36 GeV is typically fewer than 7, or even 5. And that doesn’t include any angular information, which for many signals would reduce the number to about 3. Including these effects properly brings the ALEPH bound back down to something close to my initial estimate. Anyway, it’s clear that CMS is nipping at ALEPH’s heels, but I’m still betting they haven’t passed ALEPH — yet.]]
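The corrected ALEPH estimate can also be laid out as simple arithmetic. One assumption here is mine, not from the post: that ALEPH recorded roughly 4 million hadronic Z decays at LEP, which is what makes “fewer than ten events” translate into “2 to 3 per million”:

```python
# Rough ALEPH limit on Z -> V (plus a source of bottom quarks), 35-50 GeV mass range.
# Assumed: ALEPH recorded ~4 million hadronic Z decays (my assumption, not stated in the post).
n_z_aleph = 4e6         # hadronic Z decays recorded by ALEPH (assumed)
n_excess_max = 10       # fewer than ~10 allowed signal events in the 35-50 GeV range
btag_efficiency = 0.20  # requiring bottom quarks works only ~20% of the time

naive_limit = n_excess_max / n_z_aleph           # the original, mistaken estimate
corrected_limit = naive_limit / btag_efficiency  # ~5x larger once b-tagging is included

print(round(naive_limit * 1e6, 1), "per million (naive)")      # ~2.5 per million
print(round(corrected_limit * 1e6, 1), "per million (corrected)")  # ~12.5 per million
```

The corrected figure of roughly 12 per million sits inside the “10 to 15 per million” range quoted above, and tightening the allowed excess from 10 events down toward 3 to 5 (as argued in the green text) would pull it back down by a further factor of 2 to 3.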
So my advice would be to set Heister’s bump aside and instead focus on the constraints that one can obtain, and the potential discoveries that one could make, with this type of analysis, either at LEP or at LHC. That’s where I think the real lesson lies.