Matt Strassler [July 11, 2012]
Currently, the data from the ATLAS and CMS experiments at the Large Hadron Collider [LHC] combined with theoretical arguments make it seem likely (to me, at least, and to most theorists I’ve spoken with) that the new particle discovered by ATLAS and CMS is a Higgs particle of some type. But the crucially important question is this: Is it the simplest Higgs, the so-called Standard Model Higgs, or not?
A lot is at stake. If it is the simplest type of Higgs, then the Standard Model — the equations that successfully describe all the known elementary particles and a Higgs of the simplest type — may be the complete story of physics at the LHC. There may literally be nothing in the LHC data at the ATLAS and CMS experiments other than one successful test after another of the Standard Model — leaving its many unanswered questions (what determines the strengths of the forces, the masses of the particles, and the detailed patterns of particle decays?) for future generations to ponder. But if there is even the slightest thing about nature’s Higgs particle that is not exactly as predicted for the simplest Higgs, then this just by itself would imply that there are new particles and/or forces not included in the Standard Model. This would be a revolutionary discovery, as the Standard Model (with gravity, dark matter and neutrino masses added on) has been our best bet for four decades. Information about the new forces or particles would then be obtained from further studies of the Higgs and from searches for those particles and forces at the LHC. What we’d learn from these new phenomena is impossible to guess.
There’s a logical problem that you should keep in mind when considering what data can and cannot say about this question. Sure, if the new particle differs substantially from the Standard Model Higgs, we may soon be able to tell from the data; but if the new particle is a Standard Model Higgs, then the only way to tell is to gradually rule out all other possibilities with more and more data. It’s like this: imagine trying to determine whether or not the amount of money in your pocket is 235 dollars and 78 cents. If you do a quick count of your fifty-dollar and twenty-dollar bills, you may easily see right away that you have at least $300, or no more than about $100, and in that case you immediately know that there’s no way you could have $235.78; you’ve ruled out the hypothesis. But to determine that you really have $235.78, no more, no less, requires a careful count of the five-dollar and one-dollar bills and all of your quarters, dimes, nickels and pennies. It may or may not be easy to rule out the hypothesis right away, or at least relatively quickly; but to rule it in to the extent possible requires the maximum amount of work and the highest precision that can be obtained. [And you can’t ever rule it in to the exclusion of all other possibilities, because there’s always the possibility that the hypothesis has been cleverly mimicked; for example, suppose one of your one-dollar bills is a fake that you’d need special equipment to recognize as such? On purely logical grounds, the best you can achieve is to get close enough to rule out all of the most plausible alternatives.] This is why — just from logic — it may take the LHC the rest of the decade to bring us to a fairly convincing (but never beyond-a-shadow-of-a-doubt convincing) conclusion. You can’t learn more than logic allows — sorry.
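The asymmetry in the money analogy — a coarse count can rule the hypothesis out instantly, while ruling it in requires counting every coin — can be put in a few lines of purely illustrative code. The bill counts here are invented; only the $235.78 hypothesis comes from the analogy above:

```python
# Toy version of the money analogy: a quick count of the large bills gives
# a lower bound on the total. If even that lower bound exceeds the
# hypothesized amount, the hypothesis is ruled out with no further work.
# (Bill counts are invented for illustration.)

HYPOTHESIS = 235.78   # dollars: the amount being tested

def lower_bound_from_large_bills(fifties, twenties):
    """A quick lower bound on the total, counting only the large bills."""
    return 50 * fifties + 20 * twenties

lower = lower_bound_from_large_bills(fifties=5, twenties=3)   # $310
if lower > HYPOTHESIS:
    print("ruled out: you have at least", lower)   # no careful count needed
```

Confirming the hypothesis, by contrast, would mean counting every denomination exactly — and even then a counterfeit bill could mimic the answer, which is the "rule in" problem described above.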
Where we stand right now
Roughly, all we can say right now [and we’ll look at the data in a moment] is that
- the data roughly resembles what would be expected of a Standard Model Higgs,
- it is therefore not possible to say the new particle is not a Standard Model Higgs,
- many possible alternatives to the simplest Higgs have now been ruled out by the data, though many others still remain.
There have been a number of separate measurements made already: each one consists of measuring the apparent rate for a Higgs particle to be produced in a particular way and then decay in a particular way. Some of this information in the ATLAS and CMS data is shown in the figure below. (ATLAS does not yet separate the data into as many categories as does CMS, which is why there are fewer points; all production modes are combined on this plot. ATLAS has shown the two-photon data split into categories, but this is not shown here.) Each black point shows the amount of the corresponding Higgs signal, relative to the expectation for a Standard Model Higgs, that one has to add to the prediction of the Standard Model with no Higgs particle in order to match the data. If there’s no Higgs, the data should on average sit at 0; if there’s a Standard Model Higgs, the data should cluster around 1; if there’s some other Higgs particle in nature, the data might lie scattered anywhere at or above 0. Each point also has an uncertainty band in red (CMS) or black (ATLAS), showing the uncertainties, one standard deviation up and down from the black point; clearly some measurements are easier than others and have much smaller uncertainties.
What we see is that not one of the data points differs by three standard deviations (or “sigmas”) from the prediction of the Standard Model Higgs (where the horizontal axis reads “1” — note this is the dashed blue line in the ATLAS figure but lies just to the right of the right edge of the green band in the CMS figure). Nor is there any significant trend in the data to lie well above or well below the expectation. Roughly, this leads to the first two statements in violet above. To get the third statement requires looking at the various alternatives; I’ll comment briefly on this at the very end of the article.
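The "how many sigmas away" comparison behind such a plot is simple arithmetic, sketched below. This is not the experiments' actual statistical fit (which handles asymmetric and correlated uncertainties); the channel names are real but every number is an invented placeholder, not an ATLAS or CMS value:

```python
# Minimal sketch of comparing a measured signal strength mu to the
# Standard Model Higgs expectation mu = 1. All values are invented
# placeholders, NOT the real ATLAS/CMS measurements.

def pull(mu_measured, sigma, mu_expected=1.0):
    """Number of standard deviations between measurement and expectation."""
    return (mu_measured - mu_expected) / sigma

channels = {
    "two photons":  (1.8, 0.5),   # (best-fit mu, one-sigma uncertainty)
    "four leptons": (1.2, 0.6),
    "tau pairs":    (-0.2, 0.8),
}

for name, (mu, sigma) in channels.items():
    p = pull(mu, sigma)
    flag = "consistent" if abs(p) < 3 else "tension"
    print(f"{name:>12s}: mu = {mu:+.1f} +/- {sigma:.1f} -> {p:+.1f} sigma ({flag})")
```

With these invented numbers every channel sits within three sigma of mu = 1, which is the situation the real data were in at the time of writing.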
Are there substantive deviations from the Standard Model Higgs hypothesis?
Before even starting to look further at the data, let us remember that the new results are “preliminary”, which means “subject to small changes before the paper actually is submitted to a journal for publication.” Sometimes small changes can have a bigger effect on statistical significance than you might expect, especially on the more difficult measurements and on measurements with low numbers of events.
With that caveat, what are the largest deviations from the Standard Model Higgs hypothesis?
First, there is a deviation upward in the signal for Higgses that decay to two photons; this is a 2 sigma deviation at ATLAS, and a similar but smaller effect is seen at CMS. In short, the bumps in the two-photon data are more distinct than expected, especially at ATLAS. This could mean that (1) more Higgses are being produced than expected, or (2) more are decaying to photons than expected. It could be a statistical effect that will go away with more data, or it might just mean that the calculation of the production rate for a Standard Model Higgs is a bit too small; since this calculation is actually very difficult and quite subtle, that possibility is a real one. So papers that instead assume this deviation is a sign that the Higgs is decaying to two photons more often than in the Standard Model might be basing their result on a combination of a statistical fluke and a perhaps underestimated theoretical uncertainty. [The very tenuous connections between current results on the Higgs and any particular variant of supersymmetry, which for some bizarre reason were reported in the press, are completely untrustworthy at this point; this is a case of seeing what you want to see long before the evidence is clear.] We need both more data (to reduce the statistical uncertainties) and a careful look at the theoretical predictions; and then what we really need even more is a measurement of the ratio of the two-photon rate to the four-lepton rate, which can be calculated with higher precision and can distinguish the two possibilities… assuming the deviation is still there at all when more data is gathered.
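Why is the ratio of the two-photon rate to the four-lepton rate theoretically cleaner than either rate alone? Roughly, each observed rate is a production rate times a decay branching fraction, so an uncertain overall production factor cancels in the ratio. A toy illustration of that cancellation (every number here is invented, not a real cross-section or branching fraction):

```python
# Toy illustration: each rate = (production) x (branching fraction).
# An unknown overall error in the production calculation drops out of
# the ratio of two rates. All numbers are invented for illustration.

production = 20.0          # computed Higgs production rate (arbitrary units)
theory_error = 1.15        # suppose the true production is 15% higher than computed

br_two_photon  = 0.0023    # illustrative branching fraction to two photons
br_four_lepton = 0.00013   # illustrative branching fraction to four leptons

rate_gg = production * theory_error * br_two_photon    # each rate inherits
rate_4l = production * theory_error * br_four_lepton   # the 15% uncertainty...

ratio = rate_gg / rate_4l  # ...but their ratio depends only on the decays
print(ratio)
```

This is why, in the text above, the ratio measurement can distinguish "more Higgses produced" from "more decays to photons": the first possibility cancels out of the ratio, the second does not.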
Second, there is a deviation downward at CMS in the data used to look for Higgs particles produced in the p p –> q q H process (vector boson fusion, or VBF) [or p p –> H g g; see this post for more discussion of this subtlety] and decaying to a tau lepton/anti-lepton pair. ATLAS sees a deviation downward also, but their result currently has less data and is not specifically focused on the VBF region. Notice that CMS does not see a downward deviation for Higgs particles decaying to taus and produced via non-VBF processes! (Or more precisely, not in the VBF selection region of the data.) But the deviation at CMS for VBF production is quite substantial; the relevant black dot lies at -2 times the Standard Model Higgs hypothesis, i.e. it says that the data is consistent with the Standard Model without the Higgs minus twice what we’d expect from the Higgs. We certainly aren’t producing negative numbers of Higgs particles!
So while the data is not particularly consistent with the Standard Model including the Standard Model Higgs particle signal, the data is also not particularly consistent with the Standard Model without the Higgs. This is a warning sign that we’d better look at the data more closely; the statistical significance of the negative result may be a misleading indicator. [Blindly taking statistical significance as the only quantity of importance in evaluating one’s confidence in a scientific result is a bit like using test scores (such as an SAT score in the U.S.) as your only measure for evaluating how promising students are, or the number of publications or an “h-index” for evaluating the quality of faculty; it may make you feel good to have a quantitative comparative test, but it is not going to give you reliable results if it’s the only thing you consider.]
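The arithmetic behind a "negative" best-fit signal is simple, and a toy sketch makes clear why it is a warning sign rather than a physical statement. The event counts below are invented for illustration; only the resulting value of -2 matches the situation described above:

```python
# Sketch: how a best-fit signal strength can come out negative. If the
# observed count falls below the background expectation, the best-fit
# mu = (observed - background) / expected_signal is negative, even though
# negative Higgs production is meaningless. Counts are invented.

background = 1000.0   # expected background events in the search region
sm_signal  = 25.0     # expected Standard Model Higgs signal events
observed   = 950.0    # a downward fluctuation, or a mismeasured background

mu_best_fit = (observed - background) / sm_signal
print(mu_best_fit)    # -2.0: "the SM without the Higgs, minus twice
                      #  what we'd expect from the Higgs"
```

Note that when the expected signal is tiny compared to the background, as in this channel, even a modest error in modeling the background swings the best-fit mu wildly — which is exactly the concern raised in the text.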
In the figure, you can see the data for the search at CMS for Higgs particles produced via VBF and decaying to taus. The backgrounds are in color; the signal of Standard Model Higgs, multiplied by five, is shown in the white bars; in short, the signal is tiny compared to the background!
The plot at lower right looks for collisions in which the tau and anti-tau decay to a muon and anti-muon; the background, in blue, is huge relative to the signal, dominated by Z particles decaying to muon and anti-muon, and there’s no way to see anything significant here.
The other three plots show a much smaller background; in these cases, where the tau and anti-tau decay in different ways, the background has a big bump that comes from Z particles that decay to tau and anti-tau. You notice that the Higgs signal is not itself a bump; it is a shoulder on the Z peak. And therefore, any mis-measurement of the Z peak can lead to a problem with the Higgs measurement. Is there any hint of such an issue? Yes: In all three plots, especially those at upper right and at lower left, we see excess events at masses below the peak and a deficit of events above the peak (where the small number of Higgs events would be expected), as though the masses of the events that should be in the Z peak are being slightly underestimated.
How are these masses obtained? In each case, the energies and momenta of detected particles are combined with the “missing transverse momentum” (the measured momentum imbalance in the event, ostensibly due to undetected particles but also receiving contributions from mismeasurements of the detected particles) to form the mass. And a Higgs from VBF production will be accompanied by jets (from high-energy quarks) that appear at relatively small angles from the beampipe, in regions where measurements are potentially subject to larger sources of error. So — would I personally be surprised if this preliminary and somewhat hurried result (remember how little time they had between the end of data-taking in mid-June and the presentation of the result in early July) had a minor problem with the mass measurement that has led to a small distortion of the background, one that is currently being interpreted as the absence of a Higgs signal in this channel? No.
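To see how a small momentum mismeasurement distorts such a reconstructed mass, here is a rough sketch. It is not the experiments' actual reconstruction; it simply treats the visible decay products and the missing transverse momentum as massless four-vectors (all values invented) and scales one measured momentum by a few percent:

```python
# Sketch: a reconstructed mass built from visible particles plus missing
# transverse momentum, and the effect of a small momentum mismeasurement.
# All four-vectors are invented for illustration (massless approximation).
import math

def mass(particles):
    """Invariant mass of a list of (E, px, py, pz) four-vectors."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

def four_vector(pt, phi, pz):
    """Massless four-vector from transverse momentum, azimuth, and pz."""
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    return (math.sqrt(px * px + py * py + pz * pz), px, py, pz)

visible = [four_vector(40.0, 0.3, 10.0), four_vector(35.0, 2.8, -5.0)]
met     = four_vector(20.0, 1.5, 0.0)   # longitudinal component unknown: set to 0

nominal = mass(visible + [met])

# Suppose the first visible particle's momentum is mismeasured, 3% too low:
shifted   = [four_vector(40.0 * 0.97, 0.3, 10.0 * 0.97), visible[1]]
distorted = mass(shifted + [met])
print(nominal, distorted)   # the reconstructed mass shifts downward
```

A few-percent momentum error pulls the whole mass distribution downward — the same direction as the apparent migration of events from above the Z peak to below it described above.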
Summing up and looking ahead
All of this is to say that we do see some deviations from the Standard Model Higgs hypothesis, but
- they aren’t even that statistically significant yet
- even if they were, there are reasons to be concerned about uncertainties from other sources than just statistics
And so, for now, until we have a lot more data (and we should have a chunk of additional results from ATLAS soon, and from both experiments in the fall and then in the winter or spring, before the LHC takes a hiatus for most of 2013 and most or all of 2014 for repairs, adjustments, and higher collision energy), all we can say is what I wrote in violet above: nothing rules out the Standard Model Higgs; any theory that predicted a Higgs deviating wildly from the simplest Higgs is now dead; but many other theories remain on the table. Specifically, many theories (including various theories with multiple Higgs particles, theories with accessible “hidden” particles, some variants of supersymmetry models, assorted little Higgs models, and more) have the property that the Higgs particle (or one of several Higgs particles) rather closely resembles the Standard Model Higgs, too closely to be distinguished with current data. Many of these theories remain viable, for now, and only more and more precise measurements will gradually rule them out, until either the Standard Model Higgs hypothesis finally fails, or we reach the point, many years from now, when LHC measurements reach their maximal possible precision with the Standard Model Higgs hypothesis still intact.