Higgs Discovery: Is it a Simplest Higgs?

Matt Strassler [July 11, 2012]

Currently, the data from the ATLAS and CMS experiments at the Large Hadron Collider [LHC] combined with theoretical arguments make it seem likely (to me, at least, and to most theorists I’ve spoken with) that the new particle discovered by ATLAS and CMS is a Higgs particle of some type.  But the crucially important question is this: Is it the simplest Higgs, the so-called Standard Model Higgs, or not?

A lot is at stake.  If it is the simplest type of Higgs, then the Standard Model — the equations that successfully describe all the known elementary particles and a Higgs of the simplest type — may be the complete story of physics at the LHC.  There may literally be nothing in the LHC data at the ATLAS and CMS experiments other than one successful test after another of the Standard Model — leaving its many unanswered questions (what determines the strengths of the forces, the masses of the particles, and the detailed patterns of particle decays) for future generations to ponder.  But if there is even the slightest thing about nature’s Higgs particle that is not exactly as predicted for a simplest Higgs, then this just by itself would imply that there are new particles and/or forces not included in the Standard Model.  This would be a revolutionary discovery, as the Standard Model (with gravity, dark matter and neutrino masses added on) has been our best bet for four decades.  Information about the new forces or particles would then be obtained from further studies of the Higgs and from searches for those particles and forces at the LHC.  What we’d learn from these new phenomena is impossible to guess.

There’s a logical problem that you should keep in mind when considering what data can and cannot say about this question.  Sure, if the new particle differs substantially from the Standard Model Higgs, we may soon be able to tell from the data; but if the new particle is a Standard Model Higgs, then the only way to tell is to gradually rule out all other possibilities with more and more data.  It’s like this: imagine trying to determine whether or not the amount of money in your pocket is 235 dollars and 78 cents.  If you do a quick count of your fifty-dollar and twenty-dollar bills, you may easily and right away see that you have at least $300, or no more than about $100, and in that case you immediately know that there’s no way you could have $235.78; you’ve ruled out the hypothesis.  But to determine that you really have $235.78, no more, no less, requires a careful count of the five-dollar and one-dollar bills and all of your quarters, dimes, nickels and pennies.  It may or may not be easy to rule out the hypothesis right away, or at least relatively quickly; but to rule it in to the extent possible requires the maximum amount of work and the highest precision that can be obtained.  [And you can’t ever rule it in to the exclusion of all other possibilities, because there’s always the possibility that the hypothesis has been cleverly mimicked; for example, suppose one of your one-dollar bills is a fake that you’d need special equipment to recognize as such?  On purely logical grounds, the best you can achieve is to get close enough to rule out all of the most plausible alternatives.]  This is why — just from logic — it may take the LHC the rest of the decade to bring us to a fairly convincing (but never beyond-a-shadow-of-a-doubt convincing) conclusion.  You can’t learn more than logic allows — sorry.
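
The rule-out-versus-rule-in logic above can be sketched in a few lines of code.  This is purely a toy illustration of the pocket-money analogy; the dollar amounts and the $100 cap on small bills and coins are invented for the example.

```python
TARGET = 235.78  # the hypothesis: the pocket holds exactly $235.78

def quick_bounds(fifties, twenties):
    """Coarse count using only large bills: fifties and twenties give a
    lower bound, and (an invented assumption for this toy) at most $100
    of small bills and coins gives a crude upper bound."""
    lower = 50 * fifties + 20 * twenties
    return lower, lower + 100.0

def ruled_out(lower, upper, target=TARGET):
    # A coarse count can only EXCLUDE the hypothesis; it can never confirm it.
    return target < lower or target > upper

lo, hi = quick_bounds(fifties=6, twenties=2)   # at least $340 in large bills
print(ruled_out(lo, hi))    # True: $235.78 is excluded immediately

lo, hi = quick_bounds(fifties=4, twenties=1)   # between $220 and $320
print(ruled_out(lo, hi))    # False: only an exact count can now decide
```

Confirming the hypothesis, by contrast, means counting every coin: the analogue of the LHC’s many years of precision measurements.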

Where we stand right now

Roughly, all we can say right now [and we'll look at the data in a moment] is that

  1. the data roughly resembles what would be expected of a Standard Model Higgs,
  2. it is therefore not possible to say the new particle is not a Standard Model Higgs,
  3. many possible alternatives to the simplest Higgs have now been ruled out by the data, though many others still remain.

There have been a number of separate measurements made already: each one consists of looking to see what the rate appears to be for a Higgs particle to be produced in a particular way and then to subsequently decay in a particular way.  Some of this information in the ATLAS and CMS data is shown in the figure below.  (ATLAS does not yet separate the data into as many categories as does CMS, which is why there are fewer points; all production modes are combined on this plot.  ATLAS has shown the two-photon data split into categories, but this is not shown here.)  Each black point shows the amount of the corresponding Higgs signal, relative to the expectation for a Standard Model Higgs, that one has to add to the prediction of the Standard Model with no Higgs particle in order to match the data.  If there’s no Higgs, the data should on average sit at 0; if there’s a Standard Model Higgs, the data should cluster around 1; if there’s some other Higgs particle in nature, the data might lie scattered anywhere greater than or equal to 0.  Each point also has an uncertainty band in red (CMS) or black (ATLAS), showing the size of the one-standard-deviation uncertainties up and down from the black point; clearly some measurements are easier than others and have much smaller uncertainties.

Results from CMS (left) and ATLAS (right) on various searches for the new particle being produced and decaying in particular ways. With infinite data and perfect measurements and perfect calculations, the black dots would all lie at 1 for a Standard Model with a Higgs particle at a mass of about 125 GeV, and at 0 for a Standard Model with no Higgs particle. (Note for the CMS plot the black vertical line and green bar show the best fit of the data, not the Standard Model Higgs hypothesis at 1; for the ATLAS plot the blue dotted line is the Standard Model hypothesis, not the best fit.) The horizontal bars on each point show the uncertainty on the measurement (a single standard deviation up or down). That none of the points is three or more standard deviations away from 1, and there is no obvious trend among the points away from 1, means that the Standard Model Higgs hypothesis remains viable. See text for discussion of the most deviant points.

What we see is that not one of the data points differs by three standard deviations (or “sigmas”) from the prediction of the Standard Model Higgs (where the horizontal axis reads “1” — note this is the dashed blue line in the ATLAS figure but lies just to the right of the right edge of the green band in the CMS figure).  Nor is there any significant trend in the data to lie well above or well below the expectation.  Roughly, this leads to the first two statements in violet above.  To get the third statement requires looking at the various alternatives; I’ll comment briefly on this at the very end of the article.
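The way “N standard deviations from the Standard Model point” is read off such plots can be made concrete with a one-line calculation.  The measured signal strength and uncertainty below are invented placeholders, not actual ATLAS or CMS numbers.

```python
def n_sigma(mu_measured, uncertainty, mu_sm=1.0):
    """How many standard deviations a measured signal strength lies from
    the Standard Model Higgs expectation (signal strength mu = 1)."""
    return abs(mu_measured - mu_sm) / uncertainty

# A hypothetical two-photon measurement of mu = 1.8 with uncertainty 0.4:
print(n_sigma(1.8, 0.4))   # about 2 sigma: notable, but well short of 3
```

A point at 2 sigma is interesting but inconclusive; only a deviation of 3 sigma or more, or a consistent trend across many points, would begin to threaten the Standard Model Higgs hypothesis.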

Are there substantive deviations from the Standard Model Higgs hypothesis?

Before even starting to look further at the data, let us remember that the new results are “preliminary”, which means “subject to small changes before the paper actually is submitted to a journal for publication.”  Sometimes small changes can have a bigger effect on statistical significance than you might expect, especially on the more difficult measurements and on measurements with low numbers of events.

With that caveat, what are the largest deviations from the Standard Model Higgs hypothesis?

First, there is a deviation upward in the signal for Higgses that decay to two photons; this is a 2-sigma deviation at ATLAS, and a similar but smaller effect is also seen at CMS.  In short, the bumps in the two-photon data are more distinct than expected, especially at ATLAS. This could mean (1) that more Higgses are being produced than expected, or (2) that more are decaying to photons than expected.  It could be a statistical effect that will go away with more data, or it might just mean that the calculation of the production rate for a Standard Model Higgs is a bit too small, and since this calculation is actually very difficult and quite subtle, that possibility is a real one.  So papers that are instead assuming that this deviation is a sign that the Higgs is decaying to two photons more often than in the Standard Model might be basing their result on a combination of a statistical fluke and a perhaps underestimated theoretical uncertainty.  [The very tenuous connections between current results on the Higgs and any particular variant of supersymmetry, which for some bizarre reason were reported in the press, are completely untrustworthy at this point; this is a case of seeing what you want to see long before the evidence is clear.]  We need both more data (to reduce the statistical uncertainties) and a careful look at the theoretical predictions; and then what we really need even more is a measurement of the ratio of the two-photon rate to the four-lepton rate, which can be calculated with higher precision, and can distinguish the two possibilities… assuming that the deviation is even still there when more data is gathered.
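The reason the two-photon to four-lepton ratio is so useful can be made concrete with standard first-order error propagation: an uncertainty common to both channels, such as the Higgs production rate, cancels in the ratio, and only the channel-specific uncertainties survive, combined in quadrature.  A sketch, with invented percentages:

```python
import math

def ratio_relative_error(rel_err_a, rel_err_b):
    """Relative uncertainty of the ratio A/B, for independent fractional
    errors on A and B (standard first-order error propagation)."""
    return math.sqrt(rel_err_a**2 + rel_err_b**2)

# Suppose (hypothetically) each rate carries a 15% production uncertainty,
# common to both channels, plus a 10% channel-specific uncertainty.
per_channel = math.sqrt(0.15**2 + 0.10**2)      # ~18% on either rate alone
in_ratio    = ratio_relative_error(0.10, 0.10)  # ~14%: the common 15% cancels
print(per_channel, in_ratio)
```

This is why a ratio measurement can distinguish “more Higgses produced” from “more decays to photons”: the former shifts both rates together and drops out of the ratio, while the latter shows up directly.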

Second, there is a deviation downward at CMS in the data used to look for Higgs particles produced in the p p → q q H process (vector boson fusion, or VBF) [or p p → H g g; see this post for more discussion of this subtlety] and decaying to a tau lepton/anti-lepton pair.  ATLAS sees a deviation downward also, but their result currently has less data and is not specifically focused on the VBF region.  Notice that CMS does not see a downward deviation for Higgs particles decaying to taus and produced via non-VBF processes (or, more precisely, in events outside the VBF selection region of the data)!  But the deviation at CMS for VBF production is quite substantial; the relevant black dot lies at -2 times the Standard Model Higgs hypothesis, i.e. it says that the data is consistent with the Standard Model without the Higgs minus twice what we’d expect from the Higgs.  We certainly aren’t producing negative numbers of Higgs particles!

So while the data is not particularly consistent with the Standard Model including the Standard Model Higgs particle signal, the data is also not particularly consistent with the Standard Model without the Higgs.  This is a warning sign that we’d better look at the data more closely; the statistical significance of the negative result may be a misleading indicator.  [Blindly taking statistical significance as the only quantity of importance in evaluating one’s confidence in a scientific result is a bit like using test scores (such as an SAT score in the U.S.) as your only measure for evaluating how promising students are, or the number of publications or an “h-index” for evaluating the quality of faculty; it may make you feel good to have a quantitative comparative test, but it is not going to give you reliable results if it’s the only thing you consider.]

CMS data showing the search for Higgs decaying to taus from the vector-boson-fusion process (see text). A tau itself will decay to neutrinos plus an electron, a muon, or hadrons. The four plots show data from four classes of events that show signs of containing a tau lepton/anti-lepton pair, where the tau and anti-tau decay in various ways. Each plot shows the best estimate of the mass of the Higgs candidate as reconstructed from the observed particles and the missing momentum (ostensibly from the neutrinos). In each plot except that at lower right, the large yellow peak is from Z particles decaying to a tau lepton/anti-lepton pair. The prediction of the Standard Model Higgs hypothesis, times five (!!!), is shown as the white bars.

In the figure, you can see the data for the search at CMS for Higgs particles produced via VBF and decaying to taus.  The backgrounds are in color; the signal of a Standard Model Higgs, multiplied by five, is shown in the white bars; in short, the signal is tiny compared to the background!

The plot at lower right looks for collisions in which the tau and anti-tau decay to a muon and anti-muon; the background, in blue, is huge relative to the signal, dominated by Z particles decaying to muon and anti-muon, and there’s no way to see anything significant here.

The other three plots show a much smaller background; in these cases, where the tau and anti-tau decay in different ways, the background has a big bump that comes from Z particles that decay to tau and anti-tau.  You notice that the Higgs signal is not itself a bump; it is a shoulder on the Z peak.  And therefore, any mis-measurement of the Z peak can lead to a problem with the Higgs measurement.  Is there any hint of such an issue?  Yes: In all three plots, especially those at upper right and at lower left, we see excess events at masses below the peak and a deficit of events above the peak (where the small number of Higgs events would be expected), as though the masses of the events that should be in the Z peak are being slightly underestimated.

How are these masses obtained?  In each case, the energies and momenta of detected particles are combined with the “missing transverse momentum” (the measured momentum imbalance in the event, ostensibly due to undetected particles but also receiving contributions from mismeasurements of the detected particles) to form the mass.  And a Higgs from VBF production will be accompanied by jets (from high-energy quarks) that appear at relatively small angles from the beampipe, in regions where measurements are potentially subject to larger sources of error.  So — would I personally be surprised if this preliminary and somewhat hurried result (remember how little time they had between the end of data-taking in mid-June and the presentation of the result in early July) had a minor problem with the mass measurement that has led to a small distortion of the background, one that is currently being interpreted as the absence of a Higgs signal in this channel?  No.
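The kind of mass reconstruction described above, combining the visible decay products with the missing transverse momentum, can be sketched as follows.  Real di-tau analyses use considerably more sophisticated estimators; this toy version just forms an invariant mass from four-vectors (E, px, py, pz), with all the numbers invented for illustration.

```python
import math

def invariant_mass(vectors):
    """Invariant mass of a set of four-vectors (E, px, py, pz), in GeV."""
    E  = sum(v[0] for v in vectors)
    px = sum(v[1] for v in vectors)
    py = sum(v[2] for v in vectors)
    pz = sum(v[3] for v in vectors)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Visible tau decay products (hypothetical values, GeV):
visible = [(60.0, 30.0, 40.0, 30.0), (45.0, -20.0, -25.0, 30.0)]

# The missing transverse momentum, standing in for the neutrinos, is treated
# as a massless particle with pz = 0, since its longitudinal momentum is
# simply not measured.
met_px, met_py = -5.0, -10.0
met = (math.hypot(met_px, met_py), met_px, met_py, 0.0)

mass = invariant_mass(visible + [met])
```

Because the neutrinos’ longitudinal momentum is dropped and the missing momentum absorbs any mismeasurement of the detected particles, a small systematic shift in the inputs shifts the whole reconstructed mass distribution, which is exactly the kind of distortion worried about above.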

Summing up and looking ahead

All of this is to say that we do see some deviations from the Standard Model Higgs hypothesis, but

  • they aren’t even that statistically significant yet
  • even if they were, there are reasons to be concerned about uncertainties from other sources than just statistics

And so, for now, until we have a lot more data (and we should have a chunk of additional results from ATLAS soon, and from both experiments in the fall and then in the winter or spring, before the LHC takes a hiatus for most of 2013 and most or all of 2014 for repairs, adjustments, and higher collision energy) all we can say is what I wrote in violet above; nothing rules out the Standard Model Higgs; any theory that predicted that a Higgs would be found that deviated wildly from the simplest Higgs is now dead; but many other theories remain on the table.  Specifically, many theories (including various theories with multiple Higgs particles, theories with accessible “hidden” particles, some variants of supersymmetry models, assorted little Higgs models, and more) have the property that the Higgs particle (or one of several Higgs particles) will rather closely resemble the Standard Model Higgs, too closely to be distinguished with current data.  Many of these theories remain viable, for now, and only more and more precise measurements will gradually rule them out, until either the Standard Model Higgs hypothesis finally fails, or we reach the point many years from now, when LHC measurements reach their maximal possible precision, with the Standard Model Higgs hypothesis still intact.

18 responses to “Higgs Discovery: Is it a Simplest Higgs?”

  2. Thanks so much for the great summary. We’re all extremely impatient to know more :)

    You wrote: “Specifically, many theories (including various theories with multiple Higgs particles, theories with accessible “hidden” particles, some variants of supersymmetry models, assorted little Higgs models, and more) have the property that the Higgs particle (or one of several Higgs particles) will rather closely resemble the Standard Model Higgs, too closely to be distinguished with current data.”

    Purely out of curiosity, I’d love an article someday summarizing the various options and how they could or could not be ruled out depending on what is found about the Higgs boson. I’ve read about supersymmetry on this site and others, but I’m less familiar with the others, especially Little Higgs models. (I read the summary on Wikipedia, but that was totally incomprehensible to me.) I know this is a tall order…just thought I’d suggest it :) Thanks again!

  5. Has the fermiophobic Higgs been ruled out yet by the latest data? Am I right to think a fermiophobic Higgs is a model where the Higgs doesn’t couple directly to the fermions, so that the masses of the fermions aren’t generated by the Higgs?

  6. If the energy in the collision is greater than the vev (246 GeV), would one expect that during the collision the Higgs mechanism would be effectively switched off and the particles would behave as if they were massless?

    If that was true would one expect contributions from maybe unknown particles that would otherwise have been too massive to participate?

    • Yes and no.

      The symmetries among the W particle, Z particle and photon become obvious again when the energy is large compared to 246 GeV, but that means going up to around 1 TeV of energy per particle, or more. BUT the mechanism is not switched off, it just becomes less important for some physical questions, while remaining crucial for others.

      For example, the processes quark + anti-quark → two photons, or two Z’s, or W+ W-, all become very similar at high energy. However, the W and Z remain massive and still decay in a trillionth of a trillionth of a second, no matter how much energy they have, while the photon is always massless and stable.

  7. Doesn’t the Higgs mass itself contradict the SM? AFAIK, in the SM, a Higgs lighter than 130 GeV makes the vacuum unstable.

    • 125 GeV for the Higgs mass, within the context of the SM (Standard Model), makes the universe metastable. But that’s fine. Metastable can mean a lifetime of a billion billion billion years… plenty long enough. Unstable is several GeV lower than 125, I forget just now the precise number that people are quoting. And moreover, there is slop in this number. One new phenomenon could change the calculation enough to make it stable. Add a little additional physics at 10^14 GeV and no one will be the wiser (for the moment), but our universe may then be stable. And simply knowing that some kind of unspecified physics has to be added somewhere between 10^5 GeV and 10^18 GeV isn’t really very useful information. Especially since we already know that the Standard Model leaves many questions unanswered (why 3 generations of matter particles? why do they have the specific pattern of masses that we see? why isn’t there CP violation in the strong nuclear interaction? etc. etc. etc.)

  10. the Siliconopolitan

    leaving its many unanswered questions (what determines the strengths of the forces, the masses of the particles, and the detailed patterns of particle decays) for future generations to ponder.

    Are these necessarily questions that have answers? Are there any compelling arguments for why the masses, say, can’t just be free parameters?
