Matt Strassler 12/11/11 (Updated now with figures. This was written while suffering from jet lag, so it might not yet be as clear as I’d like. Let me know what you can’t follow.)
In this article, I’m going to explain how we search for the Standard Model Higgs particle (the simplest possible type of Higgs particle that might be present in nature) and, if we find a candidate for this particle, how we check whether it really is of Standard Model type, or whether it is a look-alike that is in fact more complicated.
In particular, I’m going to combine what I have told you about the production of the Standard Model Higgs particle at the Large Hadron Collider [LHC] with what I have told you about its decays. These can be mixed and matched: any of the possible production processes that make a Higgs can be followed by any of the possible decays of the Higgs. Unfortunately, quite a few of the resulting combinations are impractical to measure at the LHC, even with the giant data sets that will be available a decade from now. But from the many that can be measured over the coming decade, a lot of information about the Higgs can be obtained, and the question of whether a candidate Higgs particle really is or is not a Standard Model Higgs particle can be answered with reasonable confidence.
First, of course, we have to find the darn thing, if it is there. Since lightweight Standard Model Higgs particles (with mass-energy [E = mc²] of 115-141 GeV) are in the spotlight right now, let’s focus on how to find them.
The search for the lightweight Standard Model Higgs particle (including crucial subtleties!)
Although there are five production processes for a Standard Model Higgs particle, we simply don’t have enough data yet (and won’t anytime soon) to pick out the four smaller ones. Right now (as long as the Higgs is really of Standard Model type — otherwise this might not be true) the dominant way that the Higgs is made at the LHC is via two gluons colliding to make a Higgs particle, which proceeds via an indirect effect involving top quark/antiquark “virtual particles”. Both ATLAS and CMS have made many tens of thousands of Higgs particles already. But as I described here, most of them decay in ways that cannot be detected, at least not easily! Only the decays of the Higgs to two photons, to two W particles, and to two Z particles can be reliably detected with the current amount of data, and even then only when the W’s or Z’s themselves decay in a favorable way. Certain other decay modes are more common, but there is so much background to these signals that they cannot be observed, at least not with the current amount of data.
In a few moments, I will point you to an article where I went into much more pedagogical detail, but let me first summarize the pros and cons of each of the three main processes that are used in the search for a lightweight Standard Model Higgs particle (in the 115-141 GeV range); just after the list, I’ll sketch roughly how many events these decay fractions translate into.
1. Higgs –> ZZ –> two charged lepton/anti-lepton pairs (specifically the leptons in question are electrons or muons, not taus, which are much more difficult to handle):
Pros:
- The signal is extremely clean: because the leptons and anti-leptons can be measured precisely, the mass that a Higgs particle would have had to have, if it produced the lepton/anti-lepton pairs in its decay, can be determined precisely, to within 3 or 4 GeV.
- Meanwhile the background is very small, still less than one event for each 2 GeV-wide Higgs mass bin as of the end of 2011.
Cons:
- For smaller Higgs masses, the probability for this process becomes very small. For a 125 GeV Standard Model Higgs particle, only about 1 in 12,000 Higgs particles decays this way.
- Experimentally, the probability of detecting all four leptons and anti-leptons is not that high, so the number of Higgs particles detected this way, for each experiment, might currently (end-2011) be zero, one, or two, and probably not more.
Still, although not convincing, two events at the same Higgs mass (as of end-2011) would be noteworthy, because the background to this process is so small.
2. Higgs –> two photons:
Pros:
- The signal is extremely clean: because the photons can be measured precisely, the mass that a Higgs particle would have had to have, if the photons were produced in a Higgs decay, can be known precisely, to within 1 or 2 GeV.
- The rate is not quite so small; about 1 in 500 Higgs particles decays this way.
- The background does not need to be calculated; it can be measured.
Cons:
- The background is much larger than for the previous case, perhaps 400-500 events (as of end-2011) for each 1 GeV-wide Higgs mass bin.
- The larger size of the background implies that a lot of two photon events from Higgs particles must be observed before they will stand out from the crowd.
Generally, the lighter the Standard Model Higgs particle, the more important is the two-photon decay for finding it. But to find the Higgs particle in the two-photon measurement requires a lot of data; we have just enough now to start having a chance of seeing it. It was easier to look for the Standard Model Higgs in the somewhat heavier range, where the decays to two W’s and to two Z’s were more common, and that’s why this range was excluded first.
3. Higgs –> W W –> lepton/anti-neutrino + neutrino/anti-lepton (where again these leptons include only electrons and muons, not taus):
Pros:
- The signal is relatively large: 1 in 100 Standard Model Higgs particles with a mass of 125 GeV would decay this way.
- It is much larger for heavier Higgs particles, growing to nearly 1 in 20 for a Higgs mass of 160 GeV, and it played the dominant role in ruling out the Standard Model Higgs from 141 to 190 GeV.
Cons:
- The presence of the two neutrinos means that some amount of information about each collision is unmeasurable, and the mass of a Higgs that might have been responsible for what is observed in that collision cannot be uniquely determined.
- The background is large, has various components, and cannot be easily measured.
- The theoretical calculation of some part of the background is not simple, and it is difficult and controversial to determine just how uncertain the calculation is.
- Another (smaller) part of the background is either difficult or impossible to calculate and must be measured.
- Determining the missing momentum of the neutrinos requires precise measurements of everything else in the collision, not just the charged lepton and anti-lepton. There needs to be considerable confidence, on the part of the experimenters, that they are doing this correctly.
- The signal appears as a small excess above the background and (unlike the other two cases just described) does not have the distinctive shape of a narrow peak above a simple background.
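To get a feeling for what those decay fractions mean in practice, here is a rough back-of-the-envelope sketch of my own (not taken from the article): the production cross-section and integrated luminosity below are assumed round numbers, chosen only to be roughly consistent with the “many tens of thousands of Higgs particles” mentioned earlier, and no detection efficiencies or selection cuts are included.

```python
# Rough event-count estimate for a 125 GeV Standard Model Higgs in the 2011 LHC data.
# The cross-section and luminosity are ASSUMED round numbers for illustration,
# not values quoted in the article; the decay fractions are the ones quoted above.

sigma_ggH_pb = 15.0          # assumed gluon-gluon -> Higgs cross-section, in picobarns
luminosity_inv_fb = 5.0      # assumed 2011 data set per experiment, in inverse femtobarns

n_higgs = sigma_ggH_pb * 1000 * luminosity_inv_fb   # 1 pb = 1000 fb
print(f"Higgs particles produced (per experiment): ~{n_higgs:,.0f}")

# Decay fractions quoted in the text for a 125 GeV Standard Model Higgs:
fractions = {
    "H -> ZZ -> 4 charged leptons": 1 / 12000,
    "H -> two photons": 1 / 500,
    "H -> WW -> lepton + anti-lepton + neutrinos": 1 / 100,
}

for channel, frac in fractions.items():
    print(f"{channel}: ~{n_higgs * frac:,.0f} decays (before efficiencies and cuts)")
```

Detection efficiencies and selection cuts shrink these numbers further, which is why the four-lepton channel yields at most a couple of events and why the two-photon channel needs a lot of data before a peak can stand out above hundreds of background events per GeV.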
I want to emphasize the pros and cons of the Higgs –> WW decay, which played a very big role this summer. Many of you may know there were hints at the July conference in Grenoble that a Standard Model-like Higgs might be present in the 140-150 GeV range. But as I explained in an article later that week, these hints rested on uncertain ground — precisely because they relied mainly on the WW measurement. I discussed the three crucial processes (decays to photons, ZZ and WW) in a lot of detail, showing how they are measured experimentally and what the data would look like once collected, and I explained why I would not trust the hints from Higgs –> WW as they stood back then until signs of the Higgs showed up in either or both of the other two processes.
And indeed, I was right not to trust them. As happens much more often than not, those hints went away — though why they went away isn’t entirely clear. [Was the hint just a statistical fluctuation that CMS and especially ATLAS both happened to have? Or was there in fact an error in how various WW backgrounds were being calculated or measured, and that affected the later analyses in Mumbai and in Paris? This issue has been controversial (my Rutgers colleagues even published a paper pointing out a plausible problem) and the details are not public; perhaps we will never know the full story.]
But in any case, Higgs –> WW was a problem this summer, and there is a possibility it is going to be a problem on December 13th for a similar reason. Fortunately, even if this is the case, these problems will eventually go away once we have more data. If the Higgs particle really is there at 125 GeV or something like that, we will reach the point in a few months where evidence for the Higgs particle is so strong in the processes H –> two photons and/or H –> two lepton/anti-lepton pairs that we don’t need to use any information from the WW decay mode anymore. I suspect we won’t get to that point until the middle of 2012, but perhaps the December 13th presentations will surprise me.
More technically, for those who have scientific backgrounds: the Higgs –> WW decay suffers from substantial theoretical and systematic uncertainties. The other two processes do not; they have mainly statistical uncertainties. Statistical analysis of the significance of a signal is straightforward only when the dominant uncertainties are statistical, because statistical uncertainties are essentially random, while systematic and theoretical uncertainties generally are not.
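As a toy numerical illustration of that last point (my own sketch, with made-up numbers, using a common back-of-the-envelope approximation rather than the experiments’ actual statistical machinery): even a modest systematic uncertainty on the background quickly erodes the apparent significance of a small excess.

```python
from math import sqrt

# Toy counting experiment: S signal events on top of B expected background events.
S, B = 100.0, 1000.0   # made-up numbers, purely for illustration

# Statistics-only estimate of the significance of the excess:
z_stat_only = S / sqrt(B)

# Same excess, but with a fractional systematic uncertainty on the background,
# added in quadrature (a standard rough approximation, not the experiments' method):
sys_frac = 0.05
z_with_sys = S / sqrt(B + (sys_frac * B) ** 2)

print(f"significance, statistical only : {z_stat_only:.1f} sigma")
print(f"significance, with 5% syst.    : {z_with_sys:.1f} sigma")
```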
So I encourage you to read that July article, in order to learn some important lessons from the summer’s events — ones that still might be relevant this coming week, and will certainly be on my mind as I try to understand the experiments’ presentations on Tuesday.
How we study the Higgs particle once we have found it
Once we’ve got a candidate for a Higgs particle, the story has just begun. The only things we will know about the object are its mass and a rough measure of its production rate times its decay probability, for the one or two processes that we’ve observed initially. There’s a lot more we need to learn, and a lot more we can learn, from the LHC.
How do we even know this new particle actually is a Higgs particle of some type (not necessarily the Standard Model version of the Higgs particle)?
The thing that makes a particle a Higgs particle, by definition, is that the field in which it is a ripple participates in giving mass to the W and Z particles. (Actually the definitions are a little slippery here; what one really should say, to avoid confusion, is that the Higgs “sector” might consist of several closely related particles, all of which are typically called Higgs particles, and at least one of which must have the property I just described.) If a field gives the W and Z particles all or part of their masses, then the field’s particle will have a strong direct interaction with two W particles and with two Z particles. And this in turn automatically leads to three large effects:
- the decays H –> WW and H –> ZZ,
- the production processes quark + antiquark –> W H and quark + antiquark –> Z H, and
- the production process quark + quark –> quark + quark + H.
So if we can observe and measure any of these processes for a new particle, that will prove that the new particle is a Higgs particle.
Why must this interaction be present? Schematically, it is because of how the Higgs mechanism works: before the Higgs field has a non-zero value, the world allows an interaction of the form H H W W and H H Z Z, where “H” here is the Higgs field when its average value is zero, and “W” and “Z” are the W and Z fields with corresponding massless W and Z particles. This interaction allows two Higgs particles to collide to make two W particles, but there is no interaction involving just one Higgs particle at a time. But once the Higgs field has a non-zero average constant value throughout all of space and time (usually called “v”, historically), then we are motivated to write H = v + h, where h represents the difference of H from its average value v; the ripples in h are the Higgs particles. When we do this we find
H H Z Z –> v² Z Z + 2 v h Z Z + h h Z Z
The first term, v² Z Z (a constant times Z Z), shifts the energy of a Z particle that is at rest — in other words, it provides what we call mass-energy! That’s the Higgs mechanism! This is how the Higgs field gives the Z particle a mass. [I promise to explain this point better elsewhere, but not now.]
The second term, 2 v h Z Z, allows a single Higgs particle to turn into two Z particles (or a Z particle and a Z virtual particle, as described here). It also allows a virtual Z particle to turn into a Z particle and a Higgs particle (as described here), or two virtual Z particles to turn into a Higgs particle (also described here). Those are the three processes mentioned in the bullet points above.
The third term, h h Z Z, is similar to what we had when v was zero, and plays almost no role in the near term at the LHC.
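For readers who like to see the substitution written out explicitly, here is the algebra as a single line (a minimal restatement of what was just said, nothing new added):

```latex
% Write the Higgs field as its constant average value plus the ripple field: H = v + h.
% Substituting into the original interaction term gives the three pieces discussed above:
%   first term  -> mass term for the Z,
%   second term -> one Higgs interacting with two Z's,
%   third term  -> two Higgs interacting with two Z's.
H\,H\,Z\,Z \;=\; (v+h)(v+h)\,Z\,Z \;=\; v^{2}\,Z\,Z \;+\; 2\,v\,h\,Z\,Z \;+\; h\,h\,Z\,Z
```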
[Actually, indirect interactions, such as the one that leads to Higgs –> two photons, can also cause Higgs –> WW. But an indirect interaction is much weaker than the direct interaction that I’ve just described. A non-Higgs particle would decay to W particles with about the same rate as it decays to two photons, or less, while a Higgs particle is much more likely to decay to two W particles than it is to decay to two photons.]
Observations that can detect the decays H –> WW and H –> ZZ and verify they are consistent with a pretty strong Higgs-W-W and Higgs-Z-Z interaction will occur very soon after the discovery of the Higgs particle, or maybe even during the discovery, especially if the Higgs particle is heavier than about 125 GeV. The decay H –> W W can be observed right away (though there may be controversy about it, because of the difficulties mentioned above about convincing oneself that one understands all the uncertainties in this measurement). Around the same time [the LHC experiments improved their methods!], H –> Z Z will be observed (and this will be clean and convincing, but may require a lot of data, especially for a lighter Higgs). Convincing observation of either of these decays will be incontrovertible evidence that the particle that we have discovered is a Higgs particle of some type, a ripple in a Higgs field.
Once we are sure we have a Higgs particle, there is a whole set of questions to be asked about the strength of the interactions of this particle with the other known particles. Let’s call the strength of the interaction of the Higgs with W particles gW, the strength of its interaction with bottom quarks gb, etc. We want to know whether the Higgs behaves exactly like the Standard Model Higgs
- as far as the W and Z particles are concerned
- as far as top quarks are concerned
- as far as bottom quarks are concerned
- as far as tau leptons are concerned
These boil down to the questions of whether gW, gZ, gt, gb, gτ are the numbers predicted in the Standard Model, which in each case are the particle’s mass divided by v (up to a square root of 2). We’d ask more questions about even lighter particles if we could, but the answers are probably out of reach at the LHC.
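As a rough numerical illustration of “the particle’s mass divided by v, up to a square root of 2” (my own sketch; the value of v and the fermion masses below are standard numbers, not quoted in this article):

```python
from math import sqrt

v = 246.0   # the Higgs field's average value, in GeV (standard value, assumed here)

# Approximate masses, in GeV, of the heaviest fermions the LHC can probe this way:
masses = {"top quark": 173.0, "bottom quark": 4.2, "tau lepton": 1.78}

for name, m in masses.items():
    g = sqrt(2) * m / v   # Standard Model prediction for the Higgs-fermion interaction strength
    print(f"g({name}) ~ {g:.3f}")
```

The point is simply that in the Standard Model each of these interaction strengths is fixed once the particle’s mass is known; any measured deviation from these numbers would be a sign of something beyond the Standard Model.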
Also, we will want to know whether there are any signs of new particles participating in the indirect interactions between the Higgs and gluons and/or between the Higgs and photons. In other words, are the interaction strengths gg and gγ (γ standing for photon) what the Standard Model would predict? Let’s also recall that in the Standard Model gg is related to gt, and gγ is related to gt and gW, through the indirect effects that generate gg and gγ in the first place.
Unfortunately, it’s not so easy to answer these questions right away, for the following reason. What we measure is always a production rate for a particular process multiplied by the probability for a particular decay; the production rate is related to the strength of the Higgs interaction with one class of particles, while the decay rate is related to the strength of the Higgs interaction with a second (possibly the same) class of particles.
So for example, when we measure the rate for g g –> H –> γγ, we are measuring something proportional to (gg gγ)². Similarly, the rate for g g –> H –> WW is proportional to (gg gW)². There’s no way to easily measure the individual interaction strengths separately.
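Here is a small numerical illustration of that limitation (my own toy sketch, following the simplified proportionality just stated and ignoring subtleties such as the Higgs particle’s total width; all the numbers are arbitrary): rescaling gg up and gγ down by the same factor leaves the measured g g –> H –> γγ rate unchanged, whereas the ratio of two rates that share the same production process does isolate a ratio of decay couplings.

```python
# Toy illustration: measured rates constrain products and ratios of couplings,
# not the individual couplings themselves.  All numbers are arbitrary.

g_g, g_gamma, g_W = 1.0, 0.8, 1.2      # made-up "true" interaction strengths

def rate(production_coupling, decay_coupling):
    # Simplified proportionality from the text: rate ~ (g_production * g_decay)^2
    return (production_coupling * decay_coupling) ** 2

print(rate(g_g, g_gamma))               # g g -> H -> gamma gamma
print(rate(2 * g_g, g_gamma / 2))       # identical rate: the individual couplings are not pinned down

# But a ratio of two rates with the same production process cancels g_g entirely:
print(rate(g_g, g_gamma) / rate(g_g, g_W))   # equals (g_gamma / g_W)**2
print((g_gamma / g_W) ** 2)
```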
But that said, if the rates for g g –> H –> γγ, g g –> H –> W W and g g –> H –> Z Z come out as predicted in the Standard Model (and we’ll already have a very rough idea of whether this is the case by the end of 2012, if the Higgs is found soon), then that will give us some confidence that none of the couplings gW, gZ, gγ, gg can be very different from what is expected for the Standard Model Higgs particle — which in turn will give confidence that gt, which contributes to gγ and gg, is also what is predicted for the Standard Model Higgs.
To convince ourselves that gb and gτ are as predicted in the Standard Model will take longer, and will require measuring things like
- quark + quark –> quark + quark + H, with H decaying to τ+τ–, and
- up quark + down antiquark –> W+ H, with W+ decaying to a charged anti-lepton and a neutrino, and the Higgs decaying to a bottom quark/anti-quark pair
If instead it turns out that some of these measurements differ from the predictions of the Standard Model, not only will this difference tell us that we’re dealing with a Higgs more complicated than the Standard Model Higgs, and that the Standard Model needs significant modification, but also the details of the differences may give us important clues as to what changes are needed. For instance, if the rates for g g –> H –> γγ, g g –> H –> W W and g g –> H –> Z Z are all in the predicted proportion, but are uniformly smaller than what is predicted in the Standard Model, that could suggest that there are two or more Higgs particles, and the field corresponding to the particle we are measuring is giving the known particles only a part of their masses. If instead the ratio of two-photon events to WW and ZZ events differs from the expectation in the Standard Model, then that would suggest that there are unknown particles contributing to the indirect interaction between photons and the Higgs particle. If the ratio of gb to gτ is not as predicted, that could either suggest that there is one Higgs field giving mass to the quarks and a different Higgs field giving mass to the leptons, or it could suggest that there are new particles which are indirectly affecting the interaction of Higgs particles with bottom quarks and/or tau leptons. Measuring gt using the fifth production process (gluon + gluon –> top quark + top anti-quark + Higgs) would help distinguish those two possibilities. In short, careful measurements of the interaction strengths of the Higgs particle with other particles will provide key tests of the Standard Model Higgs hypothesis, and provide considerable guidance should any of those tests fail.
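To see why a uniform deficit points toward a shared-mass-giving scenario, here is a toy version of the logic (my own sketch, assuming every interaction strength of the observed particle is reduced by a common factor and that no new decay channels open up): production rates and partial decay rates each scale as the coupling squared, the total decay rate scales the same way, so the decay probabilities are unchanged and every measured rate drops by the same overall factor.

```python
# Toy model: scale every Higgs coupling by a common factor eps < 1
# (as might happen if this field supplies only part of the known particles' masses).
eps = 0.8

# Made-up Standard-Model-like partial decay rates, in arbitrary units:
partial_widths_SM = {"WW": 0.9, "ZZ": 0.1, "two photons": 0.01}
production_SM = 1.0
total_SM = sum(partial_widths_SM.values())

for channel, width in partial_widths_SM.items():
    rate_SM = production_SM * width / total_SM                              # production x decay probability
    rate_scaled = (eps**2 * production_SM) * (eps**2 * width) / (eps**2 * total_SM)
    print(f"{channel:12s} rate / SM prediction = {rate_scaled / rate_SM:.2f}")

# Every channel comes out reduced by the same factor eps**2 = 0.64,
# i.e. "in the predicted proportion, but uniformly smaller".
```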
In the end we won’t know for absolutely sure what’s going on at the LHC until we make many measurements not only of Higgs particles but of many other processes, including precision studies of top quarks and searches for many new types of particles that might or might not be present in nature. Meanwhile, other non-LHC experiments may also weigh in with important discoveries or tests of the Standard Model. Ultimately, we’ll have to combine all the voluminous information we have in order to reach the most complete conclusions possible. But testing whether any Higgs particle we find is of Standard Model-type will certainly be one of the most important things the LHC will be doing over the coming years.
16 Responses
Could you explain further what you meant when you wrote “due to an interesting and accidental cancellation it decays to electron-positron or muon-antimuon only 6% of the time” in your answer to Chris Austin?
And in the text you wrote “I promise to explain this point better elsewhere, but not now”; if that post has been written, could you help me find it?
Thanks and regards
Since we’ve never observed a fundamental scalar before, presumably it will be important to check that the new particle is a spin 0 boson. I assume we can just look at the angular dependence of the Higgs decay products. What’s the best process to do that for a 125 GeV Higgs and how much data is needed?
Cheers, Robert
Good question. I am sure two photons is best for a direct measurement (ZZ to leptons/antileptons is too rare, WW too indistinct and with too much background); how long it takes I don’t remember. (Of course you should remember also that we won’t *know* that it is a fundamental scalar! That’s an assumption that data will have to verify over time.) More convincing will be the decays to W’s and Z’s; if those are large and in the right ratio, it will be hard to argue this is not a Higgs particle of some type.
Actually, the H->WW->ll searches rely on the fact that the H is spin 0, to separate the signal from the SM WW background (which is a combination of spin 1/2 and 1 propagators). The spin 0 propagator causes the two leptons to be aligned with each other (in the plane transverse to the beam). So, if we see a signal in the WW->ll channel compatible with the Higgs production x decay rate to WW, this is already strong evidence that the propagator is spin 0.
Yes, excellent point! Thanks for the reminder… though this will require that we fully convince ourselves that we have good control of the WW measurement’s backgrounds. I guess we will have to see where the Higgs mass ends up, and once we know it, see how effectively we can work to reduce the systematic errors. I don’t know how bad it gets in the 115-130 GeV mass range.
Thanks a lot for all these explanations. Is there a simple explanation for why the branching ratio for H -> Z Z -> e+ e- e+ e-, 1 in 48,000 for a 125 GeV Higgs if I understand correctly what you wrote, is so small compared to the branching ratio for H -> W+ W- -> e+ nu_e e- nubar_e, 1 in 400 for a 125 GeV Higgs? Is it just the higher Z mass?
I think you might be missing a g in the second last paragraph, in g g -> H -> gamma gamma.
Thanks for catching the missin g.
Four effects combine to explain the difference.
1) The Z is heavier than the W, yes, so for a lightweight Higgs of order 120 GeV, its virtual particle is considerably harder to make than the W’s virtual particle.
2) The W has a stronger coupling to the Higgs, by a factor of 2 in the decay rate.
3) The W decays to an electron or muon and the corresponding anti-neutrino about 22% of the time (which is basically just 2/9), whereas the Z has much more complicated formulas for its interactions with other particles, and due to an interesting and accidental cancellation it decays to electron-positron or muon-antimuon only 6% of the time. That ratio of 3ish gets squared (since we have two W’s and two Z’s decaying) so that’s another factor of 10.
4) If there are one lepton and one antilepton in your event, your chance of detecting both of them is probably about 60%. Sometimes one of them has too low energy, or heads out too close to the beampipe, or happens to land inside a jet. When you have two leptons and two antileptons, that gets squared, and actually a little worse than that. So that’s another factor of 2 or so.
That said, I’m a little worried I overcounted something by a factor of 2 somewhere. Not sure right now, and too tired.
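Putting the quoted numbers together (my own arithmetic, not part of the reply above, and subject to the same possible factor-of-2 caveat):

```python
# Rough bookkeeping of the factors quoted above; no new physics input.
ratio_quoted = 48000 / 400            # H -> ZZ -> 4 leptons vs H -> WW -> 2 leptons + neutrinos
leptonic = (0.22 / 0.06) ** 2         # effect (3): leptonic branching fractions, squared
coupling = 2.0                        # effect (2): stronger Higgs-W coupling
remainder = ratio_quoted / (leptonic * coupling)
print(f"quoted ratio: {ratio_quoted:.0f}, accounted for by effects (2) and (3): {leptonic * coupling:.0f}, "
      f"left over for the heavier Z, effect (1): ~{remainder:.1f}")
# Effect (4), detection efficiency, affects what is observed but not the branching fractions themselves.
```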