Did the LHC Just Rule Out String Theory?!

Over the weekend, someone said to me, breathlessly, that they’d read that “Results from the Large Hadron Collider [LHC] have blown string theory out of the water.”

Good Heavens! I replied. Who fed you that line of rubbish?!

Well, I’m not sure how this silliness got started, but it’s completely wrong. Just in case some of you or your friends have heard the same thing, let me explain why it’s wrong.

First, a distinction — one that is rarely made, especially by the more rabid bloggers, both string lovers and string haters. [Both types mystify me.] String theory has several applications, and you need to keep them straight. Let me mention two.

  1. Application number 1: this is the one you’ve heard about. String theory is a candidate (and only a candidate) for a “theory of everything” — a silly term, if you ask me, for what it really means is “a theory of all of nature’s particles, forces and space-time”. It’s not a theory of genetics or a theory of cooking or a theory of how to write a good blog post. But it’s still a pretty cool thing. This is the theory (i.e. a set of consistent equations and methods that describes relativistic quantum strings) that’s supposed to explain quantum gravity and all of particle physics, and if it succeeded, that would be fantastic.
  2. Application number 2: String theory can serve as a tool. You can use its mathematics, and/or the physical insights that you can gain by thinking about and calculating how strings behave, to solve or partially solve problems in other subjects. (Here’s an example.) These subjects include quantum field theory and advanced mathematics, and if you work in these areas, you may really not care much about application number 1. Even if application number 1 were ruled out by data, we’d still continue to use string theory as a tool. Consider this: if you grew up learning that a hammer was a religious idol to be worshipped, and later you decided you didn’t believe that anymore, would you throw out all your hammers? No. They’re still useful even if you don’t worship them.

BUT: today we are talking about Application Number 1: string theory as a candidate theory of all particles, etc.

A Discrepancy to Keep an Eye On

Today (as I sit in a waiting room for jury service) I’ll draw your attention to something that has been quite rare at the Large Hadron Collider [LHC]: a notable discrepancy between prediction and data. (Too rare, in fact — when you make so many measurements, some of them should be discrepant; the one place we saw plenty of examples was in the search for and initial study of the Higgs particle.) It’s not big enough to declare a definite challenge to the Standard Model (the equations we use to describe the known particles and forces), but it’s one we’ll need to be watching… and you can bet there will be dozens of papers suggesting possible explanations for this discrepancy, if it is real.

The discrepancy has arisen in the search at the CMS experiment for “multileptons”: for proton-proton collisions in which at least three charged leptons — electrons, muons and (to a degree) taus — were produced. Such events are a good place to look for new phenomena: they are very rare in the Standard Model, but in the context of some speculative ideas (including the possibility of additional types of Higgs particles, or of superpartner particles from supersymmetry, or new light neutral particles that decay sometimes to lepton/anti-lepton pairs, etc.) they can be produced in the decays of some unknown type of particle.

Final Day of SEARCH 2013

Day 3 of the SEARCH workshop (see here for an introduction and overviews of Day 1 and Day 2) opened with my own talk, entitled “On The Frontier: Where New Physics May Be Hiding”. The issue I was addressing is this:

Even though dozens of different strategies have been used by the experimenters at ATLAS and CMS (the two general purpose experiments at the Large Hadron Collider [LHC]) to look for various types of new particles, there are still many questions that haven’t been asked and many aspects of the data that haven’t been studied. My goal was to point out a few of these unasked or incompletely asked questions, ones that I think are very important for ATLAS and CMS experts to investigate… both in the existing data and also in the data that the LHC will start producing, with a higher energy per proton-proton collision, in 2015.

I covered four topics — I’ll be a bit long-winded here, so just skip over this part if it bores you.

1. Non-Standard-Model (or “exotic”) Higgs Decays: a lightweight Higgs particle, such as the one we’ve recently discovered, is very sensitive to novel effects, and can reveal them by decaying in unexpected ways. One class of possibilities, studied by a very wide range of theorists over the past decade, is that the Higgs might decay to unknown lightweight particles (possibly related in some way to dark matter). I’ve written about these possible Higgs decays a lot (here, here, here, here, here, here and here). This was a big topic of mine at the last SEARCH workshop, and is related to the issue of data parking/delaying. In recent months, a bunch of young theorists (with some limited help and advice from me) have been working to write an overview article, going systematically through the most promising non-Standard-Model decay modes of the Higgs, and studying how easy or difficult it will be to measure them. Discoveries using the 2011-2012 data are certainly possible! And at least at CMS, the parked data is going to play an important role.

2. What Variants of “Natural” Supersymmetry (And Related Models) Are Still Allowed By ATLAS and CMS Searches? A natural variant of supersymmetry (see my discussion of “naturalness”=genericity here) is one in which the Higgs particle’s mass and the Higgs field’s value (and therefore the W and Z particles’ masses) wouldn’t change drastically if you were somehow to vary the masses of superpartner particles by small amounts. Such variants tend to have the superpartner particle of the Higgs (called the “Higgsino”) relatively light (a few hundred GeV/c² or below), the superpartner of the top quark (the “top squark”, with which the Higgs interacts very strongly) also relatively light, and the superpartner of the gluon (the “gluino”) up in the 1-2 TeV/c² range. If the gluino is heavier than 1.4 TeV/c² or so, then it is too heavy to have been produced during the 2011-2012 LHC run; for variants with such a heavy gluino, we may have to wait until 2015 and beyond to discover or rule them out. But it turns out that if the gluino is light enough (generally a bit above 1 TeV/c²), it is possible to make very general arguments, without resorting to the three assumptions that go into the most classic searches for supersymmetry, that almost all such natural and currently accessible variants are now ruled out. I say “almost” because there is at least one class of important exceptions for which the case is clearly not yet closed, and for which the gluino mass could be well below 1 TeV/c². [Research to completely characterize the situation is still in progress; I’m working on it with Rutgers faculty member David Shih and postdocs Yevgeny Kats and Jared Evans.] What we’ve learned is applicable beyond supersymmetry to certain other classes of speculative ideas.

3. Long-Lived Particles: In most LHC studies, it is assumed that any currently unknown particles produced in LHC collisions will decay in microscopic times to particles we know about. But it is also possible that one or more new types of particles will decay only after traveling a measurable distance (about 1 millimeter or greater) from the collision point. Searching for such “long-lived” particles (with lifetimes longer than a trillionth of a second!) is complicated; there are many cases to consider, a non-standard search strategy is almost always required, and sometimes specialized trigger strategies are needed. (A quick check of how a trillionth of a second translates into a measurable distance appears just after this list.) Until recently, only a few studies had been carried out, many with only 2011 data. A very important advance occurred very recently, however, when CMS produced a study, using the full 2011-2012 data set, looking for a long-lived particle that decays to two jets (or to anything that looks to the detector like two jets, which is a bit more general) after traveling up to a large fraction of a meter. The specialized trigger that was used requires about 300 GeV of energy or more to be produced in the proton-proton collision in the form of jets (or things that look like jets to the triggering system). This is too much for the search to detect a Higgs particle decaying to one or two long-lived particles, because a Higgs particle’s mass-energy [E=mc² energy] is only 125 GeV, and it is therefore rather rare for 300 GeV of energy in jets-et-al to be observed when a Higgs is produced. But in many speculative theories with long-lived particles, this amount of energy is easily obtained. As a result, this new CMS search clearly wipes out, at one stroke, many variants of a number of speculative models. It will take theorists a little while to fully understand the impact of this new search, but it will be big. Still, it’s by no means the final word. We need to push harder, improving and broadening the use of these methods, so that decays of the Higgs itself to long-lived particles can be searched for. This has been done already in a handful of cases (for example, if the long-lived particle decays not to jets but to a muon/anti-muon pair or an electron/positron pair, or if the long-lived particle travels several meters before it decays), and in some cases it is already possible to show that at most 1 in 100 to 1000 Higgs particles produce long-lived particles of this type. For some other cases, the triggers developed for the parked data may be crucial.

4. “Soft” Signals: A frontier that has never been explored, but which theorists have been talking about for some years, is one in which a high-energy process associated with a new particle is typically accompanied by an unusually large number of very low-energy particles (typically photons or hadrons with energy below a few GeV). The high-energy process is mimicked by certain common processes that occur in the Standard Model, and consequently the signal is drowned out, like a child’s voice in a crowded room. But the haze of a large number of low-energy particles that accompanies the signal is rare in the mimicking processes, so by keeping only those collisions that show something like this haze, it becomes possible to throw out the mimicking process most of the time, making the signal stand out — as though, in trying to find the child, one could identify a way to get most of the people to leave the room, reducing the noise enough for the child’s voice to be heard. [For experts: The most classic example of this situation arises in certain types of objects called "quirks", though perhaps there are other examples. For non-experts: I'll explain what quirks are some other time; it's a sophisticated story.]
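About the lifetime-to-distance conversion mentioned in point 3 above: a particle’s average flight distance is roughly γβcτ, its boost factor times the speed of light times its lifetime. Here’s the back-of-the-envelope arithmetic in Python; this is my own illustrative sketch, and the boost factor is an invented example value, since the real boost depends on how the particle is produced.

```python
# Back-of-the-envelope check: why a lifetime of a trillionth of a second
# corresponds to a measurable flight distance in a detector. The boost
# factor is an invented illustrative value, not taken from any model.
c = 3.0e8            # speed of light, in meters per second
tau = 1.0e-12        # proper lifetime: one trillionth of a second
gamma_beta = 3.0     # illustrative relativistic boost factor

mean_flight_m = gamma_beta * c * tau
print(f"mean flight distance ~ {mean_flight_m * 1e3:.1f} mm")  # ~0.9 mm
```

Even with a modest boost, a trillionth of a second corresponds to roughly a millimeter of flight, which is indeed measurable with modern tracking detectors.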

I was pleased that there was lively discussion on all of these four points; that’s essential for a good workshop.

After me there were talks by ATLAS expert Erez Etzion and CMS’s Steve Wurm, surveying a large number of searches for new particles and other phenomena by the two experiments. One new result that particularly caught my eye was a set of CMS searches for new very heavy particles that decay to pairs of W and/or Z particles.  The W and Z particles go flying outwards with tremendous energy, and form the kind of jet-like objects I mentioned yesterday in the context of Jesse Thaler’s talk on “jet substructure”.  This and a couple of other related measurements are reflective of our moving into a new era, in which detection of jet-like W and Z particles and jet-like top quarks has become part of the standard toolbox of a particle physicist.

The workshop concluded with three hour-long panel discussions:

  1. on the possible interplay between dark matter and LHC research (for instance: how production of “friends” of dark matter [i.e., particles that are somehow related to dark matter particles] may be easier to detect at the LHC than production of dark matter itself)
  2. on the highest priorities for the 2013-2014 shutdown period before the LHC restarts (for instance, conversations between theorists and experimentalists about the trigger strategies that should be used in the next LHC run)
  3. on what the opportunities of the 2015-2020 run of the LHC are likely to be, and what their implications may be (for instance, the ability to finally reach the 3 TeV/c² mass range for the types of particles one would expect in the so-called “Randall-Sundrum” class of extra-dimensions models; the opportunities to look for very rare Higgs, top and W decays; and the potential to complete the program I outlined above of ruling out all but a very small class of natural variants of supersymmetry).

All in all, a useful workshop — but its true value will depend on how much we all follow up on what we discussed.

SEARCH Day 2

Day 2 of the SEARCH workshop will get a shorter description than it deserves, because I’ve had to spend time finishing my own talk for this morning. But there were a lot of nice talks, so let me at least tell you what they were about.

Both ATLAS and CMS presented their latest results on searches for supersymmetry. (I should remind you that “searches for supersymmetry” are by no means actually limited to supersymmetry — they can be used to discover or exclude many other new particles and forces that have nothing to do with supersymmetry at all.) Speakers Pascal Pralavorio and Sanjay Padhi gave very useful overviews of the dozens of searches that have been done so far as part of this effort, including a few rather new results that are very powerful. (We should see even more appear at next week’s Supersymmetry conference.) My short summary: almost everything easy has been done thoroughly; many challenging searches have also been carried out; and if superpartner particles are present, they’re

  • so heavy that they aren’t produced very often (e.g. gluinos)
  • rather lightweight, but still not so often produced (e.g. top squarks, charginos, neutralinos, sleptons)
  • produced often, but decaying in some way that is very hard to detect (e.g. gluinos decaying only to quarks, anti-quarks and gluons)

Then we had a few talks by theorists. Patrick Meade talked about how unknown particles that are affected by weak nuclear and electromagnetic forces, but not by strong nuclear forces, could give signs that are hiding underneath processes that occur in the Standard Model. (Examples of such particles are the neutralinos and charginos or sleptons of supersymmetry.) To find them requires increased precision in our calculations and in our measurements of processes where pairs of W and/or Z and/or Higgs particles are produced. As a definite example, Meade noted that the rate for producing pairs of W particles disagrees somewhat with current predictions based on the Standard Model, and emphasized that this small disagreement could be due to new particles (such as top squarks, or sleptons, or charginos and neutralinos), although at this point there’s no way to know.

Matt Reece gave an analogous talk about spin-zero quark-like particles that do feel strong nuclear forces, the classic example of which is the top squark. Again, the presence of these particles can be hidden underneath the large signals from production of top quark/anti-quark pairs, or other common processes. ATLAS and CMS have been working hard to look for signals of these types of particles, and have made a lot of progress, but there are still quite a few possible signals that haven’t been searched for yet. Among other things, Reece discussed some methods invented by theorists that might be useful in contributing to this effort. As with the previous talk, the key to a complete search will be improvements in calculations and measurements of top quark production, and of other processes that involve known particles.

After lunch there was a more general discussion about looking for supersymmetry, including conversation about what variants of supersymmetry haven’t yet been excluded by existing ATLAS and CMS searches.  (I had a few things to say about that in my talk, but more on that tomorrow.)

Jesse Thaler gave a talk reviewing the enormous progress that has been made in understanding how to distinguish ordinary jets, which arise from quarks and gluons, from jet-like objects made from a single high-energy W, Z, Higgs or top quark that decays to quarks and anti-quarks. (The jargon is that the trick is to use “jet substructure” — the fact that inside a jet-like W are two sub-jets, each from a quark or anti-quark.) At SEARCH 2012, the experimenters showed very promising though preliminary results using a number of new jet substructure methods that had been invented by (mostly) theorists. By now, the experimenters have shown definitively that these methods work — and will continue to work as the rate of collisions at the LHC grows — and have made a number of novel measurements using them. Learning how to use jet substructure is one of the great success stories of the LHC era, and it will continue to be a major story in coming years.

Two talks by ATLAS (Leandro Nisanti) and CMS (Matt Hearndon) followed, each with a long list of careful measurements of what the Standard Model is doing, mostly based so far only on the 2011 data set (and not yet including last year’s data). These measurements are crucially important for multiple reasons:

  • They provide important information which can serve as input to other measurements and searches.
  • They may reveal subtle problems with the Standard Model, due to indirect or small effects from unknown particles or forces.
  • Confirming that measurements of certain processes agree with theoretical predictions gives us confidence that those predictions can be used in other contexts, in particular in searches for unknown particles and forces.

Most, but not all, theoretical predictions for these careful measurements have worked well. Those that aren’t working so well are of course being watched and investigated carefully — but there aren’t any discrepancies large enough to get excited about yet (other than the top quark forward-backward asymmetry puzzle, which wasn’t discussed much today). In general, the Standard Model works beautifully — so far.

The day concluded with a panel discussion focused on these Standard Model measurements. Key questions discussed included: how do we use LHC data to understand the structure of the proton more precisely, and how in turn does that affect our searches for unknown phenomena? In particular, a major concern is the risk of circularity: that a phenomenon from an unknown type of particle could produce a subtle effect that we would fail to recognize for what it is, instead misinterpreting it as a small misunderstanding of proton structure, or as a small problem with a theoretical calculation. Such are the challenges of making increasingly precise measurements, and searching for increasingly rare phenomena, in the complicated environment of the LHC.

SEARCHing for New Particles on Long Island

Greetings from Stony Brook’s Simons Center, and the SEARCH 2013 workshop. (I reported on the SEARCH 2012 workshop here, here, here and here.) Over the next three days, a small group (about 50) of theoretical particle physicists and experimentalists from ATLAS and CMS (two of the experiments at the Large Hadron Collider [LHC]) will be discussing the latest results from the LHC, and brainstorming about what else should be done with the existing LHC data and with future data.

The workshop was organized by three theorists: Raman Sundrum, professor at Maryland (who opened the day with a characteristically brilliant and inspirational talk about the status of the field and the purpose of the workshop); Patrick Meade, professor at Stony Brook; and Michele Papucci, soon-to-be professor at Michigan.

Of course we’ll be discussing the newly discovered Higgs particle — that discussion will occupy most of today — but we’ll also be looking at many other types of particles, forces and other phenomena that nature might be hiding from us, and how we would be able to uncover them if they exist. There’ve been many dozens of searches done at both ATLAS and CMS, but the experimentalists certainly haven’t had time to try everything plausible — and theorists haven’t yet thought of everything they might try. Workshops like this are aimed at making sure no stone is left unturned in the existing huge pile of data from 2011-2012, and also that we’re fully prepared to deal with the new data, from higher-energy proton-proton collisions, that will start pouring in during 2015.

A Second Higgs Particle?

It is well-known in science that if the title of a post or paper begins with a question, the answer is always “NO”, or at best, “probably not”.  Today we’re working with “very probably not”.

Yes, we’ve found one type of Higgs particle, but there might be two, three, or even more types of Higgs particles in nature.  Such particles might well be discovered eventually at the Large Hadron Collider [LHC] or at future experiments hardly dreamt of today.  But at present, there’s no evidence yet for a second Higgs particle.  And all the hullabaloo we’re hearing right now is about old news, reprocessed into new news, which actually wasn’t news anyway and really isn’t now either.

Opening of LHCP Conference

Greetings from Barcelona, where the LHCP 2013 conference is underway. I wanted to mention a couple of the opening remarks made by CERN’s Sergio Bertolucci and Mirko Pojer, both of whom spoke about the near-term and medium-term future of the Large Hadron Collider [LHC].

Review of the Higgs-to-2-Photon Data

Since it’s been the main news story of the last week, perhaps it would be useful to do a quick summary of what the CMS and ATLAS experiments at the Large Hadron Collider [LHC] have been saying, over the past fifteen months, about their search for the process in which a Higgs particle is produced and decays to two photons.

Before we start, let me remind you that in statements about how uncertain a measurement is (and all measurements have some level of uncertainty — no knowledge is perfect), a “σ”, or “sigma”, is a statistical quantity called a “standard deviation”; a 5σ discrepancy from expectations is impressive, 3σ intriguing; but anything less than 2σ is very typical, and indicative merely of the usual coming and going of statistical flukes and fluctuations of real data around the truth. Note also that the look-elsewhere effect has to be accounted for; but a 5σ discrepancy, even before the look-elsewhere effect is applied, is usually enough to be convincing. And of course a discrepancy may mean either a discovery or a mistake; that’s why it is important that two experiments, not just one, see a similar discrepancy, since it is unlikely that both experiments would make the same mistake.
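If you want to see how these statements translate into arithmetic, here is a minimal sketch in Python. It uses the crudest possible approximation to the look-elsewhere effect, treating the “elsewhere” as some number of independent places to look; the experiments’ real statistical machinery is far more sophisticated, and the number of trials below is invented purely for illustration.

```python
# Minimal sketch of significance and a crude look-elsewhere correction.
# NOT the experiments' actual procedure; the trials number is illustrative.
from scipy.stats import norm

def local_p_value(sigma):
    """One-sided p-value for an excess of `sigma` standard deviations."""
    return norm.sf(sigma)

def global_p_value(p_local, n_trials):
    """Chance of a fluke at least this large in any of n_trials independent places."""
    return 1.0 - (1.0 - p_local) ** n_trials

# A 2 sigma fluke is almost guaranteed to show up somewhere among ~100 measurements:
print(global_p_value(local_p_value(2.0), 100))   # ~0.90

# A 5 sigma local excess survives the same correction with room to spare:
p = global_p_value(local_p_value(5.0), 100)
print(norm.isf(p))                               # still ~4 sigma globally
```

That, in a nutshell, is why a lone 2σ excess is shrugged off while a 5σ one is taken very seriously even before a careful look-elsewhere correction is done.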

Ok: here are the results as they came in over time, all the way back to the inconclusive hints of fifteen months ago. (A quick check of the arithmetic behind the quoted “σ above the SM prediction” figures follows the timeline.)

December 2011:

  • ATLAS (4.9 inverse fb of data at 7 TeV): excess 2.8σ (where 1.4σ would be expected for a SM Higgs); less than 2σ after accounting for “look-elsewhere effect”.
  • CMS: (4.8 inverse fb of data at 7 TeV): excess just over 2σ (where 1.4σ would be expected for a SM Higgs); much less than 2σ after accounting for “look-elsewhere effect”.

July 2012:

  • ATLAS (reanalyzing the 7 TeV data and adding 5.9 inverse fb of data at 8 TeV): signal 4.5σ (where 2.4σ was expected for a SM Higgs); 3.6σ after “look-elsewhere effect”; best estimate of size of signal divided by that for a SM Higgs: 1.9 ± 0.5 (about 1.8σ above the SM prediction)
  • CMS (reanalyzing the 7 TeV data and adding 5.3 inverse fb of data at 8 TeV): signal 4.1σ (where 2.5σ was expected for a SM Higgs); 3.2σ after “look-elsewhere effect”; best estimate of size of signal divided by that for a SM Higgs: 1.6 ± 0.4 (about 1.5σ above the SM prediction)

November/December 2012:

  • ATLAS (increasing the 8 TeV data to 13.0 inverse fb): signal 6.1σ (where 3.3σ was expected for a SM Higgs); 5.4σ after “look-elsewhere effect”; best estimate of size of signal divided by that for a SM Higgs: 1.8 ± 0.4 (about 2σ above the SM prediction)
  • CMS: No update

March 2013:

  • ATLAS (taking the full 7 and 8 TeV data sets): 7.4σ (where 4.1σ was expected for a SM Higgs); best estimate of size of signal divided by that for a SM Higgs: 1.65 ± 0.30 (slightly more than 2σ above the SM prediction)
  • CMS (taking the full 7 and 8 TeV data sets): uses two different methods as a cross-check, one of them complex and (on average) more powerful, the other simpler but (on average) less powerful. For the best estimate of size of signal divided by that for a SM Higgs, one method gives 0.8 ± 0.3 and the other gives 1.1 ± 0.3. Both of these are within 1σ of the SM prediction and within 2σ of the CMS July result.
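As promised above, here is where the parenthetical “σ above the SM prediction” figures come from: take the measured signal strength (the size of the signal divided by the Standard Model expectation, so that the SM itself corresponds to 1), subtract 1, and divide by the quoted uncertainty. A quick check in Python (my own sketch, not the experiments’ actual fits), using the numbers from the timeline:

```python
# Back-of-the-envelope check of the "sigma above the SM prediction" figures.
# mu is the measured signal divided by the Standard Model expectation,
# so the SM itself corresponds to mu = 1.
measurements = {
    "ATLAS July 2012":    (1.9, 0.5),
    "CMS July 2012":      (1.6, 0.4),
    "ATLAS Nov/Dec 2012": (1.8, 0.4),
    "ATLAS March 2013":   (1.65, 0.30),
}
for label, (mu, err) in measurements.items():
    print(f"{label}: ({mu} - 1)/{err} = {(mu - 1)/err:.1f} sigma above the SM")
```

Running this reproduces the quoted 1.8σ, 1.5σ, 2σ and “slightly more than 2σ” figures.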

To understand how consistent the two new CMS results are with each other, you have to consider how the two studies are correlated (since they are selecting events for study from the same pile of data). Because the two methods select and discard candidate events in two different ways, they don’t include exactly the same data. CMS’s simulation studies indicate that about 50 percent of the background events and 80 percent of the signal events are common to the two studies. In the end, the conclusion (see the figure below) is that the two results are consistent at 1.5σ (and at 1.8σ if one considers only the 8 TeV data) — in other words, reasonably consistent with one another.
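To get a feeling for how that correlation enters, here is a toy version of the comparison in Python. The correlation coefficient below is an invented illustrative number, loosely motivated by the quoted event overlap; CMS’s actual statistical treatment is, of course, far more sophisticated. Note the perhaps counterintuitive point: the more positively correlated the two methods are, the more significant a given difference between them becomes, because shared statistical fluctuations should have pulled them together.

```python
# Toy comparison of two correlated measurements (my own sketch, not CMS's method).
# rho is an invented illustrative correlation, loosely motivated by the quoted
# overlap of events between the two analyses.
from math import sqrt

def tension(mu1, err1, mu2, err2, rho):
    """Significance of the difference between two correlated measurements."""
    err_diff = sqrt(err1**2 + err2**2 - 2.0 * rho * err1 * err2)
    return abs(mu1 - mu2) / err_diff

print(tension(0.8, 0.3, 1.1, 0.3, 0.0))    # ~0.7 sigma if the methods were independent
print(tension(0.8, 0.3, 1.1, 0.3, 0.78))   # ~1.5 sigma with strong positive correlation
```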

You can also ask how consistent the new results are with the old ones from July. Once you recall that the uncertainty on the July result was very large (1.6 ± 0.4 times the Standard Model prediction, i.e. a 25% uncertainty at 1σ, 50% at 2σ), it should not surprise you that CMS claims that both of its new results are consistent with the old one at below the 2σ level.

Slide from Moriond-QCD conference talk presenting CMS’s results, and looking at the compatibility of the two results with each other (top two lines in the table) and of each of the two new results with the previous published results. Note the conclusion in the last line.

Meanwhile, all of the ATLAS results are closely compatible with each other. This is more what one would naively expect, but not necessarily what actually happens in real data. Of course ATLAS’s results aren’t giving a consistent mass for the new particle yet, whereas CMS’s are doing so… well, this is what happens with real data, folks.

The real issue is whether ATLAS’s measurements and CMS’s measurements of the two photon rate are compatible with each other. Currently they are separated by at least 2σ and maybe as much as 3σ (a very rough estimate), which is not unheard of but is somewhat unusual. Well, whether the cause is an error or a statistical fluke or both, it unfortunately leaves us in a completely ambiguous situation. On the one hand, CMS’s results agree with the Standard Model prediction to within about 1σ. On the other hand, ATLAS’s results are in tension with the Standard Model prediction by a bit more than 2σ. We have no way to know which result is closer to the truth — especially when we recall that the uncertainty in the Standard Model prediction is itself about 20%. If ATLAS and CMS had both closely agreed with the Standard Model we’d be confident that any deviations from the Standard Model are too small to observe; if they both significantly disagreed in the same way, we’d be excited about the possibility that the Standard Model might be about to break down. But with the current results, we don’t know what to think.
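To see why any such estimate is necessarily rough, try the naive arithmetic yourself: combine the uncertainties in quadrature and ignore any correlations. This is only a sketch of mine; the answer depends on which CMS result you compare against, and shared systematic uncertainties (which the formula below ignores) can shift it further.

```python
# Naive, correlation-free comparison of the ATLAS and CMS two-photon signal
# strengths (my own rough sketch; shared systematic uncertainties are ignored).
from math import sqrt

def naive_tension(mu1, err1, mu2, err2):
    """Significance of the difference, treating the measurements as independent."""
    return abs(mu1 - mu2) / sqrt(err1**2 + err2**2)

print(naive_tension(1.65, 0.30, 0.8, 0.3))  # ~2.0 sigma vs CMS's more powerful method
print(naive_tension(1.65, 0.30, 1.1, 0.3))  # ~1.3 sigma vs CMS's simpler method
```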

So as far as the Higgs particle’s decays to two photons are concerned, we’ve gotten as much (or almost as much) information as we’re going to get for the moment; and we have no choice but to accept that the current situation is ambiguous and to wait for more data in 2015. Of course the Standard Model may break down sooner than 2015, for some other reason that the experimenters have yet to uncover in the 2011-2012 data. But the two-photon measurement won’t be the one to crack the armor of this amazing set of equations. (For those who got all excited last July: you were warned that the uncertainties were very large and the excess might well be ephemeral.)