Tag Archives: LHC

Our Survey of Exotic Decays of the Higgs is Done

After many months’ gestation and a difficult labor, a behemoth is born!  Yes, it’s done, finally: our 200-page tome entitled “Exotic Decays of the 125 GeV Higgs Boson”.  Written by thirteen hard-working theoretical particle physicists, this is a paper that examines a wide class of possible decays that our newly found Higgs particle might exhibit, but that would not occur if the Standard Model of particle physics (the equations we use to describe the known elementary particles and forces plus the simplest possible type of Higgs particle) were all there was to see at the Large Hadron Collider [LHC], the giant proton-proton collider outside of Geneva, Switzerland.

[Non-experts: sorry, but this paper was written for experts, and probably has a minimum of two words of jargon per sentence. I promise you a summary soon.]

Why is looking for unusual and unexpected decays of the Higgs particle so important?  [I've written about the possibility of these "exotic" decays before on this website (see here, here, here, here, here, here, here and here).]  Because Higgs particles are sensitive creatures, easily altered, possibly in subtle ways, by interactions with new types of particles that we wouldn’t yet know about from the LHC or our other experiments. (This sensitivity of the Higgs was noted as far back as the early 1980s, though its generality was perhaps only emphasized in the last decade.)  The Higgs particle is very interesting not only on its own, for what it might reveal about the Higgs field (on which our very existence depends), but also as a potential opportunity for the discovery of currently unknown, lightweight particles, to which it might decay.  Such particles might be the keys to unlocking secrets of nature, such as what dark matter is, or maybe even (extreme speculation alert) the naturalness puzzle — very roughly, the puzzle of why the mass of the Higgs particle can be so small compared to the masses of the smallest possible black holes.

The goal of our paper, which is extensive in its coverage (though still not comprehensive — this is a very big subject), is to help our experimental colleagues at ATLAS and CMS, the general purpose experiments at the LHC, decide what to search for in their current (2011-2012) and future (2015-) data, and perhaps assist in their decisions on triggering strategies for the data collecting run that will begin in 2015.  (Sorry, LHCb folks, we haven’t yet looked at decays where you’d have an advantage.) And we hope it will guide theorists too, by highlighting important unanswered questions about how to look for certain types of exotic decays.  Of course the paper has to go through peer review before it is published, but we hope it will be useful to our colleagues immediately. Time is short; 2015 is not very far away.

Although our paper contains some review of the literature, a number of its results are entirely new.  I’ll tell you more about them after I’ve recovered, and probably after most people are back from break in January.  (Maybe for now, as a teaser, I’ll just say that one of the strongest limits we obtained, as an estimate based on reinterpreting published ATLAS and CMS data, is that no more than a few × 10⁻⁴ of Higgs particles decay to a pair of neutral spin-one particles with mass in the 20–62 GeV/c² range… and the experimentalists themselves, by re-analyzing their data, could surely do better than we did!)  But for the moment, I’d simply like to encourage my fellow experts, both from the theory side and the experimental side, to take a look… comments are welcome.
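In equation form (my paraphrase of the estimate just quoted; note that 62 GeV/c² sits just below half the Higgs mass of 125 GeV/c², the kinematic endpoint for a Higgs decaying to a pair of such particles):

$$ \mathrm{BR}(h \to XX) \;\lesssim\; (\text{a few}) \times 10^{-4} \qquad \text{for} \qquad 20~\mathrm{GeV}/c^2 \;\lesssim\; m_X \;\lesssim\; 62~\mathrm{GeV}/c^2 \simeq m_h/2 \,, $$

where X is my label (not necessarily the paper's notation) for the hypothetical neutral spin-one particle.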

Finally, I’d like to congratulate and thank my young colleagues, all of whom are pre-tenure and several of whom are not yet professors, on their excellent work… it has been a pleasure to collaborate with them.  They led the way, not me.  They are (in alphabetical order): David Curtin, Rouven Essig, Stefania Gori, Prerit Jaiswal, Andrey Katz, Tao Liu, Zhen Liu, David McKeen, Jessie Shelton, Ze’ev Surujon, Brock Tweedie, and Yi-Ming Zhong. They hail from around the world, but they’ve worked together like family… a great example of how our international effort to understand nature’s deep mysteries brings unity of purpose from a diversity of origins.

Visiting the University of Maryland

Along with two senior postdocs (Andrey Katz of Harvard and Nathaniel Craig of Rutgers) I’ve been visiting the University of Maryland all week, taking advantage of end-of-academic-term slowdowns to spend a few days just thinking hard, with some very bright and creative colleagues, about the implications of what we have discovered (a Higgs particle of mass 125-126 GeV/c²) and have not discovered (any other new particles or unexpected high-energy phenomena) so far at the Large Hadron Collider [LHC].

The basic question that faces us most squarely is:

Is the naturalness puzzle

  1. resolved by a clever mechanism that adds new particles and forces to the ones we know?
  2. resolved by properly interpreting the history of the universe?
  3. nonexistent due to our somehow misreading the lessons of quantum field theory?
  4. altered dramatically by modifying the rules of quantum field theory and gravity altogether?

If (1) is true, it’s possible that a clever new “mechanism” is required.  (Old mechanisms that remove or ameliorate the naturalness puzzle include supersymmetry, little Higgs, warped extra dimensions, etc.; all of these are still possible, but if one of them is right, it’s mildly surprising we’ve seen no sign of it yet.)  Since the Maryland faculty I’m talking to (Raman Sundrum, Zakaria Chacko and Kaustubh Agashe) have all been involved in inventing clever new mechanisms in the past (with names like Randall-Sundrum [i.e. warped extra dimensions], Twin Higgs, Folded Supersymmetry, and various forms of Composite Higgs), it’s a good place to be thinking about this possibility.  There’s good reason to focus on mechanisms that, unlike most of the known ones, do not lead to new particles that are affected by the strong nuclear force. (The Twin Higgs idea that Chacko invented with Hock-Seng Goh and Roni Harnik is an example.)  The particles predicted by such scenarios could easily have escaped notice so far, and be hiding in LHC data.

Sundrum (some days anyway) thinks the most likely situation is that, just by chance, the universe has turned out to be a little bit unnatural — not a lot, but enough that the solution to the naturalness puzzle may lie at higher energies outside LHC reach.  That would be unfortunate for particle physicists who are impatient to know the answer… unless we’re lucky and a remnant from that higher-energy phenomenon accidentally has ended up at low-energy, low enough that the LHC can reach it.

But perhaps we just haven’t been creative enough yet to guess the right mechanism, or alter the ones we know of to fit the bill… and perhaps the clues are already in the LHC’s data, waiting for us to ask the right question.

I view option (2) as deeply problematic.  On the one hand, there’s a good argument that the universe might be immense, far larger than the part we can see, with different regions having very different laws of particle physics — and that the part we live in might appear very “unnatural” just because that very same unnatural appearance is required for stars, planets, and life to exist.  To be over-simplistic: if, in the parts of the universe that have no Higgs particle with mass below 700 GeV/c², the physical consequences prevent complex molecules from forming, then it’s not surprising we live in a place with a Higgs particle below that mass.   [It's not so different from saying that the earth is a very unusual place from some points of view — rocks near stars make up a very small fraction of the universe — but that doesn't mean it's surprising that we find ourselves in such an unusual location, because a planet is one of the few places that life could evolve.]

Such an argument is compelling for the cosmological constant problem.  But it’s really hard to come up with an argument that a Higgs particle with a very low mass (and corresponding low non-zero masses for the other known particles) is required for life to exist.  Specifically, the mechanism of “technicolor” (in which the Higgs field is generated as a composite object through a new, strong force) seems to allow for a habitable universe, but with no naturalness puzzle — so why don’t we find ourselves in a part of the universe where it’s technicolor, not a Standard Model-like Higgs, that shows up at the LHC?  Sundrum, formerly a technicolor expert, has thought about this point (with David E. Kaplan), and he agrees this is a significant problem with option (2).

By the way, option (2) is sometimes called the “anthropic principle”.  But it’s neither a principle nor “anthro-” (human-) related… it’s simply a bias (not in the negative sense of the word, but simply in the sense of something that affects your view of a situation) from the fact that, heck, life can only evolve in places where life can evolve.

(3) is really hard for me to believe.  The naturalness argument boils down to this:

  • Quantum fields fluctuate;
  • Fluctuations carry energy, called “zero-point energy”, which can be calculated and is very large;
  • The energy of the fluctuations of a field depends on the corresponding particle’s mass;
  • The particle’s mass, for the known particles, depends on the Higgs field;
  • Therefore the energy of empty space depends strongly on the Higgs field.

Unless one of these five statements is wrong (good luck finding a mistake — every one of them involves completely basic issues in quantum theory and in the Higgs mechanism for giving masses) then there’s a naturalness puzzle.  The solution may be simple from a certain point of view, but it won’t come from just waving the problem away.
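To put the logic in heuristic equations (a schematic sketch in natural units, ℏ = c = 1, with numerical factors written as #, not a careful calculation): the zero-point energy per unit volume of a field whose particle has mass m is

$$ E_{\rm vac} \;\sim\; \int^{\Lambda} \frac{d^3k}{(2\pi)^3}\, \tfrac{1}{2}\sqrt{k^2 + m^2} \;\sim\; \#\,\Lambda^4 \;+\; \#\, m^2\,\Lambda^2 \;+\; \cdots \,, $$

where Λ is the energy scale up to which the calculation can be trusted. Since the known particles’ masses are proportional to the Higgs field’s value v (for instance m = y v for a fermion with coupling y), the m²Λ² term makes the energy of empty space depend strongly on the Higgs field, which is precisely the chain of reasoning in the five statements above.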

(4) I’d love for this to be the real answer, and maybe it is.  If our understanding of quantum field theory and Einstein’s gravity leads us to a naturalness problem whose solution should presumably reveal itself at the LHC, and yet nature refuses to show us a solution, then maybe it’s a naive use of field theory and gravity that’s at fault. But it may take a very big leap of faith, and insight, to see how to jump off this cliff and yet land on one’s feet.  Sundrum is well-known as one of the most creative and fearless individuals in our field, especially when it comes to this kind of thing. I’ve been discussing some radical notions with him, but mostly I’ve been enjoying hearing his many past insights and ideas… and about the equations that go with them.   Anyone can speculate, but it’s the equations (and the predictions, testable at least in principle if not in practice, that you can derive from them) that transform pure speculations into something that deserves the name “theoretical physics”.

What’s the Status of the LHC Search for Supersymmetry?

It’s been quite a while (for good reason, as you’ll see) since I gave you a status update on the search for supersymmetry, one of several speculative ideas for what might lie beyond the known particles and forces.  Specifically, supersymmetry is one option (the most popular and most reviled, perhaps, but hardly the only one) for what might resolve the so-called “naturalness” puzzle, closely related to the “hierarchy problem” — Why is gravity so vastly weaker than the other forces? Why is the Higgs particle‘s mass so small compared to the mass of the lightest possible black hole?
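For a sense of scale (standard numbers, not spelled out in the post): the lightest possible black hole weighs roughly a Planck mass, so the puzzle is the extreme smallness of the ratio

$$ \frac{m_h}{M_{\rm Planck}} \;\approx\; \frac{125~{\rm GeV}/c^2}{1.2\times 10^{19}~{\rm GeV}/c^2} \;\approx\; 10^{-17} \,. $$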


Wednesday: Sean Carroll & I Interviewed Again by Alan Boyle

Today, Wednesday December 4th, at 8 pm Eastern/5 pm Pacific time, Sean Carroll and I will be interviewed again by Alan Boyle on “Virtually Speaking Science”.   The link where you can listen in (in real time or at your leisure) is

http://www.blogtalkradio.com/virtually-speaking-science/2013/12/05/alan-boyle-matt-strassler-sean-carroll

What is “Virtually Speaking Science”?  It is an online radio program that presents, according to its website:

  • Informal conversations hosted by science writers Alan Boyle, Tom Levenson and Jennifer Ouellette, who explore the often-volatile landscape of science, politics and policy, the history and economics of science, science deniers and their relationship to democracy, and the role of women in the sciences.

Sean Carroll is a Caltech physicist, astrophysicist, writer and speaker, blogger at Preposterous Universe, who recently completed an excellent and now prize-winning popular book (which I highly recommend) on the Higgs particle, entitled “The Particle at the End of the Universe”.  Our interviewer Alan Boyle is a noted science writer, author of the book “The Case for Pluto”, winner of many awards, and currently NBC News Digital’s science editor [at the blog "Cosmic Log"].

Sean and I were interviewed in February by Alan on this program; here’s the link.  I was interviewed on Virtually Speaking Science once before, by Tom Levenson, about the Large Hadron Collider (here’s the link).  Also, my public talk “The Quest for the Higgs Particle” is posted on their website (here’s the link to the audio and to the slides).

Off to Illinois’s National Labs For a Week of Presentations

I have two very different presentations to give this week, on two very similar topics. First I’m going to the LHC Physics Center [LPC], located at the Fermilab National Accelerator Laboratory, host of the now-defunct Tevatron accelerator, the predecessor to the Large Hadron Collider [LHC]. The LPC is the local hub for the United States wing of the CMS experiment, one of the two general-purpose experiments at the LHC. [CMS, along with ATLAS, is where the Higgs particle was discovered.] The meeting I’m attending is about supersymmetry, although that’s just its title, really; many of the talks will have implications that go well beyond that specific subject, exploring more generally what we have and still could search for in the LHC’s existing and future data.  I’ll be giving a talk for experts on what we do and don’t know currently about one class of supersymmetry variants, and what we should perhaps be trying to do next to cover cases that aren’t yet well-explored.

Second, I’ll be going to Argonne National Laboratory, to give a talk for the scientists there, most of whom are not particle physicists, about what we have learned so far about nature from the LHC’s current data, and what the big puzzles and challenges are for the future.  So that will be a talk for non-expert scientists, which requires a completely different approach.

Both presentations are more-or-less new and will require quite a bit of work on my part, so don’t be surprised if posts and replies to comments are a little short on details this week…

At a CMS/Theory Workshop in Princeton

For Non-Experts Who've Read a Bit About Particle Physics

I spent yesterday, and am spending today, at Princeton University, participating in a workshop that brings together a group of experts from the CMS experiment, one of the two general purpose experiments at the Large Hadron Collider (where the Higgs particle was discovered.) They’ve invited me, along with a few other theoretical physicists, to speak to them about additional strategies they might use in searching for phenomena that are not expected to occur within the Standard Model (the equations we use to describe the known elementary particles and forces.) This sort of “consulting” is one of the roles of theorists like me. It involves combining a broad knowledge of the surprises nature might have in store for us with a comprehensive understanding of what CMS and its competitor ATLAS (as well as other experiments at and outside the LHC) have and have not searched for already.

A lot of what I’ll have to say is related to what I said in Stony Brook at the SEARCH workshop, but updated, and with certain details adjusted to match the all-CMS audience.

Yesterday afternoon’s back-and-forth between the theorists and the experimentalists was focused on signals that are very hard to detect directly, such as (still hypothetical) dark matter particles. These could perhaps be produced in the LHC’s proton-proton collisions, but could then go undetected, because (like neutrinos) they pass without hitting anything inside of CMS. But even though we can’t detect these particles directly, we can sometimes tell indirectly that they’re present, if the collision simultaneously makes something else that recoils sharply away from them. That something else could be a photon (i.e. a particle of light) or a jet (the spray of particles that tells you that a high-energy gluon or quark was produced) or perhaps something else. There was a lot of interesting discussion about the various possible approaches to searching for such signals more effectively, and about how the trigger strategy might need to be adjusted in 2015, when the LHC starts taking data again at higher energy per collision, so that CMS remains maximally sensitive to their presence. Clearly there is much more work to do on this problem.
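The inference rests on nothing more than momentum conservation in the plane transverse to the beams (a textbook relation, sketched here): the colliding protons carry essentially no transverse momentum, so the visible particles’ transverse momenta must balance whatever escaped unseen,

$$ \vec p_T^{\;\rm miss} \;=\; -\sum_{i \,\in\, \text{visible}} \vec p_{T,i} \,, $$

and a hard photon or jet recoiling against nothing visible therefore signals large missing transverse momentum, the telltale sign of undetected particles.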

A Busy Week at CERN

A week at CERN, the laboratory that hosts the Large Hadron Collider [LHC] (where the Higgs particle was discovered), is always extremely packed, and this one was no exception. It’s very typical that on a given day I’ll have four or five intense one-on-one scientific meetings with colleagues, both theorists and experimenters, and will attend a couple of presentations on hot topics — or perhaps ten, if there’s a conference going on (which is almost always.) Work starts at 9 am and typically ends at 7 pm. And of course I have my own work to do — papers to finish, for instance — so after a break for dinner, I keep working til midnight. Squeezing in time for writing blog posts can be tough under these conditions! But at least it is for very good reasons.

Just this morning I attended two talks related to a future particle physics collider that people are starting to think seriously about… a collider (currently called T-LEP) that would be built in an 80 kilometer-long [50 mile-long] circular tunnel, and in which electrons and positrons [positron = anti-electron] would be smashed together.  The physics program of such a machine would be quite broad, including intensive studies of the four heaviest known particles in nature: the Z particle, the W particle, the Higgs particle and the top quark. Any one of them might reveal secrets when investigated in detail.  In fact, T-LEP’s extremely precise measurements, made in the 100-500 GeV = 0.1-0.5 TeV energy range, would be used to check, to one part in a thousand, the equations that explain how the Higgs field gives elementary particles their masses, and could be indirectly sensitive to the effects of unknown particles and forces at energy scales all the way up to 10-30 TeV.
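As a rough guide to where numbers like 10-30 TeV come from (my back-of-the-envelope dimensional analysis, not a result from the talks): a heavy new particle of mass Λ coupling with strength g typically shifts precision observables by a relative amount of order g²v²/Λ², where v ≈ 246 GeV is the Higgs field’s value. Measurements at the one-part-in-a-thousand level then probe

$$ \delta \;\sim\; g^2\,\frac{v^2}{\Lambda^2} \quad\Longrightarrow\quad \Lambda \;\sim\; g\,\frac{v}{\sqrt{\delta}} \;\approx\; g \times \frac{246~{\rm GeV}}{\sqrt{10^{-3}}} \;\approx\; g \times 8~{\rm TeV} \,, $$

which for couplings of order one to a few lands in the quoted 10-30 TeV range.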

After that I had a typical meeting with an experimentalist at the CMS experiment, discussing the many ways that one might still make discoveries using the existing 2011-2012 LHC data. The big concern here is that the LHC experimenters are so busy getting ready for the 2015 run of the LHC that they may not fully exploit the data that they already have.

Off to more meetings…

Visiting the Host Lab of the Large Hadron Collider

Greetings from Geneva, and CERN, the laboratory that hosts the Large Hadron Collider [LHC], where the Higgs particle was found by the physicists at the ATLAS and CMS experiments. Between jet lag, preparing a talk for Wednesday, and talking to many experimental and theoretical particle physicists from morning til night, it will be a pretty exhausting week.

The initial purpose of this trip is to participate in a conference held by the LHCb experiment, entitled “Implications of LHCb measurements and future prospects.” Its goal is to bring theoretical particle physicists and LHCb experimenters together, to exchange information about what has been and what can be measured at LHCb.

On this website I’ve mostly written about ATLAS and CMS, partly because LHCb’s measurements are often quite subtle to explain, and partly because the Higgs particle search, the highlight of the early stage of the LHC, was really ATLAS’s and CMS’s task. But this week’s activities give me a nice opportunity to put the focus on this very interesting experiment, which is quite different from ATLAS and CMS both in its design and in its goals, and to explain its important role.

ATLAS and CMS were built as general purpose detectors, whose first goal was to find the Higgs particle and whose second was to find (potentially rare) signs of any other high-energy processes that are not predicted by the Standard Model, the equations we use to describe all the known particles and forces of nature. Crudely speaking, ATLAS and CMS are ideal for looking for new phenomena in the 100 to 5000 GeV energy range (though we won’t reach the upper end of the range until 2015 and beyond.)

LHCb, by contrast, was built to study in great detail the bottom and charm quarks, and the hadrons (particles made from quarks, anti-quarks and gluons) that contain them. These quarks and their antiquarks are produced in enormous abundance at the LHC. They and the hadrons that contain them have masses in the 1.5 to 10 GeV/c² range… not much heavier than protons, and much lower than what ATLAS and CMS are geared to study. And this is why LHCb has been making crucial high-precision tests of the Standard Model using bottom- and charm-containing hadrons.  (Crucial, but not, despite repeated claims by the LHCb press office, capable of ruling out supersymmetry, which no single measurement can possibly do.)

Although this is the rough division of labor among these experiments, it’s too simplistic to describe the experiments this way. ATLAS and CMS can do quite a lot of physics at the low mass range, and in some measurements can compete well with LHCb.   Less well-known is that LHCb may be able to do a small but critical set of measurements involving higher energies than is their usual target.

LHCb is very different from ATLAS and CMS in many ways, and the most obvious is its shape. ATLAS and CMS look like giant barrels centered on the location of the proton-proton collisions, and are designed to measure as many particles as possible that are produced in the collision of two protons. LHCb’s shape is more like a wedge, with one end surrounding the collision point.

Left: Cut-away drawing of CMS, which is shaped like a barrel with proton-proton collisions occurring at its center. ATLAS’s shape is similar. Right: Cut-away drawing of LHCb, which is shaped something like a wedge, with collisions occurring at one end.

This shape only allows it to measure those particles that go in the “forward” direction — close to the direction of one of the proton beams. (“Backward” would be near the other beam; the distinction between forward and backward is arbitrary, because the two proton beams have the same properties. “Central” would be far from either beam.) Unlike ATLAS and CMS, LHCb is not used to reconstruct the whole collision; many of the particles produced in the collision go into backward or central regions which LHCb can’t observe.  This has some disadvantages, and in particular it put LHCb out of the running for the Higgs discovery. But a significant fraction of the bottom and charm quarks produced in proton-proton collisions go “forward” or “backward”, so a forward-looking design is fine if it’s bottom and charm quarks you’re interested in. And such a design is a lot cheaper, too. It also means that LHCb is well positioned to make some other measurements where the forward direction is important. I’ll give you one or two examples later in the week.
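For readers who like the standard jargon (a conventional definition, not something spelled out above): “forward” is usually quantified by pseudorapidity, and the experiments’ coverages are roughly

$$ \eta \;=\; -\ln\tan(\theta/2) \,, \qquad \text{LHCb:}\;\; 2 \lesssim \eta \lesssim 5 \,, \qquad \text{ATLAS/CMS tracking:}\;\; |\eta| \lesssim 2.5 \,, $$

where θ is the angle between a particle’s direction and the beam axis.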

To make their measurements of bottom and charm quarks, LHCb makes use of the fact that these quarks decay after about a trillionth of a second (a picosecond) [or longer if, as is commonly the case, there is significant time dilation due to Einstein's relativity effects on very fast particles].  This is long enough for them to travel a measurable distance — typically a millimeter or more. LHCb is designed to measure charged particles with terrific precision, allowing its physicists to infer a slight difference between the proton-proton collision point, from which most low-energy charged particles will emerge, and the location where some other charged particles may have been produced in the decay of a bottom hadron or some other particle that travels a millimeter or more before decaying. The ability to do precision “tracking” of the charged particles makes LHCb sensitive to the presence of any as-yet unknown particles that might be produced and then decay after traveling a small or moderate distance. More on that later in the week.
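The arithmetic behind “a millimeter or more” (simple kinematics, with a typical boost factor assumed): a particle with lifetime τ and relativistic boost γβ travels a distance

$$ d \;=\; \gamma\beta\, c\,\tau \,, \qquad c\,\tau \;\approx\; (3\times 10^{8}~{\rm m/s}) \times (10^{-12}~{\rm s}) \;=\; 0.3~{\rm mm} \,, $$

so a boost γβ of a few (common for bottom hadrons at the LHC) yields a millimeter or more.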

A computer reconstruction of the tracks in a proton-proton collision, as measured by LHCb. Most tracks start at the proton-proton collision point at left, but the two tracks drawn in purple emerge from a different point about 15 millimeters away, the apparent location of the decay of a hadron, whose inferred trajectory is the blue line, and whose mass (measured from the purple tracks) indicates that it contained a bottom quark.

One other thing to know about LHCb: in order to make their precise measurements possible, and to deal with the fact that they don’t observe a whole collision, they can’t afford to have too many collisions going on at once. ATLAS and CMS have been coping with ten to twenty simultaneous proton-proton collisions; this is part of what is known as “pile-up”. But near LHCb the LHC beams are adjusted so that the number of collisions at LHCb is often limited to just one or two or three simultaneous collisions. This has the downside that the amount of data LHCb collected in 2011 was about 1/5 of what ATLAS and CMS each collected, while for 2012 the number was more like 1/10.  But LHCb can do a number of things to make up for this lower rate; in particular their trigger system is more forgiving than that of ATLAS or CMS, so there are certain things they can measure using data of a sort that ATLAS and CMS have no choice but to throw away.