Of Particular Significance

So I got the following questions from a high school English teacher this morning, and I thought, for fun, I’d put the answers here for you to enjoy. Here (slightly abridged) is what the teacher wrote, and my answers:

I’ve turned my classroom into a video game to increase student engagement. In my gamified classroom, the villain is experimenting with/on the Higgs field. Your article on what would happen if the Higgs field was turned off answered a lot of my questions, but … I was hoping you could answer a couple of questions for me. I am sure these questions probably don’t have “real” answers, and are completely ridiculous, but I’d love to hear from you.

POSTED BY Matt Strassler

ON August 29, 2013

Today (as I sit in a waiting room for jury service) I’ll draw your attention to something that has been quite rare at the Large Hadron Collider [LHC]: a notable discrepancy between prediction and data. (Too rare, in fact — when you make so many measurements, some of them should be discrepant; the one place we saw plenty of examples was in the search for and initial study of the Higgs particle.) It’s not big enough to declare a definite challenge to the Standard Model (the equations we use to describe the known particles and forces), but it’s one we’ll need to be watching… and you can bet there will be dozens of papers trying to suggest possibilities for what this discrepancy, if it is real, might be due to.
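To get a rough feel for why a few discrepancies are expected when many measurements are made, here is a purely illustrative calculation (my own toy numbers, not anything from CMS or ATLAS): if one makes N independent measurements, each with a small chance of fluctuating to the 3-sigma level even when the underlying theory is correct, the chance that at least one of them does so grows quickly with N.

```python
from math import erfc, sqrt

# Two-sided probability that a single Gaussian measurement fluctuates by 3 sigma or more
p_3sigma = erfc(3 / sqrt(2))   # about 0.0027

# Chance that at least one of N independent measurements shows such a fluctuation.
# Purely illustrative: real LHC analyses are neither independent nor exactly Gaussian.
for n_measurements in (10, 100, 1000):
    p_at_least_one = 1 - (1 - p_3sigma) ** n_measurements
    print(f"N = {n_measurements:4d}:  chance of at least one 3-sigma discrepancy = {p_at_least_one:.2f}")
```

With a thousand or so independent comparisons of data and prediction, a few 3-sigma-ish discrepancies are expected even if the Standard Model is exactly right, which is why no single one of them can be treated as a discovery on its own.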

The discrepancy has arisen in the search at the CMS experiment for “multileptons”: for proton-proton collisions in which at least three charged leptons — electrons, muons and (to a degree) taus — were produced. Such events are a good place to look for new phenomena: very rare in the Standard Model, but in the context of some speculative ideas (including the possibility of additional types of Higgs particles, or of superpartner particles from supersymmetry, or new light neutral particles that decay sometimes to lepton/anti-lepton pairs, etc.) they can be produced in the decays of some unknown type of particle.

POSTED BY Matt Strassler

ON August 28, 2013

Arguably the two greatest problems facing particle physicists, cosmologists, string theorists, and the like are both associated with an apparent failure of a notion called “naturalness”.  Until now, I’ve mostly avoided this term on this site, because to utter the word demands an extended explanation.  After all, how could nature be unnatural, by definition?

Well, the answer is that the word “natural” has multiple meanings.  The one that scientists are using in this context isn’t “having to do with nature” but rather “typical” or “as expected” or “generic”, as in, “naturally the baby started screaming when she bumped her head”, or “naturally it costs more to live near the city center”, or “I hadn’t worn those glasses in months, so naturally they were dusty.”  And unnatural is when the baby doesn’t scream, when the city center is cheap, and when the glasses are pristine. Usually, when something unnatural happens, there’s a good reason.

I’ve started writing an article about naturalness and unnaturalness, and how there are two great mysteries about how unnatural our universe is, one of which lies at the heart of the Large Hadron Collider‘s [LHC’s] research program.  What I’ve written so far explains what naturalness means and (in part) how it applies to the Standard Model (the equations we use to describe the known elementary particles and forces).  I’ll be extending the article to explain this in more detail, and to explain the scientific argument as to why it is so unnatural to have a Higgs particle that is “lonely” — with no other associated particles (beyond the ones we already know) of roughly similar mass.  This in turn is why so many particle physicists have long expected the LHC to discover more than just a single Higgs particle and nothing else… more than just the Standard Model’s one and only missing piece… and why it will be a profound discovery with far-reaching implications if, during the next five years or so, the LHC experts sweep the floor clean and find nothing more in the LHC’s data than the Higgs particle that was found in 2012.

POSTED BY Matt Strassler

ON August 27, 2013

Day 3 of the SEARCH workshop (see here for an introduction and overviews of Day 1 and Day 2) opened with my own talk, entitled “On The Frontier: Where New Physics May Be Hiding”. The issue I was addressing is this:

Even though dozens of different strategies have been used by the experimenters at ATLAS and CMS (the two general purpose experiments at the Large Hadron Collider [LHC]) to look for various types of new particles, there are still many questions that haven’t been asked and many aspects of the data that haven’t been studied. My goal was to point out a few of these unasked or incompletely asked questions, ones that I think are very important for ATLAS and CMS experts to investigate… both in the existing data and also in the data that the LHC will start producing, with a higher energy per proton-proton collision, in 2015.

I covered four topics — I’ll be a bit long-winded here, so just skip over this part if it bores you.

1. Non-Standard-Model (or “exotic”) Higgs Decays: a lightweight Higgs particle, such as the one we’ve recently discovered, is very sensitive to novel effects, and can reveal them by decaying in unexpected ways. One class of possibilities, studied by a very wide range of theorists over the past decade, is that the Higgs might decay to unknown lightweight particles (possibly related in some way to dark matter). I’ve written about these possible Higgs decays a lot (here, here, here, here, here, here and here). This was a big topic of mine at the last SEARCH workshop, and is related to the issue of data parking/delaying. In recent months, a bunch of young theorists (with some limited help and advice from me) have been working to write an overview article, going systematically through the most promising non-Standard-Model decay modes of the Higgs, and studying how easy or difficult it will be to measure them. Discoveries using the 2011-2012 data are certainly possible! And at least at CMS, the parked data is going to play an important role.

2. What Variants of “Natural” Supersymmetry (And Related Models) Are Still Allowed By ATLAS and CMS Searches? A natural variant of supersymmetry (see my discussion of “naturalness”=genericity here) is one in which the Higgs particle’s mass and the Higgs field’s value (and therefore the W and Z particles’ masses) wouldn’t change drastically if you were somehow to vary the masses of superpartner particles by small amounts. Such variants tend to have the superpartner particle of the Higgs (called the “Higgsino”) relatively light (a few hundred GeV/c² or below), the superpartner of the top quark (the “top squark”, with which the Higgs interacts very strongly) also relatively light, and the superpartner of the gluon (the “gluino”) up in the 1-2 TeV/c² range. If the gluino is heavier than 1.4 TeV/c² or so, then it is too heavy to have been produced during the 2011-2012 LHC run; for variants with such a heavy gluino, we may have to wait until 2015 and beyond to discover or rule them out. But it turns out that if the gluino is light enough (generally a bit above 1 TeV/c²), it is possible to make very general arguments, without resorting to the three assumptions that go into the most classic searches for supersymmetry, that almost all such natural and currently accessible variants are now ruled out. I say “almost” because there is at least one class of important exceptions where the case is clearly not yet closed, and for which the gluino mass could be well below 1 TeV/c². [Research to completely characterize the situation is still in progress; I’m working on it with Rutgers faculty member David Shih and postdocs Yevgeny Kats and Jared Evans.] What we’ve learned is applicable beyond supersymmetry to certain other classes of speculative ideas.

3. Long-Lived Particles: In most LHC studies, it is assumed that any currently unknown particles that are produced in LHC collisions will decay in microscopic times to particles we know about. But it is also possible that one or more new types of particles will decay only after traveling a measurable distance (about 1 millimeter or greater) from the collision point. Searching for such “long-lived” particles (with lifetimes longer than a trillionth of a second!) is complicated; there are many cases to consider, a non-standard search strategy is almost always required, and sometimes specialized trigger strategies are needed. (A short numerical sketch of the decay-length arithmetic appears after this list.) Until recently, only a few studies had been carried out, many with only 2011 data. A very important advance occurred very recently, however, when CMS produced a study, using the full 2011-2012 data set, looking for a long-lived particle that decays to two jets (or to anything that looks to the detector like two jets, which is a bit more general) after traveling up to a large fraction of a meter. The specialized trigger that was used requires about 300 GeV of energy or more to be produced in the proton-proton collision in the form of jets (or things that look like jets to the triggering system). This is too much for the search to detect a Higgs particle decaying to one or two long-lived particles, because a Higgs particle’s mass-energy [E=mc² energy] is only 125 GeV, and it is therefore rather rare for 300 GeV of energy in jets and jet-like objects to be observed when a Higgs is produced. But in many speculative theories with long-lived particles, this amount of energy is easily obtained. As a result, this new CMS search clearly wipes out, at one stroke, many variants of a number of speculative models. It will take theorists a little while to fully understand the impact of this new search, but it will be big. Still, it’s by no means the final word. We need to push harder, improving and broadening the use of these methods, so that decays of the Higgs itself to long-lived particles can be searched for. This has been done already in a handful of cases (for example if the long-lived particle decays not to jets but to a muon/anti-muon pair or an electron/positron pair, or if the long-lived particle travels several meters before it decays), and in some cases it is already possible to show that at most 1 in 100 to 1000 Higgs particles produce long-lived particles of this type. For some other cases, the triggers developed for the parked data may be crucial.

4. “Soft” Signals: A frontier that has never been explored, but which theorists have been talking about for some years, is one in which a high-energy process associated with a new particle is typically accompanied by an unusually large number of very low-energy particles (typically photons or hadrons with energy below a few GeV). The high-energy process is mimicked by certain common processes that occur in the Standard Model, and consequently the signal is drowned out, like a child’s voice in a crowded room. But the haze of a large number of low-energy particles that accompanies the signal is rare in the mimicking processes, so by keeping only those collisions that show something like this haze, it becomes possible to throw out the mimicking process most of the time, making the signal stand out — as though, in trying to find the child, one could identify a way to get most of the people to leave the room, reducing the noise enough for the child’s voice to be heard. [For experts: The most classic example of this situation arises in certain types of objects called “quirks”, though perhaps there are other examples. For non-experts: I’ll explain what quirks are some other time; it’s a sophisticated story.]
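To make the numbers in topic 3 above concrete, here is a little decay-length arithmetic (a sketch with made-up lifetimes and boosts, not tied to the CMS search or to any particular model): a particle with proper lifetime τ travels a typical distance βγcτ before decaying, and the fraction of such particles decaying beyond any given displacement falls off exponentially.

```python
from math import exp

C = 3.0e8   # speed of light, in meters per second

def fraction_decaying_beyond(displacement_m, lifetime_s, boost_betagamma):
    """Fraction of particles that decay farther than displacement_m from the collision
    point, for proper lifetime lifetime_s and relativistic boost factor beta*gamma."""
    mean_decay_length = boost_betagamma * C * lifetime_s
    return exp(-displacement_m / mean_decay_length)

# Illustrative numbers only: a lifetime of a trillionth of a second (1 picosecond)
# corresponds to c*tau of about 0.3 millimeters.
lifetime = 1.0e-12   # seconds
for boost in (1.0, 3.0, 10.0):
    frac = fraction_decaying_beyond(1.0e-3, lifetime, boost)
    print(f"beta*gamma = {boost:4.1f}: fraction decaying beyond 1 mm = {frac:.2f}")
```

This is why “long-lived” in this context means lifetimes of roughly a trillionth of a second or longer: only then does a sizable fraction of the particles travel the millimeter or more that makes the displacement measurable.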

I was pleased that there was lively discussion on all of these four points; that’s essential for a good workshop.

After me there were talks by ATLAS expert Erez Etzion and CMS’s Steve Wurm, surveying a large number of searches for new particles and other phenomena by the two experiments. One new result that particularly caught my eye was a set of CMS searches for new very heavy particles that decay to pairs of W and/or Z particles. The W and Z particles go flying outwards with tremendous energy, and form the kind of jet-like objects I mentioned yesterday in the context of Jesse Thaler’s talk on “jet substructure”. This and a couple of other related measurements reflect our move into a new era, in which detection of jet-like W and Z particles and jet-like top quarks has become part of the standard toolbox of a particle physicist.

The workshop concluded with three hour-long panel discussions:

  1. on the possible interplay between dark matter and LHC research (for instance: how production of “friends” of dark matter [i.e., particles that are somehow related to dark matter particles] may be easier to detect at the LHC than production of dark matter itself)
  2. on the highest priorities for the 2013-2014 shutdown period before the LHC restarts (for instance, conversations between theorists and experimentalists about the trigger strategies that should be used in the next LHC run)
  3. on what the opportunities of the 2015-2020 run of the LHC are likely to be, and what their implications may be (for instance, the ability to finally reach the 3 TeV/c² mass range for the types of particles one would expect in the so-called “Randall-Sundrum” class of extra-dimensions models; the opportunities to look for very rare Higgs, top and W decays; and the potential to complete the program I outlined above of ruling out all but a very small class of natural variants of supersymmetry.)

All in all, a useful workshop — but its true value will depend on how much we all follow up on what we discussed.

POSTED BY Matt Strassler

ON August 23, 2013

Day 2 of the SEARCH workshop will get a shorter description than it deserves, because I’ve had to spend time finishing my own talk for this morning. But there were a lot of nice talks, so let me at least tell you what they were about.

Both ATLAS and CMS presented their latest results on searches for supersymmetry. (I should remind you that “searches for supersymmetry” are by no means actually limited to supersymmetry — they can be used to discover or exclude many other new particles and forces that have nothing to do with supersymmetry at all.) Speakers Pascal Pralavorio and Sanjay Padhi gave very useful overviews of the dozens of searches that have been done so far as part of this effort, including a few rather new results that are very powerful. (We should see even more appear at next week’s Supersymmetry conference.) My short summary: almost everything easy has been done thoroughly; many challenging searches have also been carried out; if superpartner particles are present, they’re either

  • so heavy that they aren’t produced very often (e.g. gluinos)
  • rather lightweight, but still not so often produced (e.g. top squarks, charginos, neutralinos, sleptons)
  • produced often, but decaying in some way that is very hard to detect (e.g. gluinos decaying only to quarks, anti-quarks and gluons)

Then we had a few talks by theorists. Patrick Meade talked about how unknown particles that are affected by weak nuclear and electromagnetic forces, but not by strong nuclear forces, could give signs that are hiding underneath processes that occur in the Standard Model. (Examples of such particles are the neutralinos and charginos or sleptons of supersymmetry.) To find them requires increased precision in our calculations and in our measurements of processes where pairs of W and/or Z and/or Higgs particles are produced. As a definite example, Meade noted that the rate for producing pairs of W particles disagrees somewhat with current predictions based on the Standard Model, and emphasized that this small disagreement could be due to new particles (such as top squarks, or sleptons, or charginos and neutralinos), although at this point there’s no way to know.

Matt Reece gave an analogous talk about spin-zero quark-like particles that do feel strong nuclear forces, the classic example of which is the top squark. Again, the presence of these particles can be hidden underneath the large signals from production of top quark/anti-quark pairs, or other common processes. ATLAS and CMS have been working hard to look for signals of these types of particles, and have made a lot of progress, but there are still quite a few possible signals that haven’t been searched for yet. Among other things, Reece discussed some methods invented by theorists that might be useful in contributing to this effort. As with the previous talk, the key to a complete search will be improvements in calculations and measurements of top quark production, and of other processes that involve known particles.

After lunch there was a more general discussion about looking for supersymmetry, including conversation about what variants of supersymmetry haven’t yet been excluded by existing ATLAS and CMS searches.  (I had a few things to say about that in my talk, but more on that tomorrow.)

Jesse Thaler gave a talk reviewing the enormous progress that has been made in understanding how to distinguish ordinary jets, which arise from quarks and gluons, from jet-like objects made from a single high-energy W, Z, Higgs or top quark that decays to quarks and anti-quarks. (The jargon is that the trick is to use “jet substructure” — the fact that inside a jet-like W are two sub-jets, each from a quark or anti-quark.) At SEARCH 2012, the experimenters showed very promising though preliminary results using a number of new jet substructure methods that had been invented by (mostly) theorists. By now, the experimenters have shown definitively that these methods work — and will continue to work as the rate of collisions at the LHC grows — and have made a number of novel measurements using them. Learning how to use jet substructure is one of the great success stories of the LHC era, and it will continue to be a major story in coming years.
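A rough rule of thumb (my own illustration, not something from the talk) shows why very energetic W’s look like single jets: the two quarks from the two-body decay of a particle of mass m and transverse momentum pT are separated by an angle of roughly ΔR ≈ 2m/pT, so once pT reaches a few hundred GeV the pair fits inside a single typical jet cone.

```python
M_W = 80.4   # W boson mass in GeV

def approx_angular_separation(mass_gev, pt_gev):
    """Rough opening angle (in the usual Delta-R measure) between the two decay products
    of a boosted particle; the standard rule of thumb is Delta-R ~ 2 * m / pT."""
    return 2.0 * mass_gev / pt_gev

for pt in (100, 200, 400, 800):
    dr = approx_angular_separation(M_W, pt)
    print(f"W with pT = {pt:4d} GeV: Delta-R between the two quarks ~ {dr:.2f}")
```

Since ordinary jets are reconstructed with a radius of roughly 0.4 to 0.8, a W with several hundred GeV of transverse momentum shows up as one fat jet, and its two sub-jets are the tell-tale substructure.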

Two talks by ATLAS (Leandro Nisanti) and CMS (Matt Hearndon) followed, each with a long list of careful measurements of what the Standard Model is doing, mostly based so far only on the 2011 data set (and not yet including last year’s data). These measurements are crucially important for multiple reasons:

  • They provide important information which can serve as input to other measurements and searches.
  • They may reveal subtle problems with the Standard Model, due to indirect or small effects from unknown particles or forces.
  • Confirming that measurements of certain processes agree with theoretical predictions gives us confidence that those predictions can be used in other contexts, in particular in searches for unknown particles and forces.

Most, but not all, theoretical predictions for these careful measurements have worked well. Those that aren’t working so well are of course being watched and investigated carefully — but there aren’t any discrepancies large enough to get excited about yet (other than the top quark forward-backward asymmetry puzzle, which wasn’t discussed much today). In general, the Standard Model works beautifully — so far.

The day concluded with a panel discussion focused on these Standard Model measurements. Key questions discussed included: how do we use LHC data to understand the structure of the proton more precisely, and how in turn does that affect our searches for unknown phenomena? In particular, a major concern is the risk of circularity: that a phenomenon from an unknown type of particle could produce a subtle effect that we would fail to recognize for what it is, instead misinterpreting it as a small misunderstanding of proton structure, or as a small problem with a theoretical calculation. Such are the challenges of making increasingly precise measurements, and searching for increasingly rare phenomena, in the complicated environment of the LHC.

POSTED BY Matt Strassler

ON August 22, 2013

The first day of the SEARCH workshop was focused on current and future measurements of the new Higgs particle discovered in 2012. A lot of the issues I’ve written about before (for instance here and here) and most of the updates were rather technical, so I won’t cover them today. But I thought it useful to take a look at what was said by Raman Sundrum and separately by Nima Arkani-Hamed, whom you’ve heard about many times (for instance, here and here), on the subject of the hierarchy problem and “naturalness”.

First, let me remind you of the issue. The hierarchy problem can be phrased in many ways. Here’s one way to put it: for a Standard Model Higgs (the simplest possible type of Higgs particle) to show up, without any other new particles or forces at the Large Hadron Collider, is … well, let’s say it’s completely shocking, with a caveat. Why?

  • Because every spin-zero particle (or particle-like object) that has ever been observed, in particle physics and in similar contexts within solids and fluids, has been accompanied by new phenomena at an energy scale comparable to that particle’s mass-energy (E=mc² energy).
  • And although we cannot calculate the mass of the Higgs particle using the Standard Model (the equations we use to predict the behavior of the known particles and forces) — the Higgs particle’s mass is something we put into the equations, which is why we didn’t know, before the LHC, what it would be — there are many speculative theories that go beyond the Standard Model where the Higgs particle’s mass can be computed, or at least estimated. And in all of these cases, the Higgs particle is accompanied by other particles and forces that show up at scales comparable to the Higgs particle’s mass-energy.

This fact — that spin-zero particles like the Higgs are accompanied by other particles and forces at a similar energy scale — isn’t a mystery. Particle physicists (and others who use quantum field theory, the type of math used in the Standard Model) understand why this should be true, and have for several decades. The jargon is that it is “natural” (not meaning “from nature”, but rather meaning “generically true”) for spin-zero particles to have other particles and forces around at comparable energy scales. (I’ll explain the argument another time.)
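For readers who want a preview of that argument, here is the standard back-of-the-envelope version (a sketch, not a complete explanation): if the Standard Model is treated as valid up to some very high energy scale Λ, at which unknown particles or forces appear, then quantum effects of the top quark alone shift the Higgs mass-squared by roughly

```latex
% Schematic one-loop shift of the Higgs mass-squared from the top quark,
% with Lambda the energy scale at which new particles or forces appear:
\delta m_H^2 \;\sim\; -\,\frac{3\, y_t^2}{8\pi^2}\,\Lambda^2 ,
\qquad y_t \approx 1 \ \text{(the top quark's interaction strength with the Higgs field)}
```

For the observed Higgs mass-energy of about 125 GeV to emerge, this shift has to cancel against the other contributions to an accuracy of roughly (125 GeV / Λ)² (up to the loop factor), which is no problem at all if Λ is close to the Higgs mass-energy but requires an increasingly delicate cancellation as Λ grows. That is the sense in which a “lonely” lightweight spin-zero particle is unnatural.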

So to discover the Higgs particle at a mass-energy of 125 GeV, and no other new particles or phenomena below, say, 1000-2000 GeV or so, would fly in the face of what we’ve seen again and again in physics, both in past data and in calculations within speculative theories. In this sense, finding nothing except a Standard Model Higgs at the LHC would be shocking. (I say “would be” rather than “is” because the LHC is still young, and no overarching conclusions can yet be drawn from its current data.)

But — here’s the caveat — how bad is this shock? After all, somewhat surprising things do happen in nature all the time. Only astonishingly, spectacularly surprising things are very rare. Yes, it would be a very big shock if new particles and forces associated with the Higgs have a mass-energy a trillion trillion times higher than that of the Higgs. But what if they’re just a few times higher than would be natural, let’s say at 10,000 GeV — which would be out of reach of the LHC? Maybe that is a small enough shock that we shouldn’t pay it much attention.  Unfortunately, this is a judgment call; there’s no sharp answer to this question.
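One crude way to put numbers on the size of the shock (again my own illustrative estimate, using the schematic top-quark shift written above) is to ask how precise a cancellation is needed for a given scale Λ at which new particles appear.

```python
from math import pi

M_HIGGS = 125.0   # Higgs mass-energy in GeV
Y_TOP = 0.94      # approximate top-quark interaction strength with the Higgs field

def rough_tuning(new_physics_scale_gev):
    """Crude estimate of the top-quark shift to the Higgs mass-squared, compared with
    the observed value -- i.e. how delicate a cancellation would be needed."""
    shift = 3 * Y_TOP**2 / (8 * pi**2) * new_physics_scale_gev**2
    return shift / M_HIGGS**2

for scale_gev in (1000.0, 2000.0, 10000.0):
    print(f"New particles at {scale_gev/1000:.0f} TeV: cancellation to about 1 part in {rough_tuning(scale_gev):.0f}")
```

By this crude measure, new particles at 10,000 GeV, out of the LHC’s reach, would require a cancellation at the level of one part in a couple of hundred, uncomfortable but perhaps tolerable, whereas pushing them all the way up to the scales of gravity or grand unification would require a cancellation of one part in many trillions of trillions. Where to draw the line between a tolerable shock and an intolerable one is exactly the judgment call just described.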

Raman Sundrum and his three options.

As Sundrum put it, there are (crudely) three logically distinct possibilities for what lies ahead:

  • No shock: The hierarchy problem is resolved naturally; the associated new particles will soon be seen at the LHC.
  • Mild shock: The hierarchy problem is resolved in a roughly natural way; most of the associated new particles will be a bit beyond the reach of the LHC, but perhaps one or more will be lightweight enough to be discovered during the lifetime of the LHC.
  • Severe shock: The hierarchy problem is not resolved naturally; any associated particles may lie far out of reach, though of course other particles (associated, say, with dark matter) might still show up at the LHC.

Arkani-Hamed made a similar distinction, but addressed the third case in more detail, breaking it up into two sub-cases.

  • The solution to the hierarchy problem is that it results from a bias (= selection effect = a form of the “anthropic principle”); the universe is huge, complex and diverse, with particles and forces that differ from place to place [sometimes called a “multiverse”], and most of that universe is inhospitable to life of any sort; the reason we live in an unusual part of that universe, with a lightweight unaccompanied scalar particle, is that this happens to be the only place (or one of very few places) that life could have evolved. A key test of this argument is to show that if the particles and forces of nature were much different from what we find them to be in our part of the universe, then our environment would become completely inhospitable — perhaps there would be no atoms, or no stars. It is controversial whether this test has been passed; good arguments can be made on both sides.
  • The solution to the hierarchy problem involves a completely novel mechanism.   Easy to say — but got any ideas?  Arkani-Hamed gave us two examples of mechanisms which he had studied that he couldn’t make work — but perhaps someone else can do better.  One is based on trying to apply notions related to self-organized criticality, but he was never able to make much progress.  Another is based on an idea of Ed Witten’s that perhaps our world is best understood as one that
    1. has two dimensions of space (not the obvious three)
    2. is supersymmetric (which seems impossible, but in three spacetime dimensions, i.e. two of space plus one of time, supersymmetry and gravity together imply that particles and their superpartner particles need not have equal masses)
    3. has extremely strong forces

    All of this seems completely at odds with what we observe in our world. But! One of the important conceptual lessons from string theory [this is yet another example of something important that would not have been learned if people hadn’t actually been studying string theory] is that when forces become very strong, making the physics extremely complicated to describe, it is possible that a better description of that world becomes available — and that in some special cases, this better description has one additional dimension of space and weaker forces. In short, Witten’s idea is that our way of understanding our world, with three spatial dimensions, no apparent supersymmetry and no extremely strong forces, might actually be simply an alternative and simpler description of a supersymmetric world with only two spatial dimensions and an extremely strong force. Arkani-Hamed, trying to apply this to the hierarchy problem, noticed that this idea makes a prediction, but he showed that the prediction is false in the Standard Model, and it seems impossible to add any collection of particles that would make it true.

A well-dressed Nima Arkani-Hamed, waving his hands.
POSTED BY Matt Strassler

ON August 21, 2013
