Of Particular Significance

I’m still early on in my attempts to explain the “naturalness problem of the Standard Model” and its implications.  A couple of days ago I explained what particle physicists mean by the term “natural” — it means “typical” or “generic”.  And I described how, at least from one naive point of view, the Standard Model (the equations we use to describe the known elementary particles and forces) is unnatural.  Indeed, any theory is unnatural that has

  • a spin-zero particle (in the Standard Model, the newly discovered Higgs particle), which
  • is very lightweight in the following sense: it has a very very low mass-energy compared to the energy at which gravity becomes a strong force, and which
  • isn’t accompanied (in the Standard Model specifically) by other related particles that also have small masses.

But I haven’t actually explained any of this yet; I’ve just described it.

Specifically, I haven’t yet begun to explain what causes the Standard Model to be unnatural.  This is important to do because, as many attentive readers naturally complained, my statements about the unnatural aspect of the Standard Model were based on a rather arbitrary-sounding statistical argument and on story-telling, which is hardly enough for scientific discussion.  Patience; I’ll get there, not today, but probably in the next installment after today’s.

To see why the argument I gave is actually legitimate (which doesn’t mean it is right, but if it’s wrong, it’s not for a simple reason you’ll think of in five minutes), it is necessary to look in a little more detail at one of the most fundamental aspects of quantum field theory: quantum fluctuations, and the energy they carry.  So for today I have written a reasonably complete article about this.

Be prepared — the article runs headlong into the only naturalness problem in particle physics that is worse than the naturalness problem of the Standard Model (the one I wrote about on Tuesday)!  I am referring to the “cosmological constant problem”.  In a nutshell:

  • we can calculate that, in any typical quantum field theory with gravity, the amount of energy in empty space (often called “dark energy”) should be huge, and we know of no way to avoid having it in a typical, somewhat-realistic theory of the universe,
  • yet measurements of the cosmos — in fact, the very existence of a large and old universe — assure us that, if Einstein’s theory of gravity is basically right, then instead of a huge amount of “dark energy”, there’s just a very small amount — not much more than the total amount of mass-energy [E=mc² energy] found in all the matter that’s scattered thinly throughout the universe.
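
To get a rough feel for the size of this mismatch, here is a minimal back-of-the-envelope sketch in Python.  The numbers are round ones chosen for illustration (they are not taken from the article): it compares the naive guess, one Planck energy of fluctuation energy per Planck-sized volume of empty space, against the measured dark-energy density.

```python
import math

# Back-of-the-envelope version of the cosmological constant problem.
# Round numbers chosen for illustration; this is a heuristic sketch,
# not a calculation taken from the article.

hbar = 1.055e-34   # reduced Planck constant, J*s
c    = 3.0e8       # speed of light, m/s
G    = 6.674e-11   # Newton's gravitational constant, m^3 kg^-1 s^-2

# The Planck scale: where gravity becomes a strong force.
E_planck = math.sqrt(hbar * c**5 / G)    # ~2e9 J, i.e. ~1e19 GeV
l_planck = math.sqrt(hbar * G / c**3)    # ~1.6e-35 m

# Naive guess: quantum fluctuations contribute roughly one Planck
# energy per Planck-sized volume of empty space.
rho_qft = E_planck / l_planck**3         # J/m^3

# Measured dark-energy density: roughly 6e-10 J/m^3
# (about 70% of the cosmic critical density).
rho_obs = 6e-10                          # J/m^3

print(f"naive QFT estimate: {rho_qft:.1e} J/m^3")
print(f"observed value    : {rho_obs:.1e} J/m^3")
print(f"mismatch          : about 10^{math.log10(rho_qft / rho_obs):.0f}")
```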

After you’ve read about quantum fluctuations and the cosmological constant problem, and have a bit of a sense as to why it is so hard to make it go away, we can go back to the Standard Model, and try to understand the naturalness problem that is associated with the Higgs particle and field.  It all has to do with another aspect of quantum fluctuations — the fact that their energy depends on, and therefore helps determine, the average value of the Higgs field.

POSTED BY Matt Strassler

ON August 30, 2013

So I got the following questions from a high school English teacher this morning, and I thought, for fun, I’d put the answers here for you to enjoy. Here (slightly abridged) is what the teacher wrote, and my answers:

I’ve turned my classroom into a video game to increase student engagement. In my gamified classroom, the villain is experimenting with/on the Higgs field. Your article on what would happen if the Higgs field was turned off answered a lot of my questions, but … I was hoping you could answer a couple of questions for me. I am sure these questions probably don’t have “real” answers, and are completely ridiculous, but I’d love to hear from you. (more…)

POSTED BY Matt Strassler

ON August 29, 2013

Today (as I sit in a waiting room for jury service) I’ll draw your attention to something that has been quite rare at the Large Hadron Collider [LHC]: a notable discrepancy between prediction and data.  (Too rare, in fact — when you make so many measurements, some of them should be discrepant; the one place we saw plenty of examples was in the search for and initial study of the Higgs particle.)  It’s not big enough to declare as a definite challenge to the Standard Model (the equations we use to describe the known particles and forces), but it’s one we’ll need to be watching… and you can bet there will be dozens of papers trying to suggest possibilities for what this discrepancy, if it is real, might be due to.
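
The parenthetical point, that among many measurements some should disagree with predictions purely by chance, is easy to quantify.  Here is a small Python illustration; the numbers of measurements are hypothetical choices of mine, and Gaussian statistics are assumed.

```python
from math import erf, sqrt

def p_single(n_sigma):
    """Two-sided probability that a single measurement fluctuates
    by at least n_sigma, assuming Gaussian statistics."""
    return 1.0 - erf(n_sigma / sqrt(2.0))

# Chance that at least one of N independent measurements shows a
# "3 sigma discrepancy" purely by luck (the N values are hypothetical):
for n_meas in (10, 100, 1000):
    p_any = 1.0 - (1.0 - p_single(3.0)) ** n_meas
    print(f"{n_meas:5d} measurements -> P(>=1 fake 3-sigma effect) = {p_any:.2f}")
```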

The discrepancy has arisen in the search at the CMS experiment for “multileptons”: for proton-proton collisions in which at least three charged leptons — electrons, muons and (to a degree) taus — were produced. Such events are a good place to look for new phenomena: very rare in the Standard Model, but in the context of some speculative ideas (including the possibility of additional types of Higgs particles, or of superpartner particles from supersymmetry, or new light neutral particles that decay sometimes to lepton/anti-lepton pairs, etc.) they can be produced in the decays of some unknown type of particle. (more…)

POSTED BY Matt Strassler

ON August 28, 2013

Arguably the two greatest problems facing particle physicists, cosmologists, string theorists, and the like are both associated with an apparent failure of a notion called “naturalness”.  Until now, I’ve mostly avoided this term on this site, because to utter the word demands an extended explanation.  After all, how could nature be unnatural, by definition?

Well, the answer is that the word “natural” has multiple meanings.  The one that scientists are using in this context isn’t “having to do with nature” but rather “typical” or “as expected” or “generic”, as in, “naturally the baby started screaming when she bumped her head”, or “naturally it costs more to live near the city center”, or “I hadn’t worn those glasses in months, so naturally they were dusty.”  And unnatural is when the baby doesn’t scream, when the city center is cheap, and when the glasses are pristine. Usually, when something unnatural happens, there’s a good reason.

I’ve started writing an article about naturalness and unnaturalness, and how there are two great mysteries about how unnatural our universe is, one of which lies at the heart of the Large Hadron Collider‘s [LHC’s] research program.  What I’ve written so far explains what naturalness means and (in part) how it applies to the Standard Model (the equations we use to describe the known elementary particles and forces).  I’ll be extending the article to explain this in more detail, and to explain the scientific argument as to why it is so unnatural to have a Higgs particle that is “lonely” — with no other associated particles (beyond the ones we already know) of roughly similar mass.  This in turn is why so many particle physicists have long expected the LHC to discover more than just a single Higgs particle and nothing else — more than just the Standard Model’s one and only missing piece.  And it is why it will be a profound discovery, with far-reaching implications, if, during the next five years or so, the LHC experts sweep the floor clean and find nothing more in the LHC’s data than the Higgs particle that was found in 2012.

POSTED BY Matt Strassler

ON August 27, 2013

Day 3 of the SEARCH workshop (see here for an introduction and overviews of Day 1 and Day 2) opened with my own talk, entitled “On The Frontier: Where New Physics May Be Hiding”. The issue I was addressing is this:

Even though dozens of different strategies have been used by the experimenters at ATLAS and CMS (the two general purpose experiments at the Large Hadron Collider [LHC]) to look for various types of new particles, there are still many questions that haven’t been asked and many aspects of the data that haven’t been studied. My goal was to point out a few of these unasked or incompletely asked questions, ones that I think are very important for ATLAS and CMS experts to investigate… both in the existing data and also in the data that the LHC will start producing, with a higher energy per proton-proton collision, in 2015.

I covered four topics — I’ll be a bit long-winded here, so just skip over this part if it bores you.

1. Non-Standard-Model (or “exotic”) Higgs Decays: a lightweight Higgs particle, such as the one we’ve recently discovered, is very sensitive to novel effects, and can reveal them by decaying in unexpected ways. One class of possibilities, studied by a very wide range of theorists over the past decade, is that the Higgs might decay to unknown lightweight particles (possibly related in some way to dark matter). I’ve written about these possible Higgs decays a lot (here, here, here, here, here, here and here). This was a big topic of mine at the last SEARCH workshop, and is related to the issue of data parking/delaying. In recent months, a bunch of young theorists (with some limited help and advice from me) have been working to write an overview article, going systematically through the most promising non-Standard-Model decay modes of the Higgs, and studying how easy or difficult it will be to measure them.  Discoveries using the 2011-2012 data are certainly possible!  And at least at CMS, the parked data is going to play an important role.

2. What Variants of “Natural” Supersymmetry (And Related Models) Are Still Allowed By ATLAS and CMS Searches? A natural variant of supersymmetry (see my discussion of “naturalness”=genericity here) is one in which the Higgs particle’s mass and the Higgs field’s value (and therefore the W and Z particles’ masses) wouldn’t change drastically if you were somehow to vary the masses of superpartner particles by small amounts. Such variants tend to have the superpartner particle of the Higgs (called the “Higgsino”) relatively light (a few hundred GeV/c² or below), the superpartner of the top quark (the “top squark”, with which the Higgs interacts very strongly) also relatively light, and the superpartner of the gluon (the “gluino”) up in the 1-2 TeV/c² range. If the gluino is heavier than 1.4 TeV/c² or so, then it is too heavy to have been produced during the 2011-2012 LHC run; for variants with such a heavy gluino, we may have to wait until 2015 and beyond to discover them or rule them out. But it turns out that if the gluino is light enough (generally a bit above 1 TeV/c²), it is possible to make very general arguments, without resorting to the three assumptions that go into the most classic searches for supersymmetry, that almost all such natural and currently accessible variants are now ruled out. I say “almost” because there is at least one class of important exceptions where the case is clearly not yet closed, and for which the gluino mass could be well below 1 TeV/c². [Research to completely characterize the situation is still in progress; I’m working on it with Rutgers faculty member David Shih and postdocs Yevgeny Kats and Jared Evans.]  What we’ve learned is applicable beyond supersymmetry to certain other classes of speculative ideas.
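
To make “wouldn’t change drastically” a bit more quantitative, here is a toy Python version of the rough estimate theorists commonly use: the top squark’s one-loop correction to the Higgs mass-squared parameter, compared with the observed Higgs mass.  The cutoff scale is an illustrative assumption of mine, and conventions and O(1) factors vary from paper to paper, so read the output only as a trend: heavier top squarks mean more fine-tuning.

```python
from math import pi, log

# Toy version of the standard naturalness estimate: the top squark's
# one-loop correction to the Higgs mass-squared parameter. Conventions,
# signs and O(1) factors vary between papers; the cutoff below is an
# illustrative assumption, so treat the output only as a trend.

y_t    = 1.0     # top Yukawa coupling (close to 1)
m_h    = 125.0   # Higgs particle mass, GeV
cutoff = 1.0e16  # assumed scale where new physics enters, GeV

def tuning(m_stop):
    """Rough fine-tuning: the top-squark loop's shift of the Higgs
    mass-squared, compared with (m_h^2)/2."""
    dm2 = (3 * y_t**2 / (8 * pi**2)) * m_stop**2 * log(cutoff**2 / m_stop**2)
    return dm2 / (m_h**2 / 2)

for m_stop in (300.0, 700.0, 1500.0):   # top squark masses in GeV
    print(f"m_stop = {m_stop:6.0f} GeV -> tuning ~ 1 part in {tuning(m_stop):.0f}")
```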

3. Long-Lived Particles: In most LHC studies, it is assumed that any currently unknown particles that are produced in LHC collisions will decay in microscopic times to particles we know about. But it is also possible that one or more new types of particles will decay only after traveling a measurable distance (about 1 millimeter or greater) from the collision point. Searching for such “long-lived” particles (with lifetimes longer than a trillionth of a second!) is complicated; there are many cases to consider, a non-standard search strategy is almost always required, and sometimes specialized trigger strategies are needed. Until recently, only a few studies had been carried out, many with only 2011 data. A very important advance occurred very recently, however, when CMS produced a study, using the full 2011-2012 data set, looking for a long-lived particle that decays to two jets (or to anything that looks to the detector like two jets, which is a bit more general) after traveling up to a large fraction of a meter. The specialized trigger that was used requires about 300 GeV of energy or more to be produced in the proton-proton collision in the form of jets (or things that look like jets to the triggering system). This is too much for the search to detect a Higgs particle decaying to one or two long-lived particles, because a Higgs particle’s mass-energy [E=mc² energy] is only 125 GeV, and it is therefore rather rare for 300 GeV of energy in jets and jet-like objects to be observed when a Higgs is produced. But in many speculative theories with long-lived particles, this amount of energy is easily obtained. As a result, this new CMS search clearly wipes out, at one stroke, many variants of a number of speculative models. It will take theorists a little while to fully understand the impact of this new search, but it will be big. Still, it’s by no means the final word.  We need to push harder, improving and broadening the use of these methods, so that decays of the Higgs itself to long-lived particles can be searched for. This has been done already in a handful of cases (for example, if the long-lived particle decays not to jets but to a muon/anti-muon pair or an electron/positron pair, or if the long-lived particle travels several meters before it decays), and in some cases it is already possible to show that at most 1 in 100 to 1000 Higgs particles produce long-lived particles of this type.  For some other cases, the triggers developed for the parked data may be crucial.
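
Why does a lifetime of a mere trillionth of a second count as “long-lived”? A particle’s average flight distance is beta times gamma times c times tau, and the boost factor gamma can be sizable at the LHC.  The sketch below uses a hypothetical particle mass, energy and set of lifetimes, chosen only to show that a lifetime of about 10⁻¹² seconds already gives a displacement near a millimeter, which the detectors can resolve.

```python
# Mean flight distance of an unstable particle: d = beta * gamma * c * tau.
# The mass, energy and lifetimes below are hypothetical, chosen only to
# show the scale of the effect.
c = 3.0e8  # speed of light, m/s

def decay_length(tau_s, mass_gev, energy_gev):
    """Average lab-frame decay distance, in meters."""
    gamma = energy_gev / mass_gev              # relativistic boost
    beta = (1.0 - 1.0 / gamma**2) ** 0.5       # speed in units of c
    return beta * gamma * c * tau_s

# A hypothetical 50 GeV particle carrying 100 GeV of energy:
for tau in (1e-13, 1e-12, 1e-10):
    d = decay_length(tau, mass_gev=50.0, energy_gev=100.0)
    print(f"lifetime {tau:.0e} s -> mean decay distance {1000 * d:.2f} mm")
```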

4. “Soft” Signals: A frontier that has never been explored, but which theorists have been talking about for some years, is one in which a high-energy process associated with a new particle is typically accompanied by an unusually large number of very low-energy particles (typically photons or hadrons with energy below a few GeV). The high-energy process is mimicked by certain common processes that occur in the Standard Model, and consequently the signal is drowned out, like a child’s voice in a crowded room. But the haze of a large number of low-energy particles that accompanies the signal is rare in the mimicking processes, so by keeping only those collisions that show something like this haze, it becomes possible to throw out the mimicking process most of the time, making the signal stand out — as though, in trying to find the child, one could identify a way to get most of the people to leave the room, reducing the noise enough for the child’s voice to be heard. [For experts: The most classic example of this situation arises in certain types of objects called “quirks”, though perhaps there are other examples. For non-experts: I’ll explain what quirks are some other time; it’s a sophisticated story.]
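
The gain from such a “haze” requirement can be seen in a toy counting experiment: even though the cut sacrifices some signal, it removes so much of the mimicking background that the signal stands out far more clearly.  All the numbers in this Python sketch, including the two efficiencies, are invented for illustration.

```python
from math import sqrt

# Toy counting experiment for the "soft haze" idea. All numbers are
# invented for illustration; s/sqrt(b) is the usual rough measure of
# how clearly a signal stands out above background.
s, b = 50.0, 100000.0        # expected signal and mimicking-background events
eff_s, eff_b = 0.70, 0.001   # assumed efficiencies of the haze requirement

print(f"significance before cut: {s / sqrt(b):.2f} sigma")
print(f"significance after cut : {s * eff_s / sqrt(b * eff_b):.2f} sigma")
```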

I was pleased that there was lively discussion on all of these four points; that’s essential for a good workshop.

After me there were talks by ATLAS expert Erez Etzion and CMS’s Steve Wurm, surveying a large number of searches for new particles and other phenomena by the two experiments. One new result that particularly caught my eye was a set of CMS searches for new very heavy particles that decay to pairs of W and/or Z particles.  The W and Z particles go flying outwards with tremendous energy, and form the kind of jet-like objects I mentioned yesterday in the context of Jesse Thaler’s talk on “jet substructure”.  This and a couple of other related measurements reflect the fact that we are moving into a new era, in which detection of jet-like W and Z particles and jet-like top quarks has become part of the standard toolbox of a particle physicist.

The workshop concluded with three hour-long panel discussions:

  1. on the possible interplay between dark matter and LHC research (for instance: how production of “friends” of dark matter [i.e., particles that are somehow related to dark matter particles] may be easier to detect at the LHC than production of dark matter itself)
  2. on the highest priorities for the 2013-2014 shutdown period before the LHC restarts (for instance, conversations between theorists and experimentalists about the trigger strategies that should be used in the next LHC run)
  3. on what the opportunities of the 2015-2020 run of the LHC are likely to be, and what their implications may be (for instance, the ability to finally reach the 3 TeV/c² mass range for the types of particles one would expect in the so-called “Randall-Sundrum” class of extra-dimensions models; the opportunities to look for very rare Higgs, top and W decays; and the potential to complete the program I outlined above of ruling out all but a very small class of natural variants of supersymmetry).

All in all, a useful workshop — but its true value will depend on how much we all follow up on what we discussed.

POSTED BY Matt Strassler

ON August 23, 2013

Day 2 of the SEARCH workshop will get a shorter description than it deserves, because I’ve had to spend time finishing my own talk for this morning. But there were a lot of nice talks, so let me at least tell you what they were about.

Both ATLAS and CMS presented their latest results on searches for supersymmetry. (I should remind you that “searches for supersymmetry” are by no means actually limited to supersymmetry — they can be used to discover or exclude many other new particles and forces that have nothing to do with supersymmetry at all.) Speakers Pascal Pralavorio and Sanjay Padhi gave very useful overviews of the dozens of searches that have been done so far as part of this effort, including a few rather new results that are very powerful. (We should see even more appear at next week’s Supersymmetry conference.) My short summary: almost everything easy has been done thoroughly; many challenging searches have also been carried out; if superpartner particles are present, they’re either

  • so heavy that they aren’t produced very often (e.g. gluinos)
  • rather lightweight, but still not so often produced (e.g. top squarks, charginos, neutralinos, sleptons)
  • produced often, but decaying in some way that is very hard to detect (e.g. gluinos decaying only to quarks, anti-quarks and gluons)

Then we had a few talks by theorists. Patrick Meade talked about how unknown particles that are affected by weak nuclear and electromagnetic forces, but not by strong nuclear forces, could give signs that are hiding underneath processes that occur in the Standard Model. (Examples of such particles are the neutralinos and charginos or sleptons of supersymmetry.) To find them requires increased precision in our calculations and in our measurements of processes where pairs of W and/or Z and/or Higgs particles are produced. As a definite example, Meade noted that the rate for producing pairs of W particles disagrees somewhat with current predictions based on the Standard Model, and emphasized that this small disagreement could be due to new particles (such as top squarks, or sleptons, or charginos and neutralinos), although at this point there’s no way to know.

Matt Reece gave an analogous talk about spin-zero quark-like particles that do feel strong nuclear forces, the classic example of which is the top squark. Again, the presence of these particles can be hidden underneath the large signals from production of top quark/anti-quark pairs, or other common processes. ATLAS and CMS have been working hard to look for signals of these types of particles, and have made a lot of progress, but there are still quite a few possible signals that haven’t been searched for yet. Among other things, Reece discussed some methods invented by theorists that might be useful in contributing to this effort. As with the previous talk, the key to a complete search will be improvements in calculations and measurements of top quark production, and of other processes that involve known particles.

After lunch there was a more general discussion about looking for supersymmetry, including conversation about what variants of supersymmetry haven’t yet been excluded by existing ATLAS and CMS searches.  (I had a few things to say about that in my talk, but more on that tomorrow.)

Jesse Thaler gave a talk reviewing the enormous progress that has been made in understanding how to distinguish ordinary jets, which arise from quarks and gluons, from jet-like objects made from a single high-energy W, Z, Higgs or top quark that decays to quarks and anti-quarks. (The jargon is that the trick is to use “jet substructure” — the fact that inside a jet-like W are two sub-jets, each from a quark or anti-quark.) At SEARCH 2012, the experimenters showed very promising though preliminary results using a number of new jet substructure methods that had been invented by (mostly) theorists. By now, the experimenters have shown definitively that these methods work — and will continue to work as the rate of collisions at the LHC grows — and have made a number of novel measurements using them. Learning how to use jet substructure is one of the great success stories of the LHC era, and it will continue to be a major story in coming years.
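
As a toy numerical illustration of the basic idea (my own invented example, not an experimental algorithm): if a jet-like object really is a boosted W decaying to a quark and an anti-quark, the combined invariant mass of its two sub-jets should come out near the W mass of roughly 80 GeV, whereas an ordinary quark or gluon jet typically has a much smaller mass.

```python
import numpy as np

# Toy illustration of jet substructure: the invariant mass of the two
# sub-jets inside a jet-like W should come out near m_W ~ 80 GeV.
# Four-vectors are (E, px, py, pz) in GeV; the numbers are invented,
# with each sub-jet taken to be massless.

def invariant_mass(p1, p2):
    """Invariant mass of the system formed by two four-vectors."""
    E, px, py, pz = p1 + p2
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

subjet_a = np.array([100.0, 96.0,  28.0, 0.0])  # first sub-jet
subjet_b = np.array([ 50.0, 30.0, -40.0, 0.0])  # second sub-jet

print(f"combined jet mass = {invariant_mass(subjet_a, subjet_b):.1f} GeV")
```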

Two talks by ATLAS (Leandro Nisanti) and CMS (Matt Hearndon) followed, each with a long list of careful measurements of what the Standard Model is doing, mostly based so far only on the 2011 data set (and not yet including last year’s data). These measurements are crucially important for multiple reasons:

  • They provide important information which can serve as input to other measurements and searches.
  • They may reveal subtle problems with the Standard Model, due to indirect or small effects from unknown particles or forces.
  • Confirming that measurements of certain processes agree with theoretical predictions gives us confidence that those predictions can be used in other contexts, in particular in searches for unknown particles and forces.

Most, but not all, theoretical predictions for these careful measurements have worked well. Those that aren’t working so well are of course being watched and investigated carefully — but there aren’t any discrepancies large enough to get excited about yet (other than the top quark forward-backward asymmetry puzzle, which wasn’t discussed much today). In general, the Standard Model works beautifully — so far.

The day concluded with a panel discussion focused on these Standard Model measurements. Key questions discussed included: how do we use LHC data to understand the structure of the proton more precisely, and how in turn does that affect our searches for unknown phenomena? In particular, a major concern is the risk of circularity: that a phenomenon from an unknown type of particle could produce a subtle effect that we would fail to recognize for what it is, instead misinterpreting it as a small misunderstanding of proton structure, or as a small problem with a theoretical calculation. Such are the challenges of making increasingly precise measurements, and searching for increasingly rare phenomena, in the complicated environment of the LHC.

POSTED BY Matt Strassler

ON August 22, 2013
