
Professor Peskin’s Four Slogans: Advice for the 2012 LHC


POSTED BY Matt Strassler

ON 03/22/2012

On Monday, during the concluding session of the SEARCH Workshop on Large Hadron Collider [LHC] physics (see also here for a second post), and at the start of the panel discussion involving a group of six theorists, Michael Peskin, professor of theoretical particle physics at the Stanford Linear Accelerator Center [and my Ph.D. advisor], opened the panel with a few PowerPoint slides.  He entitled them: “My Advice in Four Slogans” — the advice in question being aimed at experimentalists at ATLAS and CMS (the two general-purpose experiments at the LHC) as to how they ought best to search for new phenomena at the LHC in 2012, and beyond. Since I agree strongly with his points (as I believe most LHC theory experts do), I thought I’d tell you those four slogans and explain what they mean, at least to me. [I’m told the panel discussion will be posted online soon.]

1. No Boson Left Behind

There is a tendency in the LHC experimental community to assume that the new particles that we are looking for are heavy — heavier than any we’ve ever produced before. However, it is equally possible that there are unknown particles that are rather lightweight, but have evaded detection because they interact very weakly with the particles that we already know about, and in particular very weakly with the quarks and antiquarks and gluons that make up the proton.

Peskin’s advice is thus a warning: don’t just rush ahead to look for the heavy particles; remember the lightweight but hard-to-find particles you may have missed.

The word “boson” here is a minor point, I think. All particles are either fermions or bosons; I’d personally say that Peskin’s slogan applies to certain fermions too.

2. Exclude Triangles Not Points

The meaning of this slogan is less obscure than the slogan itself.  Its general message is this: if one is looking for signs of a new hypothetical particle which

  • is produced mostly or always in particle-antiparticle pairs, and
  • can decay in multiple ways,

one has to remember to search for collisions where the particle decays one way and the antiparticle decays a different way; the probability for this to occur can be high.  Most LHC searches have so far been aimed at those cases where both particle and anti-particle decay in the same way.  This approach can in some cases be quite inefficient.   In fact, to search efficiently, one must combine all the different search strategies.

Now what does this have to do with triangles and points?  If you’d like to know, jump to the very end of this post, where I explain the example that motivated this wording of the slogan.  For those not interested in those technical details, let’s go to the next slogan.

3. Higgs Implies Higgs in BSM

[The Standard Model is the set of equations used to predict the behavior of all the known particles and forces, along with the simplest possible type of Higgs particle (the Standard Model Higgs). Any other phenomenon is by definition Beyond the Standard Model: BSM.]

 [And yes, one may think of the LHC as a machine for converting theorists’ B(SM) speculations into (BS)M speculations.]

One of the main goals of the LHC is to find evidence of one or more types of Higgs particles that may be found in nature.  There are two main phases to this search, Phase 1 being the search for the “Standard Model Higgs”, and Phase 2 depending on the result of Phase 1.  You can read more about this here.

Peskin’s point is that the Higgs particle may itself be a beacon, signalling new phenomena not predicted by the Standard Model. It is common in many BSM theories that there are new ways of producing the Higgs particle, typically in decays of as-yet-unknown heavy particles. Some of the resulting phenomena may be quite easy to discover, if one simply remembers to look!

Think what a coup it would be to discover not only the Higgs particle but also an unexpected way of making it! Two Nobel prize-winning discoveries for the price of one!!

Another equally important way to read this slogan (and I’m not sure why Peskin didn’t mention it — maybe it was too obvious, and indeed every panel member said something about this during the following discussion) is that everything about the Higgs particle needs to be studied in very great detail. Most BSM theories predict that the Higgs particle will behave differently from what is predicted in the Standard Model, possibly in subtle ways, possibly in dramatic ways. Either its production mechanisms or its decay rates, or both, may easily be altered. So we should not assume that a Higgs particle that looks at first like a Standard Model Higgs actually is a Standard Model Higgs. (I’ve written about this here, here and here.)  Even a particle that looks very much like a Standard Model Higgs may offer, through precise measurements, the first opportunity to dethrone the Standard Model.

4. BSM Hides Beneath Top

At the Tevatron, the LHC’s predecessor, the top quark was first discovered, via production of top quark/anti-quark pairs, but such pairs were produced there only rarely. The LHC, by contrast, has so much energy per collision that it has no trouble producing these particles. ATLAS and CMS have each witnessed about 800,000 top quark/anti-quark pairs so far.

Of course, this is great news, because the huge amount of LHC data on top quarks from 2011 allowed measurements of the top quark’s properties that are far more precise than we had previously. (I wrote about this here.) But there’s a drawback. Certain types of new phenomena that might be present in nature may be very hard to recognize, because the rare collisions that contain them look too similar to the common collisions that contain a top quark/anti-quark pair.

Peskin’s message is that the LHC experimenters need to do very precise measurements of all the data from collisions that appear to contain the debris from top quarks, just in case it’s a little bit different from what the Standard Model predicts.

A classic example of this problem involves the search for a supersymmetric partner of a top quark, the “top squark”. Unlike the t’ quark discussed in the follow-up to slogan #2 at the end of this post, which would be produced at a fairly high rate and would be relatively easy to notice, top squarks would be produced at a rate that is several times smaller. [Technically, this has to do with the fact that the t’ would have spin 1/2 and the top squark would have spin 0.] Unfortunately, if the mass of the top squark is not very different from the mass of the top quark, then collisions that produce top squarks may look very similar indeed to ones that produce top quarks, and it may be a big struggle to separate them in the data. The only way to do it is to work hard — to make very precise measurements, and perhaps better calculations, that together can reveal the subtle differences between a pile of data that contains both top quark/anti-quark pairs and top squark/anti-squark pairs, and a pile of data that contains no squarks at all.

Following up on slogan #2: An example with a triangle.

Ok, now let’s see why the second slogan has something to do with triangles.

One type of particle that has been widely hypothesized over the years is a heavy version of the top quark, often given the unimaginative name of “top-prime.” For short, top is written t, so top-prime is written t’. The t’ may decay in various possible ways. I won’t list all of them, but three important ones that show up in many speculative theories are

  • t’ → W particle + bottom quark   (t’ → Wb)
  • t’ → Z particle + top quark      (t’ → Zt)
  • t’ → Higgs particle + top quark    (t’ → ht)

But we don’t know how often t’ quarks decay to Wb, or to Zt, or to ht; that’s something we’ll have to measure. [Let’s call the probability that a t’ decays to Wb “P1”, and similarly define P2 and P3 for Zt and ht].

Of course we have to look for the darn thing first; maybe there is no t’. Unfortunately, how we should look for it depends on P1, P2, and P3, which we don’t know. For instance, if P1 is much larger than P2 and P3, then we should look for collisions that show signs of producing a t’ quark and a bar(t’) antiquark decaying as t’ → W+ b and bar(t’) → W- bar(b). Or if P2 is much larger than P1 and P3, we should look for t’ → Z t and bar(t’) → Z bar(t).

[Figure: Peskin’s triangle for a t’ quark. At each vertex the probability for the decay labeling the vertex is 100%, while at dead center all three decays are equally probable. One must search in a way that is sensitive to all the possibilities.]

Peskin has drawn this problem of three unknown probabilities, whose sum is 1, as a triangle.  The three vertices of the triangle, labeled by Wb, Zt and ht, represent three extreme cases: P1=1 and P2=P3=0; P2=1 and P1=P3=0; and P3=1, P1=P2=0. Each point inside this triangle represents different possible non-zero values for P1, P2 and P3 (with P1+P2+P3 assumed to be 1.)  The center of the triangle is P1=P2=P3=1/3.
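
In case the geometry seems mysterious: this works because three non-negative numbers adding up to 1 are exactly the barycentric coordinates of a point of a triangle. In symbols (added here just as a restatement, with V_Wb, V_Zt and V_ht standing for the three vertices labeled Wb, Zt and ht):

    point(P1, P2, P3) = P1 · V_Wb + P2 · V_Zt + P3 · V_ht ,   where P1 + P2 + P3 = 1 and each Pi ≥ 0.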

Peskin’s point is that if the experiments only look for collisions where both quark and antiquark decay in the same way

  • t’ → W+ b and bar(t’) → W- bar(b);
  • t’ → Z t and bar(t’) → Z bar(t);
  • t’ → h t and bar(t’) → h bar(t);

which is what they’ve done so far, then they’ll only be sensitive to the cases for which P1 is by far the largest, P2 is by far the largest, or P3 is by far the largest — the regions near the vertices of the triangle.  But we know a number of very reasonable theories with P1=1/2 and P2=P3=1/4 — a point deep inside the triangle.  So the experimenters are not yet looking efficiently for this case.  Peskin is saying that to cover the whole triangle, one has to add three more searches, for

  • t’ → W+ b and bar(t’) → Z bar(t), or bar(t’) → W- bar(b) and t’ → Z t;
  • t’ → W+ b and bar(t’) → h bar(t), or bar(t’) → W- bar(b) and t’ → h t;
  • t’ → Z t and bar(t’) → h bar(t), or t’ → h t and bar(t’) → Z bar(t);

so as to cover that case (and more generally, the whole triangle) efficiently. Moreover, no one search is very effective on its own; one has to combine all six searches together.

His logic is quite general.  If you have a particle that decays in four different ways, the same logic applies but for a tetrahedron, and you need ten searches; if two different ways, it’s a line segment, and you need three searches.
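
If it helps to see the counting spelled out, here is a quick sketch (an illustration I’ve added, not Peskin’s; Python is used just for convenience) that lists the distinct searches for a given set of decay modes. Order doesn’t matter, since swapping which decay the particle does and which the antiparticle does gives the same search, so the count is the number of unordered pairs with repetition: n(n+1)/2 for n decay modes.

    from itertools import combinations_with_replacement

    def distinct_searches(decay_modes):
        # One search per unordered pair of decay modes; the quark takes one member
        # of the pair and the antiquark the other (or the same one, for the diagonal).
        return list(combinations_with_replacement(decay_modes, 2))

    print(distinct_searches(["Wb", "Zt", "ht"]))          # 6 searches (the triangle above)
    print(len(distinct_searches(["A", "B", "C", "D"])))   # 10 searches (the tetrahedron)
    print(len(distinct_searches(["A", "B"])))             # 3 searches (the line segment)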

27 Responses

  2. Hi Matt, welcome back! It is not clear to me why it should be easier to detect the supersymmetric partner of the very heavy top quark when they have failed to detect, as of today, SS partners of lighter particles, say electron, pion, proton, up and down quarks etc. Thanks.

    1. It’s a question of detail; the top superpartners can be, and in many models are, the lightest quark superpartners. The superpartners of particles that do not feel strong nuclear forces may be lighter, but they are harder to produce at a proton-proton collider, because quarks and gluons abundantly produce things that feel strong nuclear forces. Compared to a squark, an object of the same mass which does not feel strong nuclear forces is much more rarely produced (by factors of 100 or so).

      1. Thanks for the reply. I understand the second point, that non-strongly-interacting superpartners would have smaller cross sections for production. But do I understand right that even amongst strongly interacting particles such as quarks, the superpartners, whether they are heavier or not, do not necessarily preserve the mass hierarchy, i.e. they could be all over?
        Is there a simple review on masses of SS particles?

        1. 1) the masses of superpartner particles could be all over the place, in principle. Data seems to indicate that the partners of the up and down quarks (which are easy to make, because of the up and down quarks in the proton) are probably above 1 TeV/c^2 in mass, but data on the top squark (much harder to make) is much less constraining.

          2) There is no simple review on the masses of superpartner particles, because although supersymmetry predicts they exist, supersymmetry is broken (or “hidden”, to say it better) in nature, and there are so many ways to break supersymmetry that the masses of the superpartners could be anywhere. The only constraint is that in order to address the hierarchy puzzle, certain key superpartners (Higgs partner, top partner, and gluon partner) must be below a few TeV/c^2 in mass.

  3. Hi Matt,

    what are, in your opinion, the best examples of lightweight bosons (following slogan 1) to look for that are currently being neglected by experimental searches?

  4. The first slogan reminds me of a question I meant to ask before: are there interesting beyond-SM ideas whose most prominent signatures would lie at energies below those of current experiments, and which were missed the first time around (when detectors and analysis were much less advanced than today)? I know that extremely high precision measurements of atomic systems can sometimes be competitive (e.g. we have a group at my institution doing electron electric dipole moment experiments), but what about “high energy” but not as it were “LHC energy”?

    1. Yes, this is possible. I’ve written numerous papers about this [for what it’s worth, specifically in a large class of models called `hidden valleys’], as have a number of other people. There are in fact entire experimental programs devoted to examples of this:

      Here’s one:
      http://hallaweb.jlab.org/experiment/APEX/

      But sometimes having all that LHC energy is a *good* thing, even if you are looking for a particle that isn’t very heavy, because the easiest way to produce and/or observe the lightweight particle may be to find it in the decay of something which IS heavy. The Higgs particle, for example, may sometimes decay to currently unknown lightweight particles, which in turn may decay back to visible particles. See http://profmattstrassler.com/2012/01/27/exotic-decays-of-the-higgs-a-high-priority-for-2012/ for some discussion about the implications of this…

  5. I am confused about the triangle… because the experimental acceptance*efficiency for seeing any state at all given that the decay happened in the first place is small… maybe 0.01 or 0.05. That is why just about all discoveries involve `single tags’ where only one side of the event is reconstructed.

    To do both sides (`double tags’ ) of the event means you pay the experimental factor, squared. That is generally the reason why first observations use single tags, not double tags.

    Now, it can be that the discovery potential (from comparison of signal and background) is higher with double tags, nonetheless. But usually it hasn’t been historically… why do you and Prof. Peskin think it is now?

    1. Gnart, I can only speak to the (very similar) b’ searches, but the standard case of b’b’ -> Wt+Wt stands out so well from the backgrounds that it’s the first choice for discovery to use the information from both b’s.

      I’d say there’s no such thing as a general rule for discovery via number of tags-or-similar. It really depends on the *details* of both the new thing being looked for and the backgrounds. Summarizing it as experimental factor squared just isn’t enough information to know what the best method is going to be.

      1. I agree with Andre.

        In many current searches one does not search for a “single tag”. Also the acceptance times efficiency is often somewhat larger than 1% or 5% per “tag” with modern detectors.

        Even when the top quark was discovered, the fact that one could have one top quark decay to an antimuon, and a top antiquark decaying to an electron, was useful — the backgrounds to such a final state are very low. In other words, looking for the correlated effects of the two decays often can reduce background enough to increase the sensitivity of the search.

        Actually, as an example: the search at CMS for new particles each decaying to two quarks specifically looks for two pairs of jets with similar invariant mass — a double-tag approach — while the search for new particles each decaying to three quarks specifically looks for a single triplet of jets amid an event with six or more jets — a single-tag approach. Both methods have their merits.

      2. Both good answers… of course, it is always the comparison of signal level to background… but I guess the answer is in part the distinctiveness of 3 or 4 clear heavy particles in the final state, and perhaps, strong interaction production.

  6. Hi Matt,

    I’m wondering if you can say a little more about the “triangle”. It’s not obvious to me how looking for cases where the t’ and the bar(t’) decay differently moves one around within the triangle.

    For example, how does t’ -> W+ b and bar(t’) -> Z bar(t) OR t’ -> Z t and bar(t’) -> W- bar(b) move around the triangle rather than simply (t’ -> W+ b or Z t) and (bar(t’) -> W- bar(b) or Z bar(t)).

    Thanks!

    1. Good question; I wrote this too quickly.

      The logic is the following:

      A given t’ quark has three options for its decays: Wb, Zt or ht, with probabilities P1, P2 and P3.
      A given bar(t)’ antiquark has three options for ITS decays: W bar(b), Z bar(t) or h bar(t), WITH THE SAME PROBABILITIES P1, P2 and P3.

      So if you produce a t’ and a bar(t)’ in a collision, the decays of the t’ and bar(t)’ lead to 9 possible outcomes, because the decays of the t’ and the bar(t)’ are physically independent — how the t’ decays doesn’t affect how the bar(t)’ decays. The nine outcomes are

      1 WW b bar(b) with probability P1^2
      2 WZ b bar(t) with probability P1*P2
      3 Wh b bar(t) with probability P1*P3
      4 ZW t bar(b) with probability P1*P2
      5 ZZ t bar(t) with probability P2^2
      6 Zh t bar(t) with probability P2*P3
      7 hW t bar(b) with probability P1*P3
      8 hZ t bar(t) with probability P2*P3
      9 hh t bar(t) with probability P3^2

      Note that the search strategy for processes 2 and 4 is essentially identical; the same is true for processes 3 and 7, and processes 6 and 8 look literally identical. So there are really only 6 searches needed for these 9 processes.

      If you are at the Wb corner of the triangle, then 1 ~ P1 >> P2 and P3, so only process 1 will occur very often, so you may as well only look for that. If you are at the Zt corner, then P2 is close to 1, and only process 5 will occur often.

      But if P1=P2=P3 = 1/3, at the center of the triangle, then all 9 processes are equally likely. If you only look for WW b bar(b), you’ll only pick up 1/9th of the events. And if you only look for WW b bar(b), Z Z t bar(t) and hh t bar(t), processes 1, 5 and 9, you only pick up 1/3 of the events.

      Does that make it clear?
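
      In case anyone wants to check this arithmetic numerically, here is a tiny sketch (added purely for illustration, in Python) that builds the nine outcomes for any choice of P1, P2, P3 and reports the fraction of t’-pair events caught by the same-decay-only searches:

          from itertools import product

          def same_decay_fraction(P):
              # P maps each decay mode to its probability; the values must sum to 1.
              outcomes = {(a, b): P[a] * P[b] for a, b in product(P, repeat=2)}  # 9 outcomes for 3 modes
              return sum(prob for (a, b), prob in outcomes.items() if a == b)

          center = {"Wb": 1/3, "Zt": 1/3, "ht": 1/3}    # the center of the triangle
          print(same_decay_fraction(center))   # 0.333...: processes 1, 5 and 9 together catch 1/3 of the events
          print(center["Wb"] ** 2)             # 0.111...: process 1 (WW b bar(b)) alone catches 1/9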

  7. Please, Richard, if you want to hold silly ideas, you’re free to do so, but do not impose them on my other readers. To suggest the neutrino does not exist is really profoundly silly, as silly as suggesting that uranium atoms or bacteria don’t exist. Particle physics did not stop in 1930 with Pauli. Physicists make neutrino beams all the time, and use them to make measurements; we observe neutrinos coming out of nuclear reactors (but only when the reactors are turned on!); we observe them coming from the sun, and have imaged the sun in neutrinos; we observe them coming from cosmic rays; we observed them coming from supernova 1987a within two hours of the arrival of the light from that supernova; and we do all of these measurements kilometers underground, where no other known particles can penetrate.

    A beam of neutrinos is pointed at Kamiokande, running underground from Tsukuba, and the Super-Kamiokande detector observes THIS:
    http://www.ps.uci.edu/~tomba/sk/tscan/k2k-1999-fall/
    These circles indicate the presence of particles that point back toward Tsukuba. If there’s no beam of neutrinos, then what caused this data?

    A unique sequence of data obtained, far underground, about two hours before the light from supernova 1987a was first noticed:
    http://www-personal.umich.edu/~jcv/imb/imbp5.html
    These are said to be from neutrinos. If the only thing coming from supernovas is light, then please explain what caused this data?

    Neutrinos measured from a reactor; you can even see how the rate of detections goes up and down as predicted when one of the two reactors is switched on and off. (There’s data in their paper that shows how the rate of detections falls to almost zero when the reactors are both off.) When the reactors are full on, the number of neutrinos detected is about 50 per day!
    http://doublechooz.in2p3.fr/Status_and_News/status_and_news.php
    Ah, but you think neutrinos don’t exist. Fine; what, then, is this experiment (far underground) detecting 50 times a day, and why do they only see it when the reactors (a kilometer away) are turned on?

    I could go on like this for weeks. But I have better things to do.

  8. This post not only reports Peskin’s slogans but also describes the mentality and reality of mainstream physics today. I do agree with Peskin in general and agree with your description too. However, I do read those slogans with slightly different views.

    Higgs Implies Higgs in BSM — there is no question about the fact that SM is not a final theory. Thus, there must be something in BSM. Before the Higgs is confirmed, the something in BSM might not be the Higgs.

    BSM Hides Beneath Top — this is truly the most exciting part of the physics today. Top could be the gateway to the new and the final physics, indeed.

    No Boson Left Behind — this is the issue of how much we know about physics. Where is the cutoff point for wondering “Did we leave a boson behind?” A while back, there were discussions about “What is Physics in terms of horse vs. cart or ideas vs. equations?” Indeed, this is the essential issue for physics. Without resolving this issue, there would be no true cutoff point for the question of “Did we leave a boson behind?” In my view, there are two types of physics.
    1. Physics of Nature — this preexists humanity, the N-physics.

    2. Physics of human — man’s attempt to understand the physics of Nature, the H-physics.

    These two physics are dramatically different before their final unification. Today, the methodology and the epistemology of physics are the interplays of theories and testing. Thus, the H-physics can be divided further into two groups.
    A. Physics of theories and test, the T-physics.

    B. Physics of knowledge (the K-physics) — as soon as a “part” of the N-physics is learned via the T-physics, it becomes a “knowledge” (a part of the K-physics) which can no longer be challenged by any T-physics as it is a part of the N-physics.

    For example, OPERA can challenge Einstein’s theories, as they are T-physics, but cannot challenge that the speed of light is a constant of Nature, which now belongs to the K-physics. Only with a firmly established K-physics can we establish the cutoff point, say that enough is enough, and know for sure that there is no boson left behind.

  9. Hi Matt! You might remember me from the 2005 U Washington REU. Your final (triangle) example is coincidentally very similar to my current analysis, a search for b’->Zb. It’ll be published soon, and you can see the one-slide public summary on page 21 of the Moriond “Other LHC Searches” talk linked below. Our analysis only cares about the decay of one of the b’s in the event, to good approximation, so it’s most sensitive to the BR(Zb) ~= 1 case, but maintains good power down to rather low BR(Zb).

    http://moriond.in2p3.fr/QCD/2012/MondayMorning/Sonnenschein.pdf
    The speaker misdefined beta, which is of course beta = 1 – (1 – BR)^2.
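    [For readers following along: this β appears to be the fraction of b’ pair events in which at least one of the two b’ quarks decays via Zb, i.e.

        beta = 1 - P(neither b’ decays to Zb) = 1 - (1 - BR)^2 .]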

    1. Hello Andre! (For non-experts, “b” is for “bottom”.)

      Have you also looked for any sign of a decay to Higgs + b, where Higgs decays to leptons via taus? Remember slogan 3!! You might be sensitive to this, depending on how you did the analysis and how much background you’ve got…

      Also amusingly, a b’ quark with a decay to a Z particle and a b quark is exactly what I put into one of my simple simulated data sets for the self-anointed “LHC Olympics” (which I always thought would have been better named “LHC Calisthenics” — as it was a preliminary and oversimplified [but still instructive] training exercise for LHC theorists.)

      1. Oh, that’s funny about the Olympics!

        The closest we’ve come to a Higgs+b decay in this analysis is probably just plotting the dilepton mass in events with a b-jet. The spectrum has about as many statistically insignificant excesses as you’d expect for that many bins, so no dice. My personal philosophy for Higgs searches is that there are already (it seems) 500 people focusing on it on ATLAS alone, and I want to stay well out of that bureaucratic mess.

          1. Well, bureaucratic it may be, but if you’ve done your own analysis properly, it hardly makes sense for someone else to do a Higgs search separately. It’s a waste of effort when there are hardly enough people to go around to do all the important measurements…

          I assume you’ve looked in the electron+muon channel…

      2. Mmmm, I guess it would be very quick to just switch the dilepton mass window from around 91 GeV to around 125 GeV, and then let the rest of the analysis run.

        1. A little trickier than that because of those darn neutrinos, so you get a broad spectrum well below 125 rather than a narrow peak at 125. But the spectrum is known, and therefore easy to model; you could probably do some sort of fit.

          Actually, what I’m saying isn’t entirely right anyway; at this mass you get more di-lepton events from WW* than from tau pairs.

          But it’s still true that the spectrum is predicted precisely in the Standard Model, so you can still model it.
