
Latest from the SEARCH Workshop


POSTED BY Matt Strassler

ON 03/20/2012

Over the weekend I wrote about the SEARCH workshop’s first day; today I’ll describe its final two days. First I’ll give you a broad overview, and then, for more expert readers, a couple of especially interesting developments that caught my attention.

The vast amount of information pouring out of the Large Hadron Collider [LHC] is simply overwhelming. Sunday and Monday we heard 16 talks by LHC experimenters, evenly split between ATLAS and CMS, the two general-purpose detectors at the LHC. Each of these talks described several complex measurements aimed at looking for a wide variety of hypothetical phenomena — for any sign of the speculative things that theorists have proposed (conceptual ideas such as supersymmetry and extra dimensions, and more generally, new types of particles, including heavy “partners” of the top quark, undetectable particles [such as those that may make up dark matter], new particles that can decay to quark-antiquark pairs or lepton-antilepton pairs or pairs of photons, and so on). So far none of these measurements has turned up anything unexpected, and it appears very unlikely that data from 2011, the first year of full-fledged LHC operation, will lead to an easy, quick and surprising discovery. But it is still very early in the LHC’s decade-long program, so collectively we just have to buckle down, take more data and work harder.

There were a number of very interesting discussion sessions, during which many useful (and mostly friendly and constructive) exchanges occurred between theorists and experimentalists and between members of ATLAS and CMS. Almost all of this was quite technical so I won’t give many details, except to say that I learned a lot, and also saw lots of places where I thought the experimentalists could extract more information from their data.

The workshop concluded with a panel discussion — the only point during the entire workshop when theorists were formally asked to say something.
The panel consisted of Michael Peskin (senior statesman [and my Ph.D. advisor], famous for many reasons, including fundamental work on the implications of highly precise measurements), Nima Arkani-Hamed (junior statesman, famous for helping develop several revolutionary new ways of approaching the hierarchy problem), Riccardo Rattazzi (also famous for conceptual advances in dealing with the hierarchy problem), Gavin Salam (famous for his work advancing the applications of the theory of quarks and gluons, including revolutionary methods for dealing with jets), and myself (famous for talking too much… though come to think of it, that was true of the whole panel, except Gavin). And Raman Sundrum, one of the organizers (famous for his collaboration with Lisa Randall in introducing “warped” extra dimensions, and also for anomaly-mediated supersymmetry breaking [which was competitive with a paper by Rattazzi and his colleagues]), informally participated too. The discussion was recorded, and I assume they will post it. If I’m not too embarrassed by it I’ll provide the link. 🙂

Meanwhile, I mentioned in my last post that I think there are big issues surrounding whether the LHC can effectively trigger on exotic decays of the Higgs particle (if there are any) under current operating conditions. Informally, a bunch of theorists interested in this question met yesterday afternoon, with a couple of experimental spectators. We tried to build a list of exotic Higgs decays for which (a) one can hope to make a measurement with 2012 data, (b) it isn’t obvious that current triggering strategies will work very well, and (c) additional triggering strategies might conceivably improve the situation. Then I tried to encourage individual theorists to pick one of these decays and do a quick study to see whether an interesting search really could be made with this year’s data, assuming triggering on these events were carried out. The experimenters interested in this issue have indicated to us that they need our work to be done within about a month — a very short time as far as even preliminary studies are concerned. If any of my colleagues are reading this, please consider volunteering to help out.

Ok, now a few specifics from the workshop. Oh, first, a bon mot from Daniel Whiteson, a member of ATLAS and professor at the University of California, Irvine: “With the current data, a Standard Model Higgs at 125 GeV is like Mitt Romney: the most likely option, but the least exciting.”

One important thing I learned from John Butterworth of ATLAS (who writes regularly [and very well indeed] for the Guardian newspaper, and is currently in charge of coordinating ATLAS’s precise measurements of many properties of the Standard Model) was about ATLAS’s recent demonstration that they can measure tau lepton “polarization” [polarization = whether the spin of the tau is aligned with or counter to its direction of motion]. It is very interesting to know whether taus produced in a particular process are preferentially polarized; in many Standard Model processes they are polarized one way, but in many new phenomena they might end up polarized the other way, or unpolarized on average. The ability to measure this so well reflects how extraordinary ATLAS is as a detector compared to the previous generation of detectors. Though they’re not ready to say so publicly, I am sure CMS will be able to do this too, as its capabilities are generally quite comparable to those of ATLAS. Anyway, this is a new experimental tool that may be very important in future measurements, and theorists like me should get to work and give the experimenters new suggestions on how to use it!
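
To pin down the jargon, here is the standard definition of the polarization of a sample of taus (my own gloss, not something taken from the talk):

% Standard definition (a gloss, not from Butterworth's talk):
\[
  P_\tau \;=\; \frac{N_{+} - N_{-}}{N_{+} + N_{-}}\,,
\]
% where N_+ (N_-) counts taus with spin along (against) their direction of
% motion; P_tau = +1 or -1 for a fully polarized sample and 0 for an
% unpolarized one. The energy sharing among the tau's decay products is
% sensitive to P_tau, which is what makes the measurement possible.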

Another new tool that has been coming for a while involves the study of jet substructure. We heard about this as applied to “top-tagging” — identifying top quarks when they are moving so fast that the particles to which they decay create jets that kind of overlap. The new jet technology developed in the last decade, by many people, has allowed a number of new techniques to be suggested, and Petar Maksimovic (of Johns Hopkins University), a member of CMS, gave a beautiful talk in which he showed that a particular variant of these techniques (developed by his theory colleagues at Johns Hopkins [Kaplan, Rehermann, Schwartz and Tweedie, who were inspired by work of Butterworth, Davison, Rubin and Salam]) seems to work when applied to real data. It’s too early to be sure just how well things work, but it looks very promising, as you can see in the photo below, where Maksimovic shows that W particles decaying to a nearly-overlapping quark and antiquark can be identified as an obvious and rather narrow peak above more random sources of similar-looking jets.

Petar Maksimovic shows that the new techniques for identifying and measuring jet substructure can easily find W particles within top quark decays.
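
To give a rough sense of what such a tagger does, here is a deliberately simplified toy in Python. It is not the Johns Hopkins algorithm or any CMS code; the mass window and the subjets are made up for illustration. The idea is just that once a fat jet has been declustered into subjets, a pair of subjets whose combined invariant mass lands near 80 GeV flags a hadronically decaying W.

import math

def invariant_mass(p1, p2):
    # Invariant mass of two (approximately massless) subjets,
    # each given as a (pt, eta, phi) tuple, with pt in GeV.
    pt1, eta1, phi1 = p1
    pt2, eta2, phi2 = p2
    # For massless four-vectors: m^2 = 2 pt1 pt2 (cosh d_eta - cos d_phi)
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

def tag_w_candidate(subjets, m_lo=65.0, m_hi=95.0):
    # Return True if any subjet pair falls in the W mass window (GeV).
    # The window edges are illustrative, not the values CMS actually uses.
    for i in range(len(subjets)):
        for j in range(i + 1, len(subjets)):
            if m_lo <= invariant_mass(subjets[i], subjets[j]) <= m_hi:
                return True
    return False

# Three made-up subjets from a boosted top decay: the first two combine
# to a mass of about 77 GeV, so the jet is tagged as containing a W.
subjets = [(110.0, 0.0, 0.0), (90.0, 0.5, 0.6), (60.0, -0.3, 1.5)]
print(tag_w_candidate(subjets))  # True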

Finally, a very interesting comment was made by theorist George Sterman, one of the world’s greatest experts on the theory of quarks and gluons. When I visited SUNY Stony Brook last fall, I had a conversation with him in which he suggested a particular reason why the unexpectedly large asymmetry in the production of top quarks observed by the Tevatron experiments CDF and DZero might really be due just to a subtle mismatch between theoretical calculation and experimental methodology. It was the most compelling idea I’d heard so far, but it wasn’t very precise yet. Apparently, in just the last few days, Sterman managed to flesh this idea out into something concrete.

CDF data on the top quark forward-backward asymmetry; in the plots at right, which show the asymmetry as a function of top quark-antiquark invariant mass (above) and rapidity (below), the data are the black dots, the black line is a fit to the data, and the Standard Model prediction, which has the same shape but is smaller, is shown as the green lines.

First, George referred to newly updated results presented by CDF (see figure above) at the Moriond conference, which show that although there is an excess, it has the same shape [as a function of top quark-antiquark invariant mass and rapidity] as the asymmetry predicted in the Standard Model. As he said, “it looks just like the Standard Model, only more so.” He then argued that the complicated formula for the top quark asymmetry from quark-antiquark collisions (within the Standard Model) could be written, with certain well-defined approximations, in a simple form that

  • clarifies the reason for the shape of the Standard Model prediction
  • shows how any accidental or intentional removal of top quark-antiquark events that have an extra jet would increase the asymmetry as the logarithm of M²/Δ², where M is the invariant mass of the top quark-antiquark pair and Δ² = S² − M², with S the invariant mass of the quark and antiquark that initiated the collision (sketched schematically after this list)
  • suggests how this idea could potentially be tested [by changing the cuts on the extra jet and thus changing the average M²/Δ²]
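
In schematic form (my paraphrase of the scaling just described, not Sterman’s actual expression):

% Schematic only: a paraphrase of the scaling described above.
\[
  A_{\rm FB} \;\propto\; \alpha_s \,\ln\frac{M^{2}}{\Delta^{2}}\,,
  \qquad \Delta^{2} \equiv S^{2} - M^{2}\,,
\]
% Cutting away top quark-antiquark events with an extra jet pushes the
% typical Delta^2 down, and the logarithm, and with it the measured
% asymmetry, up.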

As he said himself, the idea is still a very new one and might be in contradiction with what the experimenters at CDF actually have done. But in any case it was classic theoretical physics: taking very complicated formulas, using a deep understanding of the problem to make clever simplifying approximations, and extracting a physically relevant, conceptually clear lesson. Kudos to Professor Sterman; I genuinely hope he’s right, because every other explanation that any of us has proposed using new particles looks awful, and we’d really like to put this mystery behind us soon.

George Sterman explains his insights into the top quark/antiquark forward-backward asymmetry.


16 Responses

  1. Slightly off-topic, but you mention Arkani-Hamed and Sundrum, so could I ask your thoughts on them, please? Specifically, to what extent the absense of non-renormalizable terms in the SM is a problem for any theory with a small Planck mass.

    1. They’re among the best: creative, profound, wide-ranging thinkers with technical prowess.

      The absence of any signs of deviations from the Standard Model due to effects of higher-energy phenomena (the non-renormalizable operators you refer to), in a theory with quantum gravity at an energy scale much smaller than is normally assumed, is something which, of course, they are well aware of.
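
      To make that concrete: the standard way to parametrize such effects (a textbook schematic, not anything specific to their models) is to add higher-dimension operators to the Standard Model, suppressed by powers of the scale Λ at which new physics appears:

% Generic effective-field-theory schematic (textbook form, not model-specific):
\[
  \mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{\rm SM}
  \;+\; \sum_i \frac{c_i}{\Lambda^{2}}\,\mathcal{O}_i^{(6)} \;+\; \cdots
\]
% At energies E << Lambda, a dimension-six operator contributes effects of
% relative size (E/Lambda)^2; so if quantum gravity sits at a Lambda far
% below the usual Planck mass, the coefficients c_i must be unusually small
% for these effects to have escaped precision experiments.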

      In the version of extra dimensions pioneered by Arkani-Hamed, Dimopoulos and Dvali, it was challenging but not impossible, back in 1998, to imagine that the effects of these higher-dimension operators could be small enough to have not shown up in experiment — though it was a long shot. It is somewhat tougher to imagine this now.

      The version of extra dimensions pioneered by Randall and Sundrum in 1999 is different; it is easier to imagine those effects still being too small to be observed. Easier doesn’t mean easy, though.

      However, one should keep in mind that supersymmetry too “should” have shown up by now. In fact it should have shown up in the early nineties, according to the usual rough estimates.

      We theorists are pretty confused, but LHC investigations are still at a very early stage, so the confusion might be due to things that we haven’t learned from the data yet.

      By the way, neither Arkani-Hamed nor Sundrum is wedded to any one idea; they want to know what nature has to say. Like all great “model-builder” theorists, they have invented not one possible theory but many, in hopes of teasing out a small piece of the truth. (I’ve got several models of my own, none of which I believe very much, but any of which might turn out to be part of the story.) I learned long ago from Howard Georgi, professor at Harvard and one of the greatest model builders of all time, that the purpose of model-building is not to get it right the first time (the chances are minuscule) but to gain insights that together will lead the community to figure nature out. If that panel discussion is posted, you’ll hear this reflected in what they say.

      1. Yes, sorry, I was writing sloppily – I meant the ADD and RS models (and that approach in general), rather than the personalities themselves. (And I do know how to spell “absence”, honestly :-). I certainly wouldn’t want to sidetrack a good professional blog such as this from proper physics by debating or disparaging any particular working physicist.
        As a side note, I agree particularly with your last paragraph – I grow so tired of hearing people talk of scientists’ “own” theories, as if this is the way science is done – by battles between the particular pet theories of different scientists, who can countenance no alternative (of course this is indeed the way it is sometimes done, but it shouldn’t be!). Science is routinely the pursuit of knowledge and understanding by honest exploration of the available opportunities, without prejudice, but the public often seem to have a hard time appreciating this, or that scientists may genuinely subscribe to the principle. For those who have poured a career into SUSY, for example, the existence of squarks etc. at the LHC may well be of particular personal interest, but for the rest of us, the interest is academic.

  2. Since the Higgs couples to mass, and the kind of mass described in the SM constitutes only a small fraction of the total mass known to be in the Universe… would it really be surprising if the Higgs had many unseen decays (to dark-matter-type particles) and were thus much broader than expected?

    As for excitement… Daya Bay handed a baton to LBNE in the past few weeks, maybe the US will still have a program during the many years between now and when the LHC achieves 10 or 14 TeV.

    1. Well — you want to be a little careful here. The particles we know about get their mass from the Higgs field. Most particles in nature may not; in particular, dark matter particles may not. I’ve emphasized this in many places around this website; the Higgs is NOT the “universal giver of mass” that many articles you will read seem to imply. For example, in almost any theory the Higgs field is only partially responsible for the mass of the Higgs particle itself! (We don’t know which of the theories is right, so I don’t know yet where the rest of the Higgs mass comes from.)

      However, no, it would not be at all surprising if the Higgs could decay sometimes or often to dark matter, or to other types of as-yet-unknown essentially undetectable particles. Searches for an “invisible” [i.e., undetectably-decaying] Higgs are planned for this year; last year we didn’t have enough data yet. That’s one of many different types of searches for Higgs particles that will be necessary if a Standard-Model-type Higgs is ruled out with this year’s data.
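
      The arithmetic behind “sometimes or often” is simple enough to sketch (illustrative numbers only; the roughly 4 MeV total width of a 125 GeV Standard-Model Higgs is a standard estimate, not a figure from this post):

def invisible_branching_fraction(gamma_inv_mev, gamma_sm_mev=4.0):
    # BR(H -> invisible) = Gamma_inv / (Gamma_SM + Gamma_inv).
    # The 4 MeV default is the rough SM total width of a 125 GeV Higgs,
    # a standard estimate rather than a number taken from this post.
    return gamma_inv_mev / (gamma_sm_mev + gamma_inv_mev)

# Because the SM width is so tiny, even a modest new partial width to
# dark matter gives a large invisible branching fraction:
for gamma_inv in (1.0, 4.0, 20.0):
    print(gamma_inv, round(invisible_branching_fraction(gamma_inv), 2))
# prints 0.2, 0.5, 0.83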

      I haven’t had time to write about Daya Bay and the other experiments that tell us something new about neutrino mixing; it’s interesting but the LHC takes precedence right now in my mind.

  3. A Higgs particle at 125 GeV is a strange beast phenomenologically. Coupled with the new limits on new physics that ATLAS and CMS have provided, it seems to paint a picture that would have been unthinkable a few years ago, as generic models seem to introduce a degree of fine-tuning that begins to make people uncomfortable.

    What is the general feeling among conference members on this problem, assuming that it is indeed a straightforward Higgs scalar of the usual type?

  4. I have heard that LHC scientists say that they need more data.
    “We need more data”… “we need more data”…
    All of them seem to say the same words.
    Will the data of 2012 really be enough?
    What they say is very boring.
    I wonder what they will say after 2012.
    Will they say “we have enough data” after 2012?
    What is your opinion?

    1. We need more data.

      I’m sorry you find this boring. Sometimes the truth is less exciting than lies and speculations. Which would you prefer?

      We are in the midst of a great human adventure, in which thousands of talented and committed scientists and engineers are devoting their days and their nights and their weekends in a great collective effort to operate humankind’s largest experimental facility, and to work in concert to gather and analyze as much data from this giant machine as efficiently and as thoroughly as possible, in order that at least one of the great puzzles that has interfered with our understanding of nature for eight decades might begin to be answered within the next year or two.

      And you are bored.

      I hope you understand why I am not sympathetic.

    2. To know for sure with data, we definitely need more data. Of course, there are some other ways of knowing for sure, and they might not need more data.

      We are now on our “final” step to the final physics, not at the beginning of physics. That is, we know in detail about 99.9999% of the total (final) physics. And all that knowledge is the basis for us to know for sure, even without any additional data.

      Among all known physics knowledge, the constants of Nature preempt all others, and Alpha (the electron fine-structure constant) ties (locks) three true Nature constants together. The speed of light (c) defines the space and time dimensions. The Planck constant (h-bar) defines the mass and angular dimensions. The electric charge is actually a constant derived from c and h-bar, and it defines the largest possible causal universe. As the “mass” dimension is already defined in h-bar, the Alpha-mechanism encompasses the Higgs mechanism, the see-saw mechanism and all the other mass-generating mechanisms. Thus, as soon as Alpha can be calculated theoretically, the physics which provides such a calculation will complete the final step toward the final physics, even without additional data on the Higgs or on any other theory.

      1. In the late 1800s, people believed that physics had basically been finished, and there were just a few details to iron out. They were wrong. Aristotle tried to reason his way to the truth; the late Renaissance broke away from this idea, to the benefit of human knowledge.

        So I don’t agree with your point of view. We certainly know a lot less than 99.9999% (how can the universe be accelerating? what is dark matter? why are the quark and lepton masses what they are? what sets the Higgs particle’s mass? how many forces are there, and if there are only the four we know, why? and on and on and on…). It might turn out we know only 0.01%.

      1. We don’t know how much we know about physics; it might be 99% and it might be 1%. How can we find out? The only way is more data. When we’ve answered all the deep questions and experiment after experiment reveals nothing new, we’ll give up at some point. We’re nowhere near that now.
