Tag Archives: searches

Off to CERN

After a couple of months of hard work on grant writing, career plans and scientific research, I’ve made it back to my blogging keyboard.  I’m on my way to Switzerland for a couple of weeks in Europe, spending much of the time at the CERN laboratory. CERN, of course, is the host of the Large Hadron Collider [LHC], where the Higgs particle was discovered in 2012. I’ll be consulting with my experimentalist and theorist colleagues there… I have many questions for them. And I hope they’ll have many questions for me too, both ones I can answer and others that will force me to go off and think for a while.

You may recall that the LHC was turned off (as planned) in early 2013 for repairs and an upgrade. Run 2 of the LHC will start next year, with protons colliding at an energy of around 13 TeV per collision. This is larger than in Run 1, which saw 7 TeV per collision in 2011 and 8 TeV in 2012.  This increases the probability that a proton-proton collision will make a Higgs particle, which has a mass of 125 GeV/c², by about a factor of 2 ½.  (Don’t try to figure that out in your head; the calculation requires detailed knowledge of what’s inside a proton.) The number of proton-proton collisions per second will also be larger in Run 2 than in Run 1, though not immediately. In fact I would not be surprised if 2015 is mostly spent addressing unexpected challenges. But Run 1 was a classic: a small pilot run in 2010 led to rapid advances in 2011 and performance beyond expectations in 2012. It’s quite common for these machines to underperform at first, because of unforeseen issues, and outperform in the long run, as those issues are solved and human ingenuity has time to play a role. All of which is merely to say that I would view any really useful results in 2015 as a bonus; my focus is on 2016-2018.

Isn’t it a bit early to be thinking about 2016? No, now is the time to be thinking about the 2016 triggering challenges for certain types of difficult-to-observe phenomena. These include exotic, unexpected decays of the Higgs particle, other hard-to-observe types of Higgs particles that might exist and be lurking in the LHC’s data, rare decays of the W and Z particles, and more generally, anything that involves a particle whose (rest) mass is in the 100 GeV/c² range, and whose mass-energy is therefore less than a percent of the overall proton-proton collision energy. The higher the collision energy grows, the harder it becomes to study relatively low-energy processes, even though we make more of them. To be able to examine them thoroughly and potentially discover something out of place — something that could reveal a secret worth even more than the Higgs particle itself — we have to become more and more clever, open-minded and vigilant.

A Real Workshop

In the field of particle physics, the word “workshop” has a rather broad usage; some workshops are just conferences with a little bit of time for discussion or some other additional feature.  But some workshops are about WORK…. typically morning-til-night work.  This includes the one I just attended at the Perimeter Institute (PI) in Waterloo, Canada, which brought particle experimentalists from the CMS experiment (one of the two general-purpose experiments at the Large Hadron Collider [LHC] — the other being ATLAS) together with some particle theorists like myself.  In fact, it was one of the most productive workshops I’ve ever participated in.

The workshop was organized by the PI’s young theoretical particle physics professors, Philip Schuster and Natalia Toro, along with CMS’s current spokesman Joseph Incandela and physics coordinator Greg Landsberg. (Incandela, professor at the University of California at Santa Barbara, is now famous for giving CMS’s talk July 4th announcing the observation of a Higgs-like particle; ATLAS’s talk was given by Fabiola Gianotti. Landsberg is a senior professor at Brown University.) Other participants included many of the current “conveners” from CMS — typically very experienced and skilled people who’ve been selected to help supervise segments of the research program — and a couple of dozen LHC theorists, mostly under the age of 40, who are experienced in communicating with LHC experimenters about their measurements.

Is Supersymmetry Ruled Out Yet?

[A Heads Up: I’m giving a public lecture about the LHC on Saturday, April 28th, 1 p.m. New York time/10 a.m. Pacific, through the MICA Popular Talks series, held online at the Large Auditorium on StellaNova, Second Life; should you miss it, both audio and slides will be posted for you to look at later.]

Is supersymmetry, as a symmetry that might explain some of the puzzling aspects of particle physics at the energy scales accessible to the Large Hadron Collider [LHC], ruled out yet? If the only thing you’re interested in is the answer to precisely that question, let me not waste your time: the answer is “not yet”. But a more interesting answer is that many simple variants of supersymmetry are either ruled out or near death.

Still, the problem with supersymmetry — and indeed with any really good idea, such as extra dimensions, or a composite Higgs particle — is that such a basic idea typically can be realized in many different ways. Pizza is a great idea too, but there are a million ways to make one, so you can’t conclude that nobody makes pizza in town just because you can’t smell tomatoes. Similarly, to rule out supersymmetry as an idea, you can’t be satisfied by ruling out the most popular forms of supersymmetry that theorists have invented; you have to rule out all its possible variants. This will take a while, probably a decade.

That said, many of the simplest and most popular variants of supersymmetry no longer work very well, or at all. This is because of two things.

Professor Peskin’s Four Slogans: Advice for the 2012 LHC

On Monday, during the concluding session of the SEARCH Workshop on Large Hadron Collider [LHC] physics (see also here for a second post), and at the start of the panel discussion involving a group of six theorists, Michael Peskin, professor of theoretical particle physics at the Stanford Linear Accelerator Center [and my Ph.D. advisor], opened the panel with a few PowerPoint slides.  He entitled them: “My Advice in Four Slogans” — the advice in question being aimed at experimentalists at ATLAS and CMS (the two general-purpose experiments at the LHC) as to how they ought best to search for new phenomena at the LHC in 2012, and beyond. Since I agree strongly with his points (as I believe most LHC theory experts do), I thought I’d tell you those four slogans and explain what they mean, at least to me. [I’m told the panel discussion will be posted online soon.]

1. No Boson Left Behind

There is a tendency in the LHC experimental community to assume that the new particles that we are looking for are heavy — heavier than any we’ve ever produced before. However, it is equally possible that there are unknown particles that are rather lightweight, but have evaded detection because they interact very weakly with the particles that we already know about, and in particular very weakly with the quarks and antiquarks and gluons that make up the proton.

Peskin’s advice is thus a warning: don’t just rush ahead to look for the heavy particles; remember the lightweight but hard-to-find particles you may have missed.

The word “boson” here is a minor point, I think. All particles are either fermions or bosons; I’d personally say that Peskin’s slogan applies to certain fermions too.

2. Exclude Triangles Not Points

The meaning of this slogan is less obscure than the slogan itself.  Its general message is this: if one is looking for signs of a new hypothetical particle which

  • is produced mostly or always in particle-antiparticle pairs, and
  • can decay in multiple ways,

one has to remember to search for collisions where the particle decays one way and the antiparticle decays a different way; the probability for this to occur can be high.  Most LHC searches have so far been aimed at those cases where both particle and anti-particle decay in the same way.  This approach can in some cases be quite inefficient.   In fact, to search efficiently, one must combine all the different search strategies.

Now what does this have to do with triangles and points?  If you’d like to know, jump to the very end of this post, where I explain the example that motivated this wording of the slogan.  For those not interested in those technical details, let’s go to the next slogan.

3. Higgs Implies Higgs in BSM

[The Standard Model is the set of equations used to predict the behavior of all the known particles and forces, along with the simplest possible type of Higgs particle (the Standard Model Higgs.) Any other phenomenon is by definition Beyond the Standard Model: BSM.]

 [And yes, one may think of the LHC as a machine for converting theorists’ B(SM) speculations into (BS)M speculations.]

One of the main goals of the LHC is to find evidence of one or more types of Higgs particles that may be found in nature.  There are two main phases to this search, Phase 1 being the search for the “Standard Model Higgs”, and Phase 2 depending on the result of Phase 1.  You can read more about this here.

Peskin’s point is that the Higgs particle may itself be a beacon, signalling new phenomena not predicted by the Standard Model. It is common in many BSM theories that there are new ways of producing the Higgs particle, typically in decays of as-yet-unknown heavy particles. Some of the resulting phenomena may be quite easy to discover, if one simply remembers to look!

Think what a coup it would be to discover not only the Higgs particle but also an unexpected way of making it! Two Nobel prize-winning discoveries for the price of one!!

Another equally important way to read this slogan (and I’m not sure why Peskin didn’t mention it — maybe it was too obvious, and indeed every panel member said something about this during the following discussion) is that everything about the Higgs particle needs to be studied in very great detail. Most BSM theories predict that the Higgs particle will behave differently from what is predicted in the Standard Model, possibly in subtle ways, possibly in dramatic ways. Either its production mechanisms or its decay rates, or both, may easily be altered. So we should not assume that a Higgs particle that looks at first like a Standard Model Higgs actually is a Standard Model Higgs. (I’ve written about this here, here and here.)  Even a particle that looks very much like a Standard Model Higgs may offer, through precise measurements, the first opportunity to dethrone the Standard Model.

4. BSM Hides Beneath Top

At the Tevatron, the LHC’s predecessor, top quark/anti-quark pairs were first observed, but they were rather rare there. The LHC, by contrast, has so much energy per collision that it has no trouble producing these particles. ATLAS and CMS have each witnessed about 800,000 top quark/anti-quark pairs so far.

Of course, this is great news, because the huge amount of LHC data on top quarks from 2011 allowed measurements of the top quark’s properties that are far more precise than we had previously. (I wrote about this here.) But there’s a drawback. Certain types of new phenomena that might be present in nature may be very hard to recognize, because the rare collisions that contain them look too similar to the common collisions that contain a top quark/anti-quark pair.

Peskin’s message is that the LHC experimenters need to do very precise measurements of all the data from collisions that appear to contain the debris from top quarks, just in case it’s a little bit different from what the Standard Model predicts.

A classic example of this problem involves the search for a supersymmetric partner of a top quark, the “top squark”. Unlike the t’ quark that I described a couple of slogans back, which would be produced with a fairly high rate and would be relatively easy to notice, top squarks would be produced with a rate that is several times smaller. [Technically, this has to do with the fact that the t’ would have spin-1/2 and the top squark would have spin 0.] Unfortunately, if the mass of the top squark is not very different from the mass of the top quark, then collisions that produce top squarks may look very similar indeed to ones that produce top quarks, and it may be a big struggle to separate them in the data. The only way to do it is to work hard — to make very precise measurements and perhaps better calculations that can allow one to tell the subtle differences between a pile of data that contains both top quark/anti-quark pairs and top squark/anti-squark pairs, and a pile of data that contains no squarks at all.

Following up on slogan #2: An example with a triangle.

Ok, now let’s see why the second slogan has something to do with triangles.

One type of particle that has been widely hypothesized over the years is a heavy version of the top quark, often given the unimaginative name of “top-prime.” For short, top is written t, so top-prime is written t’. The t’ may decay in various possible ways. I won’t list all of them, but three important ones that show up in many speculative theories are

  • t’ → W particle + bottom quark   (t’ → Wb)
  • t’ → Z particle + top quark      (t’ → Zt)
  • t’ → Higgs particle + top quark    (t’ → ht)

But we don’t know how often t’ quarks decay to Wb, or to Zt, or to ht; that’s something we’ll have to measure. [Let’s call the probability that a t’ decays to Wb “P1”, and similarly define P2 and P3 for Zt and ht].

Of course we have to look for the darn thing first; maybe there is no t’. Unfortunately, how we should look for it depends on P1, P2, and P3, which we don’t know. For instance, if P1 is much larger than P2 and P3, then we should look for collisions that show signs of producing a t’ quark and a t̄’ antiquark decaying as t’ → W+ b and t̄’ → W- b̄. Or if P2 is much larger than P1 and P3, we should look for t’ → Zt and t̄’ → Z t̄.

Peskin's triangle for a t' quark; at each vertex the probability for the decay labeling the vertex is 100%, while at dead center all three decays are equally probable. One must search in a way that is sensitive to all the possibilities.

Peskin has drawn this problem of three unknown probabilities, whose sum is 1, as a triangle.  The three vertices of the triangle, labeled by Wb, Zt and ht, represent three extreme cases: P1=1 and P2=P3=0; P2=1 and P1=P3=0; and P3=1, P1=P2=0. Each point inside this triangle represents different possible non-zero values for P1, P2 and P3 (with P1+P2+P3 assumed to be 1.)  The center of the triangle is P1=P2=P3=1/3.

Peskin’s point is that if the experiments only look for collisions where both quark and antiquark decay in the same way

  • t’ → W+ b and t̄’ → W- b̄;
  • t’ → Zt and t̄’ → Z t̄;
  • t’ → ht and t̄’ → h t̄;

which is what they’ve done so far, then they’ll only be sensitive to the cases for which P1 is by far the largest, P2 is by far the largest, or P3 is by far the largest — the regions near the vertices of the triangle.  But we know a number of very reasonable theories with P1=1/2 and P2=P3=1/4 — a point deep inside the triangle.  (For that example, the particle and antiparticle decay the same way only P1²+P2²+P3² = 1/4 + 1/16 + 1/16 = 3/8 of the time, so the same-mode searches miss the other 5/8 of such collisions.)  So the experimenters are not yet looking efficiently for this case.  Peskin is saying that to cover the whole triangle, one has to add three more searches, for

  • t’ → W+ b and t̄’ → Z t̄, or t̄’ → W- b̄ and t’ → Zt;
  • t’ → W+ b and t̄’ → h t̄, or t̄’ → W- b̄ and t’ → ht;
  • t’ → Zt and t̄’ → h t̄, or t’ → ht and t̄’ → Z t̄;

so as to cover that case (and more generally, the whole triangle) efficiently. Moreover, no one search by itself is very effective; one has to combine all six searches together.

His logic is quite general.  If you have a particle that decays in four different ways, the same logic applies but for a tetrahedron, and you need ten searches; if two different ways, it’s a line segment, and you need three searches.
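For readers who like to see the arithmetic, here is a minimal sketch in Python (my own illustration, not something from Peskin’s slides; the decay probabilities are the example values quoted above) of how the triangle’s combinatorics work out:

```python
# Decay probabilities for a hypothetical t'; by definition they sum to 1.
# Example from the text: P(Wb) = 1/2, P(Zt) = P(ht) = 1/4.
P = {"Wb": 0.5, "Zt": 0.25, "ht": 0.25}

# In a t'/anti-t' pair event the two decays are independent, so a
# same-mode search for (X, X) covers a fraction P[X]**2 of pair events,
# while a mixed search for (X, Y) covers 2 * P[X] * P[Y] of them.
same_mode = sum(p**2 for p in P.values())
print(f"same-mode searches cover {same_mode:.3f} of pair events")   # 0.375
print(f"mixed searches must supply the other {1 - same_mode:.3f}")  # 0.625

# In general, n decay modes require n*(n+1)//2 searches to cover every
# combination: 3 for a line segment, 6 for the triangle, 10 for the
# tetrahedron, as described above.
for n in (2, 3, 4):
    print(f"{n} modes -> {n * (n + 1) // 2} searches")
```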

Taking Stock: Where is the Higgs Search Now?

Today, we got new information at the Moriond conference on the search for the Higgs particle (in particular, Phase 1 of the search, which involves the search for the simplest possible Higgs particle, called the “Standard Model Higgs”) from the Tevatron and the Large Hadron Collider [LHC], the Tevatron’s successor.  With those results in hand, and having had a little time to mull them over, let me give you a short summary.  If you want more details, read today’s earlier post and yesterday’s preparatory post.

Before I do that, let me make a remark.  There is a big difference between healthy skepticism and political denialism.  I get the impression that some people who are writing or reading other blogs misinterpret my caution with regard to experimental results as being somehow a political and unreasonable bias against the Higgs particle being present, either at a mass of 125 GeV/c² or at all.  That’s ridiculous.  All that is going on is that I simply am not convinced yet by the data.  I’m a careful scientist… period.  And you’ll see that I’m consistent; later in this post I will advise you not to over-react negatively to what ATLAS didn’t see.

What happened today at the Moriond conference?

What did we learn?

The Tevatron experiments see a combined 2.2 standard deviation [2.2 “sigma”] excess in their search, consistent with a Standard Model Higgs particle with a mass anywhere in the range of 115 to 135 GeV/c².  This is not inconsistent with the Higgs hints that we saw in December from the LHC experiments.  Here I am being perhaps overly careful in not saying, more positively, “it is consistent with the Higgs hints…” only because this measurement is intrinsically too crude to allow us to narrow in on 124-126 GeV, where ATLAS and CMS see their hints.  In short, the Tevatron measurement could, in the end, turn out to indicate a Higgs at a different mass than the one indicated by the current ATLAS and CMS hints.  Anyway, it’s a minor and mostly a semantic point.

The results from ATLAS were a bit of a shock.  In all three processes on which ATLAS reported, CMS has presented results already, and in each case CMS saw a small excess (1 standard deviation [1 “sigma”], which is small indeed.)  But ATLAS reported today that it sees essentially no excess in any of the three, and even a deficit in one of them for low mass.  This has a big effect.

  • First, it allows ATLAS to exclude a Standard Model Higgs all the way up to 122 GeV/c² (except for a little window 1 GeV/c² wide centered at 118) and, coming down from higher masses, to 129 GeV/c².  The only large window left for the Standard Model Higgs particle is 122-129, more or less centered around the hint at 126 GeV/c² that they saw in December.
  • But second, the significance of the December hint, when combined with the new data that shows no excesses in these three new processes, drops by about a full standard deviation.  That’s a pretty big drop.  (A toy illustration of why appears just below.)
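To get a feel for why folding in new channels that show nothing pulls a hint’s significance down, here is a toy sketch (made-up numbers and a deliberately crude model, not ATLAS’s actual analysis, which is far more sophisticated). For two independent channels of equal sensitivity, Gaussian significances combine as their sum divided by the square root of two:

```python
from math import sqrt

# Toy model: two independent channels of equal sensitivity, whose
# Gaussian significances combine as (z1 + z2) / sqrt(2).
z_old = 3.0   # illustrative strength of the earlier hint, in sigma
z_new = 0.0   # new channels showing no excess at all

z_combined = (z_old + z_new) / sqrt(2)
print(f"{z_old} sigma combined with {z_new} sigma -> {z_combined:.1f} sigma")
# 3.0 sigma -> about 2.1 sigma: adding null results costs roughly a
# full standard deviation, much as described above.
```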

What does it all mean?

I think it basically means, roughly, status quo.  We got some positive information and some negative information today, and none of it is that easy to interpret.  So I think we are roughly where we were before, except that we probably no longer have to worry about any Standard Model Higgs below 122 GeV/c².  Before today we had a decent hint of a Standard Model-like Higgs particle with a mass around 125 GeV/c²; we still have it.  Let me explain what I mean.

There are easy (relatively!) searches for the Higgs, and there are hard ones.  The easy searches are the ones where the backgrounds are relatively simple and the signal is a narrow peak on a plot.  There are two:

  1. Higgs decaying to two photons
  2. Higgs decaying to two lepton/anti-lepton pairs (often called “four leptons” for short)

Results on these were presented by both ATLAS and CMS back in December.  The hard searches are the ones where the backgrounds are rather complicated and the signal is quite broad, so that a mistake in estimating a background can either create a fake signal or hide a real one.    There are three (mainly) for a lightweight Higgs:

  1. Higgs decaying to a lepton, an anti-lepton, a neutrino and an anti-neutrino
  2. Higgs decaying to a tau lepton/anti-lepton pair
  3. Higgs decaying to a bottom quark/anti-quark pair

These are the three that ATLAS reported on today (where they saw no sign of a Higgs signal), and that CMS presented back in December (and saw a small excess in all three.)  [ATLAS presented a result on the first one in December, but only using part of their data; it showed a small excess at the time, but not now.]  The third process is the main one in which CDF and DZero reported an excess today, though the first one also plays a role in interpreting that excess.

In other words, everything we learned today had to do with the difficult searches — the ones that are hard to perform, hard to interpret, and hard to check.  And everything we learned was 1 or 2 sigma information; not very compelling even statistically.
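As a reminder of what those numbers mean, here is a small sketch (standard Gaussian statistics in Python, nothing specific to these particular analyses) converting “sigma” into the probability that background alone fluctuates up at least that far:

```python
from math import erfc, sqrt

def p_value(z_sigma):
    """One-sided Gaussian tail probability for an excess of z_sigma."""
    return 0.5 * erfc(z_sigma / sqrt(2))

for z in (1.0, 2.0, 2.2, 5.0):
    print(f"{z} sigma -> p = {p_value(z):.2g}")
# 1 sigma -> p ~ 0.16 (a one-in-six fluctuation: not compelling)
# 2 sigma -> p ~ 0.023; 2.2 sigma (the Tevatron excess) -> p ~ 0.014
# 5 sigma -> p ~ 2.9e-7, the usual particle-physics discovery standard
```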

For this reason,

  • I would not conclude that the new Tevatron results make the 125 GeV Higgs case much stronger
  • I would not conclude that the new ATLAS results make the 125 GeV Higgs case much weaker

For the same reason, when I explained why I was skeptical of the evidence back in December, I told you that in my view the CMS excesses in the difficult searches did not make the case for a 125 GeV Higgs much more compelling.  Since the easy searches at CMS do not show as large excesses as ATLAS’s do, I wasn’t really comfortable with the whole case from CMS.   Their case improved in January, when they added a bit more information from their easy search for two photons.

If, like me, you discount the difficult Higgs searches somewhat relative to the easy ones, then almost nothing has changed, as far as the current Higgs hints, after today’s up and down information.  The excess in the two easy searches at ATLAS is still there, and there are excesses at CMS at least in the two-photon search.  Even from the beginning, I gave you good reasons to think ATLAS’s easy-search excesses were a bit larger than they should be, probably due to an upward statistical fluctuation in the background.  Conversely, I think now that one should not overstate how bad today’s ATLAS news is for the Higgs hints.  It’s still quite reasonable to think there may be a Standard Model Higgs there at 125 GeV/c².  There’s some evidence in its favor, and it’s certainly not ruled out at this point. (Whereas now, almost all other masses are.)

So as usual I advise patience and calm and no hyperventilating; the 2012 data will settle the issue.  Either there is a Standard Model Higgs with a mass within a few percent of 125 GeV/c², or we’ll soon be fanning out in Phase 2 of the Higgs search, looking for all the other types of Higgs particles that might be out there.

News from La Thuile, with Much More to Come

At various conferences in the late fall, the Large Hadron Collider [LHC] experiments ATLAS and CMS showed us many measurements that they made using data they took in the spring and summer of 2011. But during the fall their data sets increased in size by a factor of two and a half!  So far this year, the only results we’d seen that involved the full 2011 data set had been ones needed in the search for the Higgs particle. Last week, that started to change.

The spring flood is just beginning. Many new experimental results from the LHC were announced at La Thuile this past week, some only using part of the 2011 data but a few using all of it, and more and more will be coming every day for the next couple of weeks. And there are also new results coming from the (now-closed) Tevatron experiments CDF and DZero, which are completing many analyses that use their full data set. In particular, we’re expecting them to report on their best crack at the Higgs particle later this week. They can only hope to create controversy; they certainly won’t be able to settle the issue as to whether there is or isn’t a Higgs particle with a mass of about 125 GeV/c², as hints from ATLAS and CMS seem to indicate.  But all indications are that it will be an interesting week on the Higgs front.

The Top Quark Checks In

Fig. 1: In the Higgs mechanism, the W particle gets its mass from the non-zero average value of the Higgs field. A precise test of this idea arises as follows. When the top quark decays to a bottom quark and a W particle, and the W then decays to an anti-neutrino and an electron or muon, the probability that the electron or muon travels in a particular direction can be predicted assuming the Higgs mechanism. The data above shows excellent agreement between theory and experiment, validating the notion of the Higgs field.

There are now many new measurements of the properties of the top quark, poking and prodding it from all sides (figuratively)  to see if it behaves as expected within the “Standard Model of particle physics” [the equations that we use to describe all of the known particles and forces of nature.] And so far, disappointingly for those of us hoping for clues as to why the top quark is so much heavier than the other quarks, there’s no sign of anything amiss with those equations. Top quarks and anti-quarks are produced in pairs more or less as expected, with the expected rate, and moving in the expected directions with the expected amount of energy. Top quark decay to a W particle and a bottom quark also agrees, in detail, with theoretical expectation.  Specifically (see Figure 1) the orientation of the W’s intrinsic angular momentum (called its “spin”, technically), a key test of the Standard Model in general and of the Higgs mechanism in particular, agrees very well with theoretical predictions.  Meanwhile there’s no sign that there are unexpected ways of producing top quarks, nor any sign of particles that are heavy cousins of the top quark.

One particularly striking result from CMS relates to the unexpectedly large asymmetry in the production of top quarks observed at the Tevatron experiments, which I’ve previously written about in detail. The number of top quarks produced moving roughly in the same direction as the proton beam is expected theoretically to be only very slightly larger than the number moving roughly in the same direction as the anti-proton beam, but instead both CDF and DZero observe a much larger effect. This significant apparent discrepancy between their measurement and the prediction of the Standard Model has generated lots of interest and hope that perhaps we are seeing a crack in the Standard Model’s equations.

Well, it isn’t so easy for CMS and ATLAS to make the same measurement, because the LHC has two proton beams, so it is symmetric front-to-back, unlike the Tevatron with its proton beam and anti-proton beam.   But still, there are other related asymmetries that LHC experiments can measure. And CMS has now looked with its full 2011 data set, and observes… nothing: for a particular charge asymmetry that they can measure, they find an asymmetry of 0.4% ± 1.0% ± 1.2% (the first number is the best estimate and the latter two numbers are the statistical and systematic uncertainties on that estimate).  The Standard Model predicts something of order a percent or so, while many attempts to explain the Tevatron result might have predicted an effect of several percent.  (ATLAS has presented a similar measurement but only using part of the 2011 data set, so it has much larger uncertainties at present.)
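As an aside on how to read a number like 0.4% ± 1.0% ± 1.2%: the statistical and systematic uncertainties are conventionally combined in quadrature (a standard, if approximate, practice; the interpretation in the comments below is my own gloss, not CMS’s):

```python
from math import hypot

central = 0.4          # CMS central value, in percent
stat, syst = 1.0, 1.2  # statistical and systematic uncertainties, in percent

total = hypot(stat, syst)  # quadrature sum: sqrt(stat**2 + syst**2) ~ 1.6
print(f"asymmetry = {central}% +/- {total:.1f}%")
print(f"distance from zero: {central / total:.1f} sigma")
# About 0.3 sigma from zero: consistent with no asymmetry and with the
# Standard Model's ~1%, while several-percent effects are disfavored.
```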

Now CMS is not measuring quite the same thing as CDF and DZero, so the CMS result is not in direct conflict with the Tevatron measurements. But if new phenomena were present that were causing the CDF and DZero’s anomalously large asymmetry, we’d expect that by now they’d be starting to show up, at least a little bit, in this CMS measurement.  The fact that CMS sees not a hint of anything unexpected considerably weakens the overall case that the Tevatron excess asymmetry might have an exciting explanation. It suggests rather that the whole effect is really a problem with the interpretation of the Tevatron measurements themselves, or with the ways that the equations of the Standard Model are used to predict them. That is of course disappointing, but it is still far too early to declare the case closed.

There’s also a subtle connection here with the recent bolstering by CDF of the LHCb experiment’s claim that CP violation is present in the decays of particles called “D mesons”. (D mesons are hadrons containing a charm quark [or anti-quark], an up or down anti-quark [or quark], and [as for all hadrons] lots of additional gluons and quark/anti-quark pairs.) The problem is that theorists, who used to be quite sure that any such CP violation in D mesons would indicate the presence of new phenomena not predicted by the Standard Model, are no longer so sure. So one needs corroborating information from somewhere, showing some other related phenomenon, before getting too excited.

One place that such information might have come from is the top quark.  If there is something surprising in charm quarks (but not in bottom quarks) one might easily imagine that perhaps there is something new affecting all up-type quarks (the up quark, charm quark and top quark) more than the down-type quarks (down, strange and bottom.)  [Read here about the known elementary particles and how they are organized.] In other words, if the charm quark is different from expectations and the bottom quark is not, it would seem quite reasonable that the top quark would be even more different from expectations. But  unfortunately, the results from this week suggest the top quark, to the level of precision that can currently be mustered, is behaving very much as the Standard Model predicted it would.

Meanwhile Nothing Else Checks In

Meanwhile, in the direct search for new particles not predicted by the Standard Model, there were a number of new results from CMS and ATLAS at La Thuile. The talks on these subjects went flying by; there was far too little information presented to allow understanding of any details, and so without fully studying the corresponding papers I can’t say anything more intelligent yet than that they didn’t see anything amiss. But of course, as I’ve suggested many times, searches of this type wouldn’t be shown so soon after the data was collected if they indicated any discrepancy with theoretical prediction, unless the discrepancy was spectacularly convincing. More likely, they would be delayed a few weeks or even months, while they were double- and triple-checked, and perhaps even held back for more data to be collected to clarify the situation. So we are left with the question as to which of the other measurements that weren’t shown are appearing later because, well, some things take longer than others, and which ones (if any) are being actively held back because they are more … interesting. At this preliminary stage in the conference season it’s too early to start that guessing game.

Fig. 2: The search for a heavy particle that, like a Z particle, can decay to an electron/positron pair or a muon/anti-muon pair now excludes such particles to well over 1.5 TeV/c². The Z particle itself is the bump at 90 GeV; any new particle would appear as a bump elsewhere in the plot. But above the Z mass, the data (black dots) show a smooth curve with no significant bumps.

So here’s a few words about what ATLAS and CMS didn’t see. Several classic searches for supersymmetry and other theories that resemble it (in that they show signs of invisible particles, jets from high-energy quarks and gluons, and something rare like a lepton or two or a photon), were updated by CMS for the full or near-full data set. Searches for heavy versions of the top and bottom quark were shown by ATLAS and CMS. ATLAS sought heavy versions of the Z particle (see Figure 2) that decay to a high energy electron/positron pair or muon/anti-muon pair; with their full 2011 data set, they now exclude particles of this type up to masses (depending on the precise details of the particle) of 1.75-1.95 TeV/c². Meanwhile CMS looked for heavy versions of the W particle that can decay to an electron or muon and something invisible; the exclusions reach out above 2.5 TeV/c². Other CMS searches using the full data set included ones seeking new particles decaying to two Z particles, or to a W and a Z.   ATLAS looked for a variety of exotic particles, and CMS looked for events that are very energetic and produce many known particles at once.  Most of these searches were actually ones we’d seen before, just updated with more data, but a few of them were entirely new.

Two CMS searches worth noting involved looking for new undetectable particles recoiling against a single jet or a single photon. These put very interesting constraints on dark matter that are complementary to the searches that have been going on elsewhere, deep underground.  Using vats of liquid xenon or bubble chambers or solid-state devices, physicists have been looking for the very rare process in which a dark matter particle, one among the vast ocean of dark matter particles in which our galaxy is immersed, bumps into an atomic nucleus inside a detector and makes a tiny little signal for physicists to detect. Remarkable and successful as their search techniques are, there are two obvious contexts in which they work very poorly. If dark matter particles are very lightweight, much lighter than a few GeV/c², the effect of one hitting a nucleus becomes very hard to detect. Or if the nature of the interaction of dark matter with ordinary matter is such that it depends on the spin (the intrinsic angular momentum) of a nucleus rather than on how many protons and neutrons the nucleus contains, then the probability of a collision becomes much, much lower. But in either case, as long as dark matter is affected by the weak nuclear force, the LHC can produce dark matter particles, and though ATLAS and CMS can’t detect them, they can detect particles that might sometimes recoil against them, such as a photon or a jet. So CMS was quite proud to show that their results are complementary to those other classes of experiments.

Fig. 3: Limits on dark matter candidates that feel the weak nuclear force and can interact with ordinary matter. The horizontal axis gives the dark matter particle's mass, the vertical axis its probability to hit a proton or neutron. The region above each curve is excluded. All curves shown other than those marked "CMS" are from underground experiments searching for dark matter particles hitting an atomic nucleus. CMS searches for a jet or a photon recoiling against something undetectable provide (left) the best limits on "spin-independent" interactions for masses below 3.5 GeV/c², and (right) the best limits on "spin-dependent" interactions for all masses up to a TeV/c².

Finally, I made a moderately big deal back in October about a small excess in multi-leptons (collisions that produce three or more electrons, muons, positrons [anti-electrons] or antimuons, which are a good place to look for new phenomena), though I warned you in bold red letters that most small excesses go away with more data. A snippet of an update was shown at La Thuile, and from what I said earlier about results that appear early in the conference season, you know that’s bad news. Suffice it to say that although discrepancies with theoretical predictions remain, the ones seen in October apparently haven’t become more striking. The caveat that most small excesses go away applies, so far, to this data set as well. We’ll keep watching.

Fig. 4: The updated multilepton search at CMS shows (black solid curve) a two standard deviation excess compared to expectations (black dotted curve) in at least some regimes in the plane of the gluino mass (vertical axis) versus the chargino mass (horizontal axis) in a particular class of models. But had last fall's excess been a sign of new physics, the current excess would presumably have been larger.

Stay tuned for much more in the coming weeks!

Why a Lightweight Higgs is a Sensitive Creature — Part 2

[Note added:  It is official — as expected, at this year’s Chamonix workshop, where the Large Hadron Collider’s [LHC’s] future is planned out each year, it was decided that the LHC’s energy will be increased by 14% next year (from 3.5 TeV energy per proton and 7 TeV energy per collision in 2010-2011 to 4 TeV per proton and 8 TeV per collision). Also the time between collisions will remain at 50 nanoseconds.  I’ll have some things to say about the pros and cons of this decision, in particular the challenges for the experiments, over the next few days.]

On Monday last week, I gave you half the explanation as to why a lightweight Higgs particle is a sensitive creature, one that is easily altered by new phenomena — by particles and/or forces that we might not yet know about.  It all had to do with an analogy between a violin string (or a guitar string or a xylophone key) and the properties of the Higgs particle.   Today, on the same webpage as the first half, I have provided the second half of the story. (If you have already read the first half, just look for the boldface words “The Diverse Modes of a Higgs’ Demise”, which separate last week’s prose from the new stuff.)  I’ve also added, for particle physicists and for those laypersons who want to go a little deeper, a short quantitative discussion of my main points.

Also: I will have the honor to be interviewed on Wednesday at 5 p.m. Eastern time, at

http://www.blogtalkradio.com/virtuallyspeaking/2012/02/15/matt-strassler-tom-levenson-virtually-speaking-science

which you can listen to either live or later.  My interviewer, Tom Levenson, is an eminent science journalist who has written fascinating and surprising books on Einstein and on Newton, among others, won awards for his work on television (e.g. NOVA), has a great blog (and also posts here), and is a professor of science writing at MIT.  In short, he’s a bright and interesting dude whom you should consider following on Twitter, or in whatever way floats your boat in the ocean of social media.  For this reason I suspect that the conversation is going to be a lot deeper and more interesting than the average interview, with the interviewer making at least as many interesting comments about the topic as the interviewee.

LHC as Juggernaut and Behemoth

Yesterday I spent the afternoon at the Third Indian-Israeli International Meeting on String Theory,  held at the Israel Institute for Advanced Studies.  The subject of the meeting is “Holography and its Applications”.  No, this isn’t holography as in that optical trick that allows you to create a three-dimensional image on the security strip of your credit card — this is “holography” as string theorists like to discuss it, that trick of describing gravitational or string-theoretic physics  in a certain number of spatial dimensions as quantum field theory (without gravity) in a smaller number of spatial dimensions.  It’s impressive, even stunning, that sometimes you can use a precise form of the holographic principle to solve some difficult string theory problems by rewriting them as easier quantum field theory problems, and solve some difficult quantum field theory problems by rewriting them as easier string theory problems.

I worked in this research area on and off for quite a while (mainly 1999-2007) so I know most of the participants in this subfield.  In fact my most commonly cited paper happens to be on this subject.  But ironically my role at this conference was to present, as the opening talk, a review of 2011 at the Large Hadron Collider (LHC).