Tag Archives: LHCb

Visiting the Host Lab of the Large Hadron Collider

Greetings from Geneva, and CERN, the laboratory that hosts the Large Hadron Collider [LHC], where the Higgs particle was found by the physicists at the ATLAS and CMS experiments. Between jet lag, preparing a talk for Wednesday, and talking to many experimental and theoretical particle physicists from morning til night, it will be a pretty exhausting week.

The initial purpose of this trip is to participate in a conference held by the LHCb experiment, entitled “Implications of LHCb measurements and future prospects.” Its goal is to bring theoretical particle physicists and LHCb experimenters together, to exchange information about what has been and what can be measured at LHCb.

On this website I’ve mostly written about ATLAS and CMS, partly because LHCb’s measurements are often quite subtle to explain, and partly because the Higgs particle search, the highlight of the early stage of the LHC, was really ATLAS’s and CMS’s task. But this week’s activities give me a nice opportunity to put the focus on this very interesting experiment, which is quite different from ATLAS and CMS both in its design and in its goals, and to explain its important role.

ATLAS and CMS were built as general purpose detectors, whose first goal was to find the Higgs particle and whose second was to find (potentially rare) signs of any other high-energy processes that are not predicted by the Standard Model, the equations we use to describe all the known particles and forces of nature. Crudely speaking, ATLAS and CMS are ideal for looking for new phenomena in the 100 to 5000 GeV energy range (though we won’t reach the upper end of the range until 2015 and beyond.)

LHCb, by contrast, was built to study in great detail the bottom and charm quarks, and the hadrons (particles made from quarks, anti-quarks and gluons) that contain them. These quarks and their antiquarks are produced in enormous abundance at the LHC. They and the hadrons that contain them have masses in the 1.5 to 10 GeV/c² range… not much heavier than protons, and much lower than what ATLAS and CMS are geared to study. And this is why LHCb has been making crucial high-precision tests of the Standard Model using bottom- and charm-containing hadrons.  (Crucial, but not, despite repeated claims by the LHCb press office, capable of ruling out supersymmetry, which no single measurement can possibly do.)
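To put those masses in context, here’s a minimal sketch comparing a few representative hadrons (the values are approximate, quoted from memory of the Particle Data Group tables, so treat them as illustrative):

```python
# Approximate hadron masses in GeV/c^2 (illustrative, roughly PDG values).
masses = {
    "proton":                         0.938,
    "D0 meson (contains charm)":      1.86,   # lightest charm hadrons ~1.9
    "B0 meson (contains bottom)":     5.28,   # lightest bottom hadrons ~5.3
    "Upsilon (bottom + anti-bottom)": 9.46,   # near the top of LHCb's usual range
}

for name, m in masses.items():
    print(f"{name:32s} {m:5.2f} GeV/c^2  (~{m / 0.938:4.1f} x proton mass)")
```

All of these sit far below the several-hundred-GeV scales where ATLAS and CMS do their flagship searches.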

Although this is the rough division of labor among these experiments, it’s too simplistic to describe them this way. ATLAS and CMS can do quite a lot of physics at the low mass range, and in some measurements can compete well with LHCb. Less well-known is that LHCb may be able to do a small but critical set of measurements involving higher energies than their usual targets.

LHCb is very different from ATLAS and CMS in many ways, and the most obvious is its shape. ATLAS and CMS look like giant barrels centered on the location of the proton-proton collisions, and are designed to measure as many particles as possible that are produced in the collision of two protons. LHCb’s shape is more like a wedge, with one end surrounding the collision point.

Left: Cut-away drawing of CMS, which is shaped like a barrel with proton-proton collisions occurring at its center. ATLAS’s shape is similar. Right: Cut-away drawing of LHCb, which is shaped something like a wedge, with collisions occurring at one end.

This shape only allows it to measure those particles that go in the “forward” direction — close to the direction of one of the proton beams. (“Backward” would be near the other beam; the distinction between forward and backward is arbitrary, because the two proton beams have the same properties. “Central” would be far from either beam.) Unlike ATLAS and CMS, LHCb is not used to reconstruct the whole collision; many of the particles produced in the collision go into backward or central regions which LHCb can’t observe.  This has some disadvantages, and in particular it put LHCb out of the running for the Higgs discovery. But a significant fraction of the bottom and charm quarks produced in proton-proton collisions go “forward” or “backward”, so a forward-looking design is fine if it’s bottom and charm quarks you’re interested in. And such a design is a lot cheaper, too. It also means that LHCb is well positioned to make some other measurements where the forward direction is important. I’ll give you one or two examples later in the week.
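For the quantitatively inclined: “forward” is usually made precise using pseudorapidity, η = −ln tan(θ/2), where θ is the angle from the beam axis. LHCb’s acceptance is roughly 2 < η < 5 (a number I quote from memory; the exact boundaries differ slightly). Here’s a minimal sketch of what those numbers mean in terms of angles:

```python
import math

def theta_from_eta(eta):
    """Angle from the beam axis (in degrees) for a given pseudorapidity eta."""
    return math.degrees(2.0 * math.atan(math.exp(-eta)))

# LHCb's coverage is roughly 2 < eta < 5 (approximate, for illustration):
for eta in (2.0, 5.0):
    print(f"eta = {eta}:  about {theta_from_eta(eta):.1f} degrees from the beam")
# eta = 2 is ~15 degrees from the beam; eta = 5 is under 1 degree.
# By contrast, a central detector like ATLAS or CMS tracks particles
# out to roughly |eta| < 2.5 all the way around the collision point.
```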

To make their measurements of bottom and charm quarks, LHCb makes use of the fact that these quarks decay after about a trillionth of a second (a picosecond) [or longer if, as is commonly the case, there is significant time dilation due to Einstein's relativity effects on very fast particles].  This is long enough for them to travel a measurable distance — typically a millimeter or more. LHCb is designed to make measurements of charged particles with terrific precision, allowing them to infer a slight difference between the proton-proton collision point, from which most low-energy charged particles will emerge, and the location where some other charged particles may have been produced in the decay of a bottom hadron or some other particle that travels a millimeter or more before decaying. The ability to do precision “tracking” of the charged particles makes LHCb sensitive to the presence of any as-yet unknown particles that might be produced and then decay after traveling a small or moderate distance. More on that later in the week.
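As a rough worked example (my own numbers, for illustration only): a particle with lifetime τ travels a distance d = γβcτ before decaying, where the boost factor γβ = p/mc embodies the time dilation just mentioned. For a B meson, with mass near 5.3 GeV/c² and a lifetime around 1.5 picoseconds, typical LHC momenta give flight distances of millimeters:

```python
# A minimal estimate of how far a B meson travels before decaying.
# The inputs are approximate (illustrative, not LHCb measurements).
c   = 3.0e8     # speed of light, in m/s
tau = 1.5e-12   # B-meson lifetime, about 1.5 picoseconds
m   = 5.28      # B-meson mass, in GeV/c^2

for p in (10.0, 50.0, 100.0):           # momentum in GeV/c
    beta_gamma = p / m                  # relativistic boost factor
    d_mm = beta_gamma * c * tau * 1e3   # flight distance, in millimeters
    print(f"p = {p:5.1f} GeV/c  ->  flight distance ~ {d_mm:.1f} mm")
# Even at modest momenta the decay point is displaced by about a
# millimeter or more -- exactly what precision tracking can resolve.
```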

A computer reconstruction of the tracks in a proton-proton collision, as measured by LHCb. Most tracks start at the proton-proton collision point at left, but the two tracks drawn in purple emerge from a different point about 15 millimeters away, the apparent location of the decay of a hadron, whose inferred trajectory is the blue line, and whose mass (measured from the purple tracks) indicates that it contained a bottom quark.

One other thing to know about LHCb: in order to make their precise measurements possible, and to deal with the fact that they don’t observe a whole collision, they can’t afford to have too many collisions going on at once. ATLAS and CMS have been coping with ten to twenty simultaneous proton-proton collisions; this is part of what is known as “pile-up”. But near LHCb the LHC beams are adjusted so that the number of simultaneous collisions at LHCb is often limited to just one, two or three. This has the downside that the amount of data LHCb collected in 2011 was about 1/5 of what ATLAS and CMS each collected, while for 2012 the number was more like 1/10.  But LHCb can do a number of things to make up for this lower rate; in particular their trigger system is more forgiving than that of ATLAS or CMS, so there are certain things they can measure using data of a sort that ATLAS and CMS have no choice but to throw away.
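To make those 1/5 and 1/10 figures concrete, here’s the back-of-the-envelope arithmetic, using round numbers for the integrated luminosities (my approximate recollection of the public values, so treat them as assumptions):

```python
# Rough integrated luminosities in inverse femtobarns. These are
# approximate round numbers (assumptions for illustration), not
# official figures from the experiments.
lumi = {
    2011: {"LHCb": 1.0, "ATLAS or CMS (each)": 5.0},
    2012: {"LHCb": 2.0, "ATLAS or CMS (each)": 20.0},
}

for year, data in lumi.items():
    ratio = data["LHCb"] / data["ATLAS or CMS (each)"]
    print(f"{year}: LHCb collected ~1/{round(1 / ratio)} "
          f"of what ATLAS or CMS each collected")
```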

Did the LHC Just Rule Out String Theory?!

Over the weekend, someone said to me, breathlessly, that they’d read that “Results from the Large Hadron Collider [LHC] have blown string theory out of the water.”

Good Heavens! I replied. Who fed you that line of rubbish?!

Well, I’m not sure how this silliness got started, but it’s completely wrong. Just in case some of you or your friends have heard the same thing, let me explain why it’s wrong.

First, a distinction — one that is rarely made, especially by the more rabid bloggers, both those who are string lovers and those who are string haters. [Both types mystify me.] String theory has several applications, and you need to keep them straight. Let me mention two.

  1. Application number 1: this is the one you’ve heard about. String theory is a candidate (and only a candidate) for a “theory of everything” — a silly term, if you ask me, for what it really means is “a theory of all of nature’s particles, forces and space-time”. It’s not a theory of genetics or a theory of cooking or a theory of how to write a good blog post. But it’s still a pretty cool thing. This is the theory (i.e. a set of consistent equations and methods that describes relativistic quantum strings) that’s supposed to explain quantum gravity and all of particle physics, and if it succeeded, that would be fantastic.
  2. Application number 2: String theory can serve as a tool. You can use its mathematics, and/or the physical insights that you can gain by thinking about and calculating how strings behave, to solve or partially solve problems in other subjects. (Here’s an example.) These subjects include quantum field theory and advanced mathematics, and if you work in these areas, you may really not care much about application number 1. Even if application number 1 were ruled out by data, we’d still continue to use string theory as a tool. Consider this: if you grew up learning that a hammer was a religious idol to be worshipped, and later you decided you didn’t believe that anymore, would you throw out all your hammers? No. They’re still useful even if you don’t worship them.

BUT: today we are talking about Application Number 1: string theory as a candidate theory of all particles, etc.

Some Weird Twists and Turns

In my last post, I promised you some comments on a couple of other news stories you may have seen.  Promise kept! See below.

But before I go there, I should mention (after questions from readers) an important distinction.  Wednesday’s post was about the simple process by which a Bs meson (a hadron containing a bottom quark and a strange anti-quark, or vice versa, along with the usual crowd of gluons and quark/antiquark pairs) decays to a muon and an anti-muon.  The data currently shows nothing out of the ordinary there.  This is not to be confused with another story, loosely related but with crucially different details. There are some apparent discrepancies (as much as 3.7 standard deviations, but only 2.8 after accounting for the look-elsewhere effect) cropping up in details of the intricate process by which a Bd meson (a hadron containing a bottom quark and a down anti-quark, or vice versa, plus the usual crowd) decays to a muon, an anti-muon, and a spin-one Kaon (a hadron containing a strange quark and a down anti-quark, or vice versa, plus the usual crowd). The measurements made by the LHCb experiment at the Large Hadron Collider disagree, in some but not all features, with the (technically difficult) predictions made using the Standard Model (the equations used to describe the known particles and forces.)

Don’t confuse these two processes! (Top) The process B_s –> muon + anti-muon, covered in Wednesday’s post, agrees with Standard Model predictions. (Bottom) The process B_d –> muon + anti-muon + K* is claimed to deviate by nearly 3 standard deviations from the Standard Model, but (as far as I am aware) the prediction and associated claim has not yet been verified by multiple groups of people, nor has the measurement been repeated.
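For readers curious what “accounting for the look-elsewhere effect” does numerically, here’s a minimal sketch of the standard trick: convert the local significance into a p-value, inflate that p-value by a trials factor (roughly, the number of independent places a random fluctuation could have shown up), and convert back. The trials factor below is an assumption I chose purely so the numbers come out near the quoted ones; the real LHCb analysis is more sophisticated.

```python
# How a "local" significance deflates after the look-elsewhere effect.
# Minimal sketch; the trials factor is an illustrative assumption,
# not the one used in the actual analysis.
from scipy.stats import norm

local_sigma = 3.7
p_local = norm.sf(local_sigma)      # one-sided local p-value

N_trials = 24                       # assumed number of independent "places to look"
p_global = 1.0 - (1.0 - p_local) ** N_trials

global_sigma = norm.isf(p_global)
print(f"local:  {local_sigma:.2f} sigma  (p = {p_local:.2e})")
print(f"global: {global_sigma:.2f} sigma  (p = {p_global:.2e})")
# With ~24 trials, a 3.7 sigma local excess comes out near 2.8 sigma.
```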

A few theorists have even gone so far as to claim this discrepancy is clearly a new phenomenon — the end of the Standard Model’s hegemony — and have gotten some press people to write (very poorly and inaccurately) about their claim.  Well, aside from the fact that every year we see several 3 standard deviation discrepancies turn out to be nothing, let’s remember to be cautious when a few scientists try to convince journalists before they’ve convinced their colleagues… (remember this example that went nowhere? …) And in this case we have them serving as judge and jury as well as press office: these same theorists did the calculation which disagrees with the data.  So maybe the Standard Model is wrong, or maybe their calculation is wrong.  In any case, you certainly mustn’t believe the news article as currently written, because it has so many misleading statements and overstatements as to be completely beyond repair. [For one thing, it's a case study in how to misuse the word "prove".] I’ll try to get you the real story, but I have to study the data and the various Standard Model predictions more carefully first before I can do that with complete confidence.

Ok, back to the promised comments: on twists and turns for neutrinos and for muons…

A Couple of Rare Events

Did you know that another name for Minneapolis, Minnesota is “Snowmass”?  Just ask a large number of my colleagues, who are in the midst of a once-every-few-years exercise aimed at figuring out what should be the direction of the U.S. particle physics program.  I quote:

  • The American Physical Society’s Division of Particles and Fields is pursuing a long-term planning exercise for the high-energy physics community. Its goal is to develop the community’s long-term physics aspirations. Its narrative will communicate the opportunities for discovery in high-energy physics to the broader scientific community and to the government.

They are doing so in perhaps the worst of times, when political attacks on science are growing, government cuts to science research are severe, budgets to fund the research programs of particle physicists like me have been chopped by jaw-dropping amounts (think 25% or worse, from last year’s budget to this year’s — you can thank the sequester)… and all this at a moment when the data from the Large Hadron Collider and other experiments are not yet able to point us in an obvious direction for our future research program.  Intelligent particle physicists disagree on what to do next, there’s no easy way to come to consensus, and in any case Congress is likely to ignore anything we suggest.  But at least I hear Minneapolis is lovely in July and August!  This is the first Snowmass workshop that I have missed in a very long time, which is especially embarrassing since my Ph.D. thesis advisor is one of the conveners.  What can I say?  I wish my colleagues well…!

Meanwhile, I’d like to comment briefly on a few particle physics stories that you’ve perhaps seen in the press over recent days. I’ll cover one of them today — a measurement of a rare process which has now been officially “discovered”, though evidence for it was quite strong already last fall — and address a couple of others later in the week.  After that I’ll tell you about a couple of other stories that haven’t made the popular press…

Higgs Workshop in Princeton

Today I’m attending the first day of a short workshop of particle theorists and experimentalists at the Princeton Center for Theoretical Science, a sort of “Where are we now and where are we going?” meeting. It’s entitled “Higgs Physics After Discovery”, but discussion will surely range more widely.

What, indeed, are the big questions facing particle physics in the short term, meaning the next few months? Well, here are a few key ones:

  • A Higgs particle of some type has been discovered by the ATLAS and CMS experiments at the Large Hadron Collider [LHC] (with some contributions from the Tevatron experiments DZero and CDF); is it the simplest possible type of Higgs particle (the “Standard Model Higgs“) or is it more complex? What data analysis can be done on the LHC’s data from 2011-2012 to shed more light on this question?
  • More generally, from the LHC’s huge data set from 2011-2012 — specifically, from the data analysis that has been done so far — what precisely have we learned? (It’s increasingly important to go beyond the rougher estimates that were appropriate last year when the data was still pouring in.) What types of new phenomena have been excluded, and to what extent?
  • What other types of data analysis should be done on the 2011-2012 data, in order to look for other new phenomena that could still be lurking there? (There’s still a lot to be done on this question!) And what types of work should theoretical particle physicists do to help the experimentalists address this issue?
  • Several experiments from the Tevatron and the LHC, notably the LHCb experiment, have learned that newly measured decays of certain mesons (hadrons with equal numbers of quarks and anti-quarks) that contain heavy quarks are roughly consistent with the Standard Model (the equations we use to describe the known elementary particles and forces, and the simplest type of Higgs field and Higgs particle.) How do these findings constrain the possibility of other new phenomena?
  • Looking ahead to 2015, when the LHC will begin running again at a higher energy per proton-proton collision, what preparations need to be made? Especially, what needs to be done to refine the triggering systems at ATLAS, CMS and LHCb, so that the maximum information can be extracted from the new data, and no important information is unnecessarily discarded?
  • Which, if any, of the multiple (but mostly mutually inconsistent) experimental hints of dark matter should be taken seriously? Which possibilities do the various dark matter experiments, and the LHC’s data, actually exclude or favor?

That might be it for the very near term. There are lots of other questions in the medium- to long-term, among which is the big question of what types of experiments should be done over the next 10 – 20 years. One challenge is that the LHC’s data hasn’t yet given us a clear target other than the Higgs particle itself. An obvious possible experiment to do is to study the Higgs in more detail, using an electron/anti-electron collider — historically this has been a successful strategy that has been used on almost every new apparently-elementary particle. But there are a lot of other possibilities, including raising the LHC’s collisions to even higher energy than we’ll see in 2015, using more powerful magnets currently under development.

If there are other near-term questions I’ve forgotten about, I’m sure I’ll be reminded at the workshop, and I’ll add them in.

Conclusion of the Higgs Symposium

By almost all measures, the Higgs Symposium at the University of Edinburgh, organized by the new Higgs Centre for Theoretical Physics, was a great success.  The only negative was that Professor Peter Higgs himself had a bad cold this week; he had to cancel his talk, and missed the majority of the talks by others.  Obviously all of us in attendance were very disappointed not to hear directly from him, and we wish him a speedy recovery.

Other than this big hole in the schedule, the talks given at the symposium seemed to me to form a coherent summary of where we are right now in our understanding of the Higgs field and particle.  They were full of interesting material, and wonderfully complementary to one another.  This motivates me to try to provide, for non-experts, some future articles on what the conference attendees had to say.  But to write such articles well takes time.  So for now, here’s the quick version summarizing the last few talks, along the lines of the summaries I wrote (here and here) of the earlier talks.  The slides from all the talks are posted here.

Here we go…

It’s (not) The End of the World

The December solstice has come and gone at 11:11 a.m. London time (6:11 a.m. New York time). That’s the moment when the north pole of the Earth points most away from the sun, and the south pole points most toward it. Because it’s followed by a weekend and then Christmas Eve, it marks the end of the 2012 blogging season, barring a major event between now and year’s end. But although 11:11 London time is the only moment of astronomical significance during this day (clearly the universe does not care where humans set our international date line and exactly how we set our time zones, so destruction was never going to be at local midnight — something the media doesn’t seem to get), it obviously wasn’t the end of the world.

A lot of people do put a lot of stock in prophecy, including prophecies of the end of the world that nobody ever made (such as the one not made for today by the Mayans, through their calendar) and others that people made but were wrong (such as those made by Harold Camping last year and by many throughout history who preceded him.) If anyone were any good at prophecy they’d be able to use their special knowledge to become billionaires, so maybe we should be watching Bill Gates and Michael Bloomberg and the Koch brothers and people like that. I haven’t heard any rumors of them building bunkers or spaceships yet. Of course at the end of the year they may get a small tax hike, but that wouldn’t be the end of the world.

The Large Hadron Collider [LHC], meanwhile, has triumphantly reached the end of its first run of proton-proton collisions. Goal #1 of the LHC was to allow physicists at the ATLAS and CMS experiments to discover the Higgs particle, or particles, or whatever took their place in nature; and it would appear that, in a smashing success, they have co-discovered one.  But no Higgs particles, or anything like them, will be produced again until 2015. Although the LHC will run for a short while in early 2013, it will do so in a different mode, smashing not protons but the nuclei of lead atoms together, in order to study the properties of extremely hot and dense matter, under conditions the universe hasn’t seen since the earliest stages of the Big Bang that launched the current era of our universe.  Then it will be closed down for repairs and upgrades.  So until 2015, any additional information we’re going to learn about the Higgs particle, or any other unknown particle that might have been produced at the LHC, is going to be obtained by analyzing the data that has been collected in 2011 and 2012. The total amount of data is huge; what was collected in 2012 was about 4.5 times as much as in 2011, and it was taken at 8 TeV of energy per proton-proton collision rather than 7 TeV as in 2011. I can assure you there will be many new things learned from analyzing that data throughout 2013 and 2014.

Of course a lot of people prophesied confidently that we’d discover supersymmetry, or something else dramatic, very early on at the LHC. Boy, were they wrong! Those of us who were cautioning against such optimistic statements are not sure whether to laugh or cry, because of course it would have been great to have such a discovery early in the LHC program. But there was ample reason to believe (despite what other bloggers sometimes say) that even if supersymmetry exists and is accessible to the LHC experiments, discovering it could take a lot longer than just two years!  For instance, see this paper written in 2006 pointing out that the search strategies being planned for seeking supersymmetry might fail in the presence of a few extra lightweight particles not predicted in the minimal variants of supersymmetry. As far as I can tell at present, this very big loophole has only partly been closed by the LHC studies done up to now. The same loophole applies for other speculative ideas, including certain variants of LHC-accessible extra dimensions. I am hopeful that these loopholes can be closed in 2013 and 2014, with additional analysis on the current data, but until they are, you should be very cautious believing those who claim that reasonable variants of LHC-accessible supersymmetry (meaning “natural variants of supersymmetry that resolve the hierarchy problem”) are ruled out by the LHC experiments. It’s just not true. Not yet. The only classes of theories that have been almost thoroughly ruled out by LHC data are those that predict on general grounds that there should be no observable Higgs particle at all (e.g. classic technicolor).

While we’re on the subject, I’ve been looking back at how I did on prophecy this year. It’s been a remarkably good year, probably my best ever — though admittedly I only made very easy (though not necessarily common) predictions. First, the really easy one:  I assured you, as did most of my colleagues, that 2012 would be the Year of the Higgs — at least, the Year of the Simplest Possible Higgs particle, called the “Standard Model Higgs”. It would be the year when Phase 1 of the Higgs Search would end — when we’d either find a Higgs particle of Standard Model type (or something looking vaguely like it), or, if not, we’d know we’d have to move to a more aggressive search in Phase 2, in which we’d look for more complicated versions of the Higgs particle that would have been much harder to find. We started the year with ambiguous hints of the Higgs particle, too flimsy to be sure of, but certainly tantalizing, at around a mass of 125 GeV/c². In July the hints turned into a discovery — somewhat faster than expected for a Standard Model Higgs particle, because the rate for this particle to appear in collisions that produce two photons was higher than anticipated. The excess in the photon signal means either the probability for the Higgs particle to decay to photons is larger than predicted for a Higgs of Standard Model type, or both CMS and ATLAS experienced a fortunate statistical fluctuation that made the discovery easier. We still don’t know which it was; though we’ll know more by March, this ambiguity may remain with us until 2015.

One prophecy I made all the way back at the beginning of this blog, July 2011, was that the earliest search strategy for the Higgs, through its decays to a lepton, anti-lepton, neutrino and anti-neutrino, wouldn’t end up being crucial in the discovery; it was just too difficult. (In this experimental context, “lepton” refers only to “electron” or “muon”; taus don’t count, for technical reasons.) In the end, I said, it would be decays of the Higgs to two photons and to two lepton/anti-lepton pairs that would be the critical ones, because they would provide a clean signal that would be uncontroversial. And that prophecy was correct; the photon-based and lepton-based searches were the signals that led to discovery.

Now we’ve reached December, and the data seems to imply that except possibly for this overabundance of photons, which still tantalizes us, the various measurements of how the Higgs-like particle is produced and decays are starting to agree, to a precision which is still only moderate, with the predictions of the Standard Model for a Higgs of this mass. Fewer and fewer experts are still suggesting that this is not a Higgs particle. But it will be some years yet — 2018 or later — before measurements are precise enough to start convincing people that this Higgs particle is really of Standard Model type. Many variants of the Standard Model, with new particles and forces, predict that the difference of the real Higgs from a Standard Model Higgs may be subtle, with deviations at the ten percent level or even less. Meanwhile, other Higgs-like particles, with different masses and different properties, might be hiding in the data, and it may take quite a while to track them down. Many years of data collecting and data analysis lie ahead, in Phase 2 of the Higgs search.

Another prophecy I made at the beginning of the year was that Exotic Decays of the Higgs would be a high priority for 2012. You might think this prophecy was wrong, because in fact, so far, there have been very few searches at ATLAS, CMS and LHCb for such decays. But the challenge that required prioritizing these decays wasn’t data analysis; it was the problem of even collecting the data. The problem is that many exotic decays of the Higgs would lead to events that might not be selected by the all-important trigger system that determines which tiny fraction of the LHC’s collisions to store permanently for analysis! At the beginning of 2012 there was a risk that some of these processes would have been dumped by the trigger and irretrievably lost from the 2012 data, making future searches for such decays impossible or greatly degraded. At a hadron collider like the LHC, you have to think ahead! If you don’t consider carefully the analyses you’ll want to do a year or two from now, you may not set the trigger properly today. So although the priority for data analysis in 2012 was to find the Higgs particle and measure its bread-and-butter properties, the fact that the Higgs has come out looking more or less Standard Model-like in 2012 means that focusing on exotic possibilities, including exotic decays, will be one of the obvious places to look for something new, and thus a very high priority for data analysis, in 2013 and 2014. And that’s why, for the trigger — for the collection of the data — exotic decays were a very high priority for 2012. Indeed, one significant use of the new strategy of delayed data streaming at ATLAS and of data parking at CMS (two names for the same thing) was to address this priority. [My participation in this effort, working with experimentalists and with several young theorists, was my most rewarding project of 2012.]  As I explained to you, a Higgs particle with a low mass, such as 125 GeV/c², is very sensitive to the presence of new particles and forces that are otherwise very difficult to detect, and it easily could exhibit one or more types of exotic decays.  So there will be a lot of effort put into looking for signs of exotic decays in 2013 and 2014! I’m very excited about all the work that lies ahead of us.
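If the trigger-versus-parking distinction is unfamiliar, here’s a schematic toy of the idea (entirely my own cartoon, bearing no resemblance to the real ATLAS or CMS trigger code): the main trigger keeps only events passing tight thresholds for prompt analysis, while a “parked” stream records some looser events to tape, to be reconstructed later when computing becomes available. Everything else is gone forever.

```python
# A schematic toy of triggering plus data parking. This is a cartoon
# of the concept only -- real trigger systems are vastly more complex.
import random

random.seed(0)
MAIN_THRESHOLD = 60.0   # tight threshold (arbitrary units): analyze promptly
PARK_THRESHOLD = 30.0   # looser threshold: record now, reconstruct later

main_stream, parked_stream, discarded = [], [], 0

for _ in range(100_000):                   # collisions arriving in a stream
    signal = random.expovariate(1 / 15.0)  # toy "trigger quantity" per event
    if signal > MAIN_THRESHOLD:
        main_stream.append(signal)         # kept and reconstructed promptly
    elif signal > PARK_THRESHOLD:
        parked_stream.append(signal)       # kept, but set aside ("parked")
    else:
        discarded += 1                     # lost forever; you can't re-run the LHC

print(f"analyzed now: {len(main_stream)},  parked: {len(parked_stream)},  "
      f"discarded: {discarded}")
# The point: an exotic decay that only passes the looser threshold
# survives in the parked stream -- but only if you planned for it.
```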

Now, the prophecy I’d like to make, but cannot — because I do not have any special insight into the answer — is on the question of whether the LHC will make great new discoveries in the future, or whether the LHC has already made its last discovery: a Higgs particle of Standard Model type. Even if the latter is the case, we will need years of data from the LHC in order to distinguish these two possibilities; there’s no way for us to guess. It’s clear that Nature’s holding secrets from us.  We know the Standard Model (the equations we use to describe all the known particles and forces) is not a complete theory of nature, because it doesn’t explain things like dark matter (hey, were dark matter particles perhaps discovered in 2012?), and it doesn’t tell us why, for example, there are six types of quarks, or why the heaviest quark has a mass that is more than 10,000 times larger than the mass of the lightest quarks, etc. What we don’t know is whether the answers to those secrets are accessible to the LHC; does it have enough energy per collision, and enough collisions, for the job?  The only way to find out is to run the LHC, and to dig thoroughly through its data for any sign of anything amiss with the predictions of the Standard Model. This is very hard work, and it will take the rest of the decade (but not until the end of the world.)

In the meantime, please do not fret about the quiet in the tunnel outside Geneva, Switzerland. The LHC will be back, bigger and better (well, at least with more energy per collision) in 2015. And while we wait during the two year shutdown, the experimentalists at ATLAS, CMS, and LHCb will be hard at work, producing many new results from the 2011 and 2012 proton collision data! Even the experiments CDF and DZero from the terminated Tevatron are still writing new papers. In short, fear not: not only isn’t the December solstice of 2012 the end of the world, it doesn’t even signal a temporary stop to the news about the Higgs particle!

—-

One last personal note (just for those with some interest in my future.)

Details Behind Last Week’s Supersymmetry Story

Last week, I promised you I’d fill in the details of my statement that the recent measurement (of the rare process in which a Bs meson decays to a muon and an anti-muon — read here for the physics behind this process) by the LHCb experiment at the Large Hadron Collider [LHC] had virtually no effect on the constraints on any speculative theories, including supersymmetry, contrary to the statements in the press and by a certain LHCb member. Today I’m providing you with some sources for this statement.

A number of my colleagues have tasked themselves with keeping track of how measurements at the Large Hadron Collider and elsewhere are affecting certain subclasses of variants of supersymmetry. They call themselves the “MasterCode Project”; here’s their website. They’re not the only ones looking at this, but among them is Professor Gino Isidori, whom I was talking to last week, so I’ve gotten this information from him. I quote from the MasterCode website regarding last week’s result from LHCb: “The new measurement provides a valuable new constraint on the supersymmetric parameter space, but the observation of a Standard Model-like branching fraction for the Bs→μ+μ- decay is quite consistent with supersymmetry. In fact, a Standard Model-like branching fraction of this decay was expected in constrained supersymmetric models like the CMSSM or NUHM1 (see, e.g., the recent MasterCode results for further details). As a result, the favoured regions in the parameter space of these models do not change significantly after the inclusion of the new constraint.”

Now before I explain what this means, it’s important to have some terminology, running from most general to most specific:

  • supersymmetry: the general idea that every known type of particle has a (so far unobserved) superpartner particle;
  • TeV-scale supersymmetry: variants of supersymmetry in which at least some of the superpartners are light enough to be produced at the LHC;
  • the MSSM (Minimal Supersymmetric Standard Model): the simplest large class of TeV-scale supersymmetry variants;
  • the CMSSM and NUHM1: small subclasses of the MSSM, in which further simplifying assumptions are made.

Keep in mind that

  • ruling out the CMSSM or NUHM1 does not mean that the MSSM is ruled out;
  • ruling out the MSSM does not mean that supersymmetry at the TeV scale is ruled out;
  • ruling out supersymmetry at the TeV scale does not mean that supersymmetry is ruled out.

Among the many goals of the LHC is to find or rule out supersymmetry at the TeV scale. (It cannot hope to rule out supersymmetry altogether; that would presumably require a vastly more powerful collider that won’t likely be built for centuries, if ever.) It’s not enough to rule out the CMSSM, or the NUMH1, or even the MSSM. Similar statements apply for other speculative ideas that propose as yet unknown particles and forces; it’s not enough for the LHC to rule out just the simplest variants of these ideas.

Now if it turns out that supersymmetry is part of nature, rather few of my colleagues expect the variant we find to be contained within the CMSSM or NUHM1; and personally (though I’m probably in the minority) I have long doubted that it would be contained within the MSSM. Nevertheless, it is instructive to look at how LHC data is impacting the CMSSM and the NUHM1 subclasses of supersymmetry variants. One just has to be careful not to over-interpret; the exclusion of most variants in the CMSSM is not an indication that most variants of TeV-scale supersymmetry as a whole are excluded.

Now in this context, let’s see how the new measurement that was announced last week affects the CMSSM and the NUHM1. In Figure 1 is a plot showing the allowed variants of the CMSSM and the NUHM1, as a function of two quantities: on the horizontal axis, MA, which if large is (approximately) the mass of four of the five Higgs particles in the MSSM, and on the vertical axis, tan β, the ratio of the values of the two non-zero Higgs fields that are required in the MSSM. In solid red and solid blue are the one-standard-deviation and two-standard-deviation allowed regions after the new LHCb measurement is accounted for; any variant of the theory not sitting inside the blue region is excluded by the data. The dashed bands show the same thing before the new LHCb measurement. Since the dashed and solid blue bands are right on top of each other, you see there’s almost no effect at all. That’s what was behind my claim last Friday.

Fig. 1: Constraints on the CMSSM (left) and NUHM1 (right) subclasses of supersymmetric theories, before and after the HCP conference of last week. The quantities on the horizontal and vertical axes are explained in the text. In both plots: solid red (blue) give the constraints at one (two) standard deviations; variants outside the blue curve are excluded. Dashed red (blue) are the same limits before the new LHCb measurement. Notice there is almost no change.
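In case it helps to see why this particular decay probes the MA–tan β plane at all: a standard approximate result (not specific to the MasterCode analysis) is that in the MSSM at large tan β, the supersymmetric, Higgs-mediated contribution to this decay rate grows very rapidly with tan β and falls with MA, roughly as

BR(Bs→μ⁺μ⁻)_SUSY ∝ m_b² m_μ² tan⁶β / M_A⁴

so a measurement that agrees with the Standard Model disfavors precisely the large-tan β, small-MA corner of these plots.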

But please, don’t misinterpret what I’m saying (or my colleagues) as suggesting that the LHC’s data has had no impact on the list of possible variants of supersymmetry! Far from it! Many variants are excluded, and many popular (but not necessarily more likely) subclasses of variants of supersymmetry have been pushed into regions that many would consider corners. The only statement in Figure 1 is that the new LHCb measurement didn’t make these corners smaller.  But to see how things have changed since before the LHC began, look at Figure 2, which shows how the LHC as a whole — all the measurements from LHCb, ATLAS and CMS taken together — has affected the CMSSM and NUHM1 since 2009. (The CMSSM and NUHM1 also make assumptions about where dark matter comes from, so even effects of the dark matter measurements from the XENON100 experiment are included here.)

Fig. 2: as in Figure 1, except that the dashed lines give the constraints on the CMSSM and NUHM1 before the LHC began taking data, and the solid line gives the constraints after the data taken through early summer 2011 was analyzed. Notice the scale on the horizontal axis is different from that of Figure 1.

Figure 2 is a similar plot to Figure 1 — but this time, solid blue and red indicate the impact of LHC data as of summer 2011, and the dashed blue and red indicate the situation before the LHC started. Now compare the dashed blue line in Figure 2 (before the LHC) with the solid blue line in Figure 1 (now); note the scale on the horizontal axis is different! You’ll see that in the CMSSM it was possible before the LHC to have MA as low as 350 GeV/c², but now it has to be over 900 GeV/c², which many would consider a rather high value. In the NUHM1 there’s been a similar shift from 150 to about 300 GeV/c², not yet so high but still a significant increase. And meanwhile, while almost any value of tan β from 2 up to 60 was allowed before the LHC, this number is now limited to a smaller range. For example, if MA were below 900 GeV/c², then the CMSSM would be excluded and the NUHM1 would be allowed only for tan β < 30 or so.  This upper limit on tan β is mainly caused by the similar LHCb measurement presented back in March (and mentioned by me on Friday), and by similar ones from the CMS, CDF and ATLAS experiments.

But clearly there are plenty of variants within the NUHM1 that remain viable.  And the NUHM1 is not representative of the full range of possibilities within the MSSM, so even if the NUHM1 were excluded, we’d still have a long way to go to exclude the MSSM, much less all of TeV-scale supersymmetry. In short, it’s neither all nor nothing. Yes, a lot of progress has been made; LHC data (and data from other sources) have ruled out a lot of variants of TeV-scale supersymmetry.  But no, we’re not yet close to ruling out the full range of variants.

Please note that I’m not telling you this because I’m some devotee of supersymmetry who believes deeply in his heart that we’ll someday find it, and is trying to persuade you not to give up. I’m just laying out for you the facts on the ground. Do you imagine that I’m happy that a long, painful slog lies ahead, during which particle physicists — theorists and experimentalists — will painstakingly cover all the possible variants of supersymmetry, and slowly but surely determine whether or not supersymmetry is absent at the TeV scale? Don’t you think my life and that of my colleagues would be a lot easier if we could snap our fingers and with one or two quick measurements settle the question of whether supersymmetry is a fact of nature or not? Unfortunately, things don’t work that way.  You should simply ignore the irresponsible grand statements you will see in the press and on various blogs; indeed, sweeping remarks are a sign of careless thinking, and you should beware. The truth is that only through very hard work — by the experts who make the measurements, by those who advise them on which measurements to make, and by those who do the calculations that are the ingredients for studies like the MasterCode Project — can we hope to settle profound questions about nature.