Category Archives: LHC News

Visiting the Host Lab of the Large Hadron Collider

Greetings from Geneva, and CERN, the laboratory that hosts the Large Hadron Collider [LHC], where the Higgs particle was found by the physicists at the ATLAS and CMS experiments. Between jet lag, preparing a talk for Wednesday, and talking to many experimental and theoretical particle physicists from morning till night, it will be a pretty exhausting week.

The initial purpose of this trip is to participate in a conference held by the LHCb experiment, entitled “Implications of LHCb measurements and future prospects.” Its goal is to bring theoretical particle physicists and LHCb experimenters together, to exchange information about what has been and what can be measured at LHCb.

On this website I’ve mostly written about ATLAS and CMS, partly because LHCb’s measurements are often quite subtle to explain, and partly because the Higgs particle search, the highlight of the early stage of the LHC, was really ATLAS’s and CMS’s task. But this week’s activities give me a nice opportunity to put the focus on this very interesting experiment, which is quite different from ATLAS and CMS both in its design and in its goals, and to explain its important role.

ATLAS and CMS were built as general purpose detectors, whose first goal was to find the Higgs particle and whose second was to find (potentially rare) signs of any other high-energy processes that are not predicted by the Standard Model, the equations we use to describe all the known particles and forces of nature. Crudely speaking, ATLAS and CMS are ideal for looking for new phenomena in the 100 to 5000 GeV energy range (though we won’t reach the upper end of that range until 2015 and beyond).

LHCb, by contrast, was built to study in great detail the bottom and charm quarks, and the hadrons (particles made from quarks, anti-quarks and gluons) that contain them. These quarks and their antiquarks are produced in enormous abundance at the LHC. They and the hadrons that contain them have masses in the 1.5 to 10 GeV/c² range… not much heavier than protons, and much lower than what ATLAS and CMS are geared to study. And this is why LHCb has been making crucial high-precision tests of the Standard Model using bottom- and charm-containing hadrons.  (Crucial, but not, despite repeated claims by the LHCb press office, capable of ruling out supersymmetry, which no single measurement can possibly do.)

Although this is the rough division of labor among these experiments, it would be too simplistic to leave it at that. ATLAS and CMS can do quite a lot of physics in the low-mass range, and in some measurements can compete well with LHCb. Less well known is that LHCb may be able to do a small but critical set of measurements at higher energies than its usual targets.

LHCb is very different from ATLAS and CMS in many ways, and the most obvious is its shape. ATLAS and CMS look like giant barrels centered on the location of the proton-proton collisions, and are designed to measure as many particles as possible that are produced in the collision of two protons. LHCb’s shape is more like a wedge, with one end surrounding the collision point.

Left: Cut-away drawing of CMS, which is shaped like a barrel with proton-proton collisions occurring at its center. ATLAS’s shape is similar. Right: Cut-away drawing of LHCb, which is shaped something like a wedge, with collisions occurring at one end.

This shape only allows it to measure those particles that go in the “forward” direction — close to the direction of one of the proton beams. (“Backward” would be near the other beam; the distinction between forward and backward is arbitrary, because the two proton beams have the same properties. “Central” would be far from either beam.) Unlike ATLAS and CMS, LHCb is not used to reconstruct the whole collision; many of the particles produced in the collision go into backward or central regions which LHCb can’t observe. This has some disadvantages, and in particular it put LHCb out of the running for the Higgs discovery. But a significant fraction of the bottom and charm quarks produced in proton-proton collisions go “forward” or “backward”, so a forward-looking design is fine if it’s bottom and charm quarks you’re interested in. And such a design is a lot cheaper, too. It also means that LHCb is well positioned to make some other measurements where the forward direction is important. I’ll give you one or two examples later in the week.

To make its measurements of bottom and charm quarks, LHCb makes use of the fact that these quarks decay after about a trillionth of a second (a picosecond) [or longer if, as is commonly the case, there is significant time dilation due to Einstein's relativity effects on very fast particles].  This is long enough for them to travel a measurable distance — typically a millimeter or more. LHCb is designed to make measurements of charged particles with terrific precision, allowing its physicists to infer a slight difference between the proton-proton collision point, from which most low-energy charged particles will emerge, and the location where some other charged particles may have been produced in the decay of a bottom hadron or some other particle that travels a millimeter or more before decaying. The ability to do precision “tracking” of the charged particles makes LHCb sensitive to the presence of any as-yet unknown particles that might be produced and then decay after traveling a small or moderate distance. More on that later in the week.
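
If you’d like a feel for the numbers, here’s a quick back-of-envelope calculation, a sketch of my own in Python; the lifetime, mass and momentum are illustrative round values, not numbers from any actual LHCb analysis.

  # Mean decay length of a b hadron: L = beta * gamma * c * tau.
  # All input numbers below are illustrative, not from a real analysis.
  c_mm_per_s = 3.0e11    # speed of light, in millimeters per second
  tau = 1.5e-12          # typical b-hadron lifetime: about 1.5 picoseconds
  mass = 5.3             # B meson mass, in GeV/c^2
  momentum = 50.0        # assumed b-hadron momentum, in GeV/c

  beta_gamma = momentum / mass                       # relativistic boost, p/(mc)
  mean_decay_length = beta_gamma * c_mm_per_s * tau  # in millimeters

  print(f"mean decay length: {mean_decay_length:.1f} mm")   # about 4 mm

A few millimeters: tiny, but comfortably larger than the precision of modern charged-particle tracking, which is the whole point.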

A computer reconstruction of the tracks in a proton-proton collision, as measured by LHCb. Most tracks start at the proton-proton collision point at left, but the two tracks drawn in purple emerge from a different point about 15 millimeters away, the apparent location of the decay of a hadron, whose inferred trajectory is the blue line, and whose mass (measured from the purple tracks) indicates that it contained a bottom quark.

One other thing to know about LHCb: in order to make their precise measurements possible, and to deal with the fact that they don’t observe a whole collision, they can’t afford to have too many collisions going on at once. ATLAS and CMS have been coping with ten to twenty simultaneous proton-proton collisions; this is part of what is known as “pile-up”. But at LHCb the LHC beams are adjusted so that there are often just one, two or three simultaneous collisions. This has the downside that the amount of data LHCb collected in 2011 was about 1/5 of what ATLAS and CMS each collected, while for 2012 the number was more like 1/10. But LHCb can do a number of things to make up for this lower rate; in particular their trigger system is more forgiving than that of ATLAS or CMS, so there are certain things they can measure using data of a sort that ATLAS and CMS have no choice but to throw away.

Did the LHC Just Rule Out String Theory?!

Over the weekend, someone said to me, breathlessly, that they’d read that “Results from the Large Hadron Collider [LHC] have blown string theory out of the water.”

Good Heavens! I replied. Who fed you that line of rubbish?!

Well, I’m not sure how this silliness got started, but it’s completely wrong. Just in case some of you or your friends have heard the same thing, let me explain why it’s wrong.

First, a distinction — one that is rarely made, especially by the more rabid bloggers, both string lovers and string haters. [Both types mystify me.] String theory has several applications, and you need to keep them straight. Let me mention two.

  1. Application number 1: this is the one you’ve heard about. String theory is a candidate (and only a candidate) for a “theory of everything” — a silly term, if you ask me, for what it really means is “a theory of all of nature’s particles, forces and space-time”. It’s not a theory of genetics or a theory of cooking or a theory of how to write a good blog post. But it’s still a pretty cool thing. This is the theory (i.e. a set of consistent equations and methods that describes relativistic quantum strings) that’s supposed to explain quantum gravity and all of particle physics, and if it succeeded, that would be fantastic.
  2. Application number 2: String theory can serve as a tool. You can use its mathematics, and/or the physical insights that you can gain by thinking about and calculating how strings behave, to solve or partially solve problems in other subjects. (Here’s an example.) These subjects include quantum field theory and advanced mathematics, and if you work in these areas, you may really not care much about application number 1. Even if application number 1 were ruled out by data, we’d still continue to use string theory as a tool. Consider this: if you grew up learning that a hammer was a religious idol to be worshipped, and later you decided you didn’t believe that anymore, would you throw out all your hammers? No. They’re still useful even if you don’t worship them.

BUT: today we are talking about Application Number 1: string theory as a candidate theory of all particles, etc.

A Discrepancy to Keep an Eye On

Today (as I sit in a waiting room for jury service) I’ll draw your attention to something that has been quite rare at the Large Hadron Collider [LHC]: a notable discrepancy between prediction and data.  (Too rare, in fact — when you make so many measurements, some of them should be discrepant; the one place we saw plenty of examples was in the search for and initial study of the Higgs particle.)  It’s not big enough to declare as a definite challenge to the Standard Model (the equations we use to describe the known particles and forces), but it’s one we’ll need to be watching… and you can bet there will be dozens of papers trying to suggest possibilities for what this discrepancy, if it is real, might be due to.
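
To see why some discrepancies are expected, consider the chance that at least one of N independent measurements fluctuates by 3 standard deviations just by accident. A tiny sketch of my own (the values of N are invented round numbers):

  import math

  # two-sided p-value of a 3-sigma Gaussian fluctuation: about 0.0027
  p3 = math.erfc(3.0 / math.sqrt(2.0))

  for n_measurements in (10, 100, 1000):
      p_any = 1.0 - (1.0 - p3) ** n_measurements   # at least one 3-sigma outlier
      print(f"{n_measurements:5d} measurements: "
            f"P(at least one 3-sigma fluke) = {p_any:.0%}")

With a thousand measurements, a 3-sigma “discrepancy” somewhere is all but guaranteed (the chance is over 90%), which is why no single one of them should cause too much excitement.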

The discrepancy has arisen in the search at the CMS experiment for “multileptons”: for proton-proton collisions in which at least three charged leptons — electrons, muons and (to a degree) taus — were produced. Such events are a good place to look for new phenomena: very rare in the Standard Model, but in the context of some speculative ideas (including the possibility of additional types of Higgs particles, or of superpartner particles from supersymmetry, or new light neutral particles that decay sometimes to lepton/anti-lepton pairs, etc.) they can be produced in the decays of some unknown type of particle.

Final Day of SEARCH 2013

Day 3 of the SEARCH workshop (see here for an introduction and overviews of Day 1 and Day 2) opened with my own talk, entitled “On The Frontier: Where New Physics May Be Hiding”. The issue I was addressing is this:

Even though dozens of different strategies have been used by the experimenters at ATLAS and CMS (the two general purpose experiments at the Large Hadron Collider [LHC]) to look for various types of new particles, there are still many questions that haven’t been asked and many aspects of the data that haven’t been studied. My goal was to point out a few of these unasked or incompletely asked questions, ones that I think are very important for ATLAS and CMS experts to investigate… both in the existing data and also in the data that the LHC will start producing, with a higher energy per proton-proton collision, in 2015.

I covered four topics — I’ll be a bit long-winded here, so just skip over this part if it bores you.

1. Non-Standard-Model (or “exotic”) Higgs Decays: a lightweight Higgs particle, such as the one we’ve recently discovered, is very sensitive to novel effects, and can reveal them by decaying in unexpected ways. One class of possibilities, studied by a very wide range of theorists over the past decade, is that the Higgs might decay to unknown lightweight particles (possibly related in some way to dark matter). I’ve written about these possible Higgs decays a lot (here, here, here, here, here, here and here). This was a big topic of mine at the last SEARCH workshop, and is related to the issue of data parking/delaying. In recent months, a bunch of young theorists (with some limited help and advice from me) have been working to write an overview article, going systematically through the most promising non-Standard-Model decay modes of the Higgs, and studying how easy or difficult it will be to measure them. Discoveries using the 2011-2012 data are certainly possible! And at least at CMS, the parked data is going to play an important role.

2. What Variants of “Natural” Supersymmetry (And Related Models) Are Still Allowed By ATLAS and CMS Searches? A natural variant of supersymmetry (see my discussion of “naturalness”=genericity here) is one in which the Higgs particle’s mass and the Higgs field’s value (and therefore the W and Z particles’ masses) wouldn’t change drastically if you were somehow to vary the masses of superpartner particles by small amounts. Such variants tend to have the superpartner particle of the Higgs (called the “Higgsino”) relatively light (a few hundred GeV/c² or below), the superpartner of the top quark (the “top squark”, with which the Higgs interacts very strongly) also relatively light, and the superpartner of the gluon (the “gluino”) up in the 1-2 TeV range. If the gluino is heavier than 1.4 TeV or so, then it is too heavy to have been produced during the 2011-2012 LHC run; for variants with such a heavy gluino, we may have to wait until 2015 and beyond to discover or rule them out. But it turns out that if the gluino is light enough (generally a bit above 1 TeV/c²) it is possible to make very general arguments, without resort to the three assumptions that go into the most classic searches for supersymmetry, that almost all such natural and currently accessible variants are now ruled out. I say “almost” because there is at least one class of important exceptions where the case is clearly not yet closed, and for which the gluino mass could be well below 1 TeV/c². [Research to completely characterize the situation is still in progress; I'm working on it with Rutgers faculty member David Shih and postdocs Yevgeny Kats and Jared Evans.] What we’ve learned is applicable beyond supersymmetry to certain other classes of speculative ideas.

3. Long-Lived Particles: In most LHC studies, it is assumed that any currently unknown particles that are produced in LHC collisions will decay in microscopic times to particles we know about. But it is also possible that one or more new types of particles will decay only after traveling a measurable distance (about 1 millimeter or greater) from the collision point. Searching for such “long-lived” particles (with lifetimes longer than a trillionth of a second!) is complicated; there are many cases to consider, a non-standard search strategy is almost always required, and sometimes specialized trigger strategies are needed. Until recently, only a few studies had been carried out, many with only 2011 data. A very important advance occurred very recently, however, when CMS produced a study, using the full 2011-2012 data set, looking for a long-lived particle that decays to two jets (or to anything that looks to the detector like two jets, which is a bit more general) after traveling up to a large fraction of a meter. The specialized trigger that was used requires about 300 GeV of energy or more to be produced in the proton-proton collision in the form of jets (or things that look like jets to the triggering system). This is too much for the search to detect a Higgs particle decaying to one or two long-lived particles, because a Higgs particle’s mass-energy [E=mc² energy] is only 125 GeV, and it is therefore rather rare for 300 GeV of energy in jets-et-al to be observed when a Higgs is produced. But in many speculative theories with long-lived particles, this amount of energy is easily obtained. As a result, this new CMS search clearly wipes out, at one stroke, many variants of a number of speculative models. It will take theorists a little while to fully understand the impact of this new search, but it will be big. Still, it’s by no means the final word. We need to push harder, improving and broadening the use of these methods, so that decays of the Higgs itself to long-lived particles can be searched for. This has been done already in a handful of cases (for example if the long-lived particle decays not to jets but to a muon/anti-muon pair or an electron/positron pair, or if the long-lived particle travels several meters before it decays) and in some cases it is already possible to show that at most 1 in 100 to 1000 Higgs particles produce long-lived particles of this type. For some other cases, the triggers developed for the parked data may be crucial.

4. “Soft” Signals: A frontier that has never been explored, but which theorists have been talking about for some years, is one in which a high-energy process associated with a new particle is typically accompanied by an unusually large number of very low-energy particles (typically photons or hadrons with energy below a few GeV). The high-energy process is mimicked by certain common processes that occur in the Standard Model, and consequently the signal is drowned out, like a child’s voice in a crowded room. But the haze of a large number of low-energy particles that accompanies the signal is rare in the mimicking processes, so by keeping only those collisions that show something like this haze, it becomes possible to throw out the mimicking process most of the time, making the signal stand out — as though, in trying to find the child, one could identify a way to get most of the people to leave the room, reducing the noise enough for the child’s voice to be heard. [For experts: The most classic example of this situation arises in certain types of objects called "quirks", though perhaps there are other examples. For non-experts: I'll explain what quirks are some other time; it's a sophisticated story.]
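
If you’d like to see the logic of the crowded room in action, here is a toy simulation of my own; the multiplicities, sample sizes and the cut are all invented for illustration, and no real analysis is this simple.

  import math
  import random

  random.seed(1)

  def poisson(mean):
      """Draw a Poisson-distributed count (Knuth's simple method)."""
      limit, k, p = math.exp(-mean), 0, 1.0
      while p > limit:
          k += 1
          p *= random.random()
      return k - 1

  # Invented numbers: signal events carry a haze of ~30 soft particles,
  # while the mimicking Standard Model process carries only ~5.
  signal = [poisson(30.0) for _ in range(1000)]
  background = [poisson(5.0) for _ in range(100000)]

  cut = 15   # keep only events with a pronounced soft haze
  kept_s = sum(n >= cut for n in signal) / len(signal)
  kept_b = sum(n >= cut for n in background) / len(background)
  print(f"signal kept: {kept_s:.0%};  mimicking background kept: {kept_b:.3%}")

Nearly all of the signal survives the cut, while almost none of the mimicking process does: most of the crowd has left the room, and the child’s voice stands out.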

I was pleased that there was lively discussion on all of these four points; that’s essential for a good workshop.

After me there were talks by ATLAS expert Erez Etzion and CMS’s Steve Wurm, surveying a large number of searches for new particles and other phenomena by the two experiments. One new result that particularly caught my eye was a set of CMS searches for new very heavy particles that decay to pairs of W and/or Z particles.  The W and Z particles go flying outwards with tremendous energy, and form the kind of jet-like objects I mentioned yesterday in the context of Jesse Thaler’s talk on “jet substructure”.  This and a couple of other related measurements are reflective of our moving into a new era, in which detection of jet-like W and Z particles and jet-like top quarks has become part of the standard toolbox of a particle physicist.

The workshop concluded with three hour-long panel discussions:

  1. on the possible interplay between dark matter and LHC research (for instance: how production of “friends” of dark matter [i.e., particles that are somehow related to dark matter particles] may be easier to detect at the LHC than production of dark matter itself)
  2. on the highest priorities for the 2013-2014 shutdown period before the LHC restarts (for instance, conversations between theorists and experimentalists about the trigger strategies that should be used in the next LHC run)
  3. on what the opportunities of the 2015-2020 run of the LHC are likely to be, and what their implications may be (for instance, the ability to finally reach the 3 TeV/c² mass range for the types of particles one would expect in the so-called “Randall-Sundrum” class of extra-dimensions models; the opportunities to look for very rare Higgs, top and W decays; and the potential to complete the program I outlined above of ruling out all but a very small class of natural variants of supersymmetry).

All in all, a useful workshop — but its true value will depend on how much we all follow up on what we discussed.

SEARCH Day 2

Day 2 of the SEARCH workshop will get a shorter description than it deserves, because I’ve had to spend time finishing my own talk for this morning. But there were a lot of nice talks, so let me at least tell you what they were about.

Both ATLAS and CMS presented their latest results on searches for supersymmetry. (I should remind you that “searches for supersymmetry” are by no means actually limited to supersymmetry — they can be used to discover or exclude many other new particles and forces that have nothing to do with supersymmetry at all.) Speakers Pascal Pralavorio and Sanjay Padhi gave very useful overviews of the dozens of searches that have been done so far as part of this effort, including a few rather new results that are very powerful. (We should see even more appear at next week’s Supersymmetry conference.) My short summary: almost everything easy has been done thoroughly; many challenging searches have also been carried out; if superpartner particles are present, they’re either

  • so heavy that they aren’t produced very often (e.g. gluinos)
  • rather lightweight, but still not so often produced (e.g. top squarks, charginos, neutralinos, sleptons)
  • produced often, but decaying in some way that is very hard to detect (e.g. gluinos decaying only to quarks, anti-quarks and gluons)

Then we had a few talks by theorists. Patrick Meade talked about how unknown particles that are affected by weak nuclear and electromagnetic forces, but not by strong nuclear forces, could give signs that are hiding underneath processes that occur in the Standard Model. (Examples of such particles are the neutralinos and charginos or sleptons of supersymmetry.) To find them requires increased precision in our calculations and in our measurements of processes where pairs of W and/or Z and/or Higgs particles are produced. As a definite example, Meade noted that the rate for producing pairs of W particles disagrees somewhat with current predictions based on the Standard Model, and emphasized that this small disagreement could be due to new particles (such as top squarks, or sleptons, or charginos and neutralinos) although at this point there’s no way to know.

Matt Reece gave an analogous talk about spin-zero quark-like particles that do feel strong nuclear forces, the classic example of which are top squarks. Again, the presence of these particles can be hidden underneath the large signals from production of top quark/anti-quark pairs, or other common processes. ATLAS and CMS have been working hard to look for signals of these types of particles, and have made a lot of progress, but there are still quite a few possible signals that haven’t been searched for yet. Among other things, Reece discussed some methods invented by theorists that might be useful in contributing to this effort. As with the previous talk, the key to a complete search will be improvements in calculations and measurements of top quark production, and of other processes that involve known particles.

After lunch there was a more general discussion about looking for supersymmetry, including conversation about what variants of supersymmetry haven’t yet been excluded by existing ATLAS and CMS searches.  (I had a few things to say about that in my talk, but more on that tomorrow.)

Jesse Thaler gave a talk reviewing the enormous progress that has been made in understanding how to distinguish ordinary jets, which arise from quarks and gluons, from jet-like objects made from a single high-energy W, Z, Higgs or top quark that decays to quarks and anti-quarks. (The jargon is that the trick is to use “jet substructure” — the fact that inside a jet-like W are two sub-jets, each from a quark or anti-quark.) At SEARCH 2012, the experimenters showed very promising though preliminary results using a number of new jet substructure methods that had been invented by (mostly) theorists. By now, the experimenters have shown definitively that these methods work — and will continue to work as the rate of collisions at the LHC grows — and have made a number of novel measurements using them. Learning how to use jet substructure is one of the great success stories of the LHC era, and it will continue to be a major story in coming years.
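
For the curious, the kinematic heart of the trick fits in a few lines: if a jet-like object really is a W, the invariant mass of the two subjets inside it should come out near 80 GeV/c². Here is a sketch of my own (the subjet momenta are invented, and real taggers are vastly more sophisticated):

  import math

  def subjet_pair_mass(pt1, eta1, phi1, pt2, eta2, phi2):
      """Invariant mass of two (nearly massless) subjets:
      m^2 = 2 * pt1 * pt2 * (cosh(eta1 - eta2) - cos(phi1 - phi2))."""
      m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
      return math.sqrt(max(m2, 0.0))

  # two hypothetical subjets found inside one fat, jet-like object
  m = subjet_pair_mass(200.0, 0.10, 1.00, 150.0, 0.40, 1.35)
  print(f"subjet-pair mass: {m:.0f} GeV/c^2")   # near 80: a W candidate

An ordinary quark or gluon jet rarely splits into two hard subjets whose pair mass lands near the W mass, and that contrast is what gives the method its discriminating power.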

Two talks by ATLAS (Leandro Nisanti) and CMS (Matt Hearndon) followed, each with a long list of careful measurements of what the Standard Model is doing, mostly based so far only on the 2011 data set (and not yet including last year’s data). These measurements are crucially important for multiple reasons:

  • They provide important information which can serve as input to other measurements and searches.
  • They may reveal subtle problems with the Standard Model, due to indirect or small effects from unknown particles or forces.
  • Confirming that measurements of certain processes agree with theoretical predictions gives us confidence that those predictions can be used in other contexts, in particular in searches for unknown particles and forces.

Most, but not all, theoretical predictions for these careful measurements have worked well. Those that aren’t working so well are of course being watched and investigated carefully — but there aren’t any discrepancies large enough to get excited about yet (other than the top quark forward-backward asymmetry puzzle, which wasn’t discussed much today). In general, the Standard Model works beautifully — so far.

The day concluded with a panel discussion focused on these Standard Model measurements. Key questions discussed included: how do we use LHC data to understand the structure of the proton more precisely, and how in turn does that affect our searches for unknown phenomena? In particular, a major concern is the risk of circularity: that a phenomenon from an unknown type of particle could produce a subtle effect that we would fail to recognize for what it is, instead misinterpreting it as a small misunderstanding of proton structure, or as a small problem with a theoretical calculation. Such are the challenges of making increasingly precise measurements, and searching for increasingly rare phenomena, in the complicated environment of the LHC.

SEARCHing for New Particles on Long Island

Greetings from Stony Brook’s Simons Center, and the SEARCH 2013 workshop. (I reported on the SEARCH 2012 workshop here, here, here and here.) Over the next three days, a small group (about 50) of theoretical particle physicists and experimentalists from ATLAS and CMS (two of the experiments at the Large Hadron Collider [LHC]) will be discussing the latest results from the LHC, and brainstorming about what else should be done with the existing LHC data and with future data.

The workshop was organized by three theorists: Raman Sundrum, professor at Maryland (who opened the day with a characteristically brilliant and inspirational talk about the status of the field and the purpose of the workshop), Patrick Meade, professor at Stony Brook, and Michele Papucci, soon-to-be professor at Michigan.

Of course we’ll be discussing the newly discovered Higgs particle — that discussion will occupy most of today — but we’ll also be looking at many other types of particles, forces and other phenomena that nature might be hiding from us, and how we would be able to uncover them if they exist. There’ve been many dozens of searches done at both ATLAS and CMS, but the experimentalists certainly haven’t had time to try everything plausible — and theorists haven’t yet thought of everything they might try. Workshops like this are aimed at making sure no stones are left unturned in the existing huge pile of data from 2011-2012, and also that we’re fully prepared to deal with the new data, from higher-energy proton-proton collisions, that will start pouring in once the LHC resumes in 2015.

Some Weird Twists and Turns

In my last post, I promised you some comments on a couple of other news stories you may have seen. Promise kept! See below.

But before I go there, I should mention (after questions from readers) an important distinction. Wednesday’s post was about the simple process by which a Bs meson (a hadron containing a bottom quark and a strange anti-quark, or vice versa, along with the usual crowd of gluons and quark/antiquark pairs) decays to a muon and an anti-muon. The data currently shows nothing out of the ordinary there. This is not to be confused with another story, loosely related but with crucially different details. There are some apparent discrepancies (as much as 3.7 standard deviations, but only 2.8 after accounting for the look-elsewhere effect) cropping up in details of the intricate process by which a Bd meson (a hadron containing a bottom quark and a down antiquark, or vice versa, plus the usual crowd) decays to a muon, an anti-muon, and a spin-one kaon (a hadron containing a strange quark and a down anti-quark, or vice versa, plus the usual crowd). The measurements made by the LHCb experiment at the Large Hadron Collider disagree, in some but not all features, with the (technically difficult) predictions made using the Standard Model (the equations used to describe the known particles and forces).
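
If you want to translate those standard deviations into probabilities, here is the standard conversion, a quick sketch of my own. (The reduction from 3.7 to 2.8 standard deviations quoted above is the experiment’s look-elsewhere correction; this little formula doesn’t reproduce that, it just converts each number.)

  import math

  def two_sided_p_value(n_sigma):
      """Chance of a Gaussian fluctuation at least this large, in either direction."""
      return math.erfc(n_sigma / math.sqrt(2.0))

  for n in (2.8, 3.7, 5.0):
      print(f"{n} sigma: p = {two_sided_p_value(n):.1e}")

That works out to roughly p = 5e-3 for 2.8 sigma and 2e-4 for 3.7 sigma, to be compared with the p ~ 6e-7 of the 5-sigma standard usually demanded for a discovery claim in particle physics.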

Don’t confuse these two processes! (Top) The process B_s → muon + anti-muon, covered in Wednesday’s post, agrees with Standard Model predictions. (Bottom) The process B_d → muon + anti-muon + K* is claimed to deviate by nearly 3 standard deviations from the Standard Model, but (as far as I am aware) the prediction and associated claim have not yet been verified by multiple groups of people, nor has the measurement been repeated.

A few theorists have even gone so far as to claim this discrepancy is clearly a new phenomenon — the end of the Standard Model’s hegemony — and have gotten some press people to write (very poorly and inaccurately) about their claim. Well, aside from the fact that every year we see several 3 standard deviation discrepancies turn out to be nothing, let’s remember to be cautious when a few scientists try to convince journalists before they’ve convinced their colleagues… (remember this example that went nowhere? …) And in this case we have them serving as judge and jury as well as press office: these same theorists did the calculation which disagrees with the data. So maybe the Standard Model is wrong, or maybe their calculation is wrong. In any case, you certainly mustn’t believe the news article as currently written, because it has so many misleading statements and overstatements as to be completely beyond repair. [For one thing, it's a case study in how to misuse the word "prove".] I’ll try to get you the real story, but I have to study the data and the various Standard Model predictions more carefully first before I can do that with complete confidence.

Ok, back to the promised comments: on twists and turns for neutrinos and for muons…

A Couple of Rare Events

Did you know that another name for Minneapolis, Minnesota, is “Snowmass”? Just ask a large number of my colleagues, who are in the midst of a once-every-few-years exercise aimed at figuring out what should be the direction of the U.S. particle physics program. I quote:

  • The American Physical Society’s Division of Particles and Fields is pursuing a long-term planning exercise for the high-energy physics community. Its goal is to develop the community’s long-term physics aspirations. Its narrative will communicate the opportunities for discovery in high-energy physics to the broader scientific community and to the government.

They are doing so in perhaps the worst of times, when political attacks on science are growing, government cuts to science research are severe, budgets to fund the research programs of particle physicists like me have been chopped by jaw-dropping amounts (think 25% or worse, from last year’s budget to this year’s — you can thank the sequester)… and all this at a moment when the data from the Large Hadron Collider and other experiments are not yet able to point us in an obvious direction for our future research program. Intelligent particle physicists disagree on what to do next, there’s no easy way to come to consensus, and in any case Congress is likely to ignore anything we suggest. But at least I hear Minneapolis is lovely in July and August! This is the first Snowmass workshop that I have missed in a very long time, which is especially embarrassing since my Ph.D. thesis advisor is one of the conveners. What can I say? I wish my colleagues well…!

Meanwhile, I’d like to comment briefly on a few particle physics stories that you’ve perhaps seen in the press over recent days. I’ll cover one of them today — a measurement of a rare process which has now been officially “discovered”, though evidence for it was quite strong already last fall — and address a couple of others later in the week. After that I’ll tell you about a couple of other stories that haven’t made the popular press…