Tag Archives: LHCb

LHCb experiment finds another case of CP violation in nature

The LHCb experiment at the Large Hadron Collider is dedicated mainly to the study of mesons [objects made from a quark of one type, an anti-quark of another type, plus many other particles] that contain bottom quarks (hence the ‘b’ in the name).  But it also can be used to study many other things, including mesons containing charm quarks.

By examining large numbers of mesons that contain a charm quark and an up anti-quark (or a charm anti-quark and an up quark) and studying carefully how they decay, the LHCb experimenters have discovered a new example of violations of the transformations known as CP (C: exchange of particle with anti-particle; P: reflection of the world in a mirror), of the sort that have been previously seen in mesons containing strange quarks and mesons containing bottom quarks.  Here’s the press release.

Congratulations to LHCb!  This important addition to our basic knowledge is consistent with expectations; CP violation of roughly this size is predicted by the formulas that make up the Standard Model of Particle Physics.  However, our predictions are very rough in this context; it is sometimes difficult to make accurate calculations when the strong nuclear force, which holds mesons (as well as protons and neutrons) together, is involved.  So this is a real coup for LHCb, but not a game-changer for particle physics.  Perhaps, sometime in the future, theorists will learn how to make predictions as precise as LHCb’s measurement!

A Hidden Gem At An Old Experiment?

This summer there was a blog post claiming that “The LHC ‘nightmare scenario’ has come true” — implying that the Large Hadron Collider [LHC] has found nothing but a Standard Model Higgs particle (the simplest possible type), and will find nothing more of great importance. With all due respect for the considerable intelligence and technical ability of the author of that post, I could not disagree more; not only are we not in a nightmare, it isn’t even night-time yet, and hardly time for sleep or even daydreaming. There’s a tremendous amount of work to do, and there may be many hidden discoveries yet to be made, lurking in existing LHC data.  Or elsewhere.

I can defend this claim (and have done so as recently as this month; here are my slides). But there’s evidence from another quarter that it is far too early for such pessimism.  It has appeared in a new paper (a preprint, so not yet peer-reviewed) by an experimentalist named Arno Heister, who is evaluating 20-year old data from the experiment known as ALEPH.

In the early 1990s the Large Electron-Positron (LEP) collider at CERN, in the same tunnel that now houses the LHC, produced nearly 4 million Z particles at the center of ALEPH; the Z’s decayed immediately into other particles, and ALEPH was used to observe those decays.  Of course the data was studied in great detail, and you might think there couldn’t possibly be anything still left to find in that data, after over 20 years. But a hidden gem wouldn’t surprise those of us who have worked in this subject for a long time — especially those of us who have worked on hidden valleys. (Hidden Valleys are theories with a set of new forces and low-mass particles, which, because they aren’t affected by the known forces excepting gravity, interact very weakly with the known particles.  They are also often called “dark sectors” if they have something to do with dark matter.)

For some reason most experimenters in particle physics don’t tend to look for things just because they can; they stick to signals that theorists have already predicted. Since hidden valleys only hit the market in a 2006 paper I wrote with then-student Kathryn Zurek, long after the experimenters at ALEPH had moved on to other experiments, nobody went back to look in ALEPH or other LEP data for hidden valley phenomena (with one exception).  I didn’t expect anyone to ever do so; it’s a lot of work to dig up and recommission old computer files.

This wouldn’t have been a problem if the big LHC experiments (ATLAS, CMS and LHCb) had looked extensively for the sorts of particles expected in hidden valleys. ATLAS and CMS especially have many advantages; for instance, the LHC has made over a hundred times more Z particles than LEP ever did. But despite specific proposals for what to look for (and a decade of pleading), only a few limited searches have been carried out, mostly for very long-lived particles, for particles with mass of a few GeV/c² or less, and for particles produced in unexpected Higgs decays. And that means that, yes, hidden physics could certainly still be found in old ALEPH data, and in other old experiments. Kudos to Dr. Heister for taking a look.

Visiting the Host Lab of the Large Hadron Collider

Greetings from Geneva, and CERN, the laboratory that hosts the Large Hadron Collider [LHC], where the Higgs particle was found by the physicists at the ATLAS and CMS experiments. Between jet lag, preparing a talk for Wednesday, and talking to many experimental and theoretical particle physicists from morning til night, it will be a pretty exhausting week.

The initial purpose of this trip is to participate in a conference held by the LHCb experiment, entitled “Implications of LHCb measurements and future prospects.” Its goal is to bring theoretical particle physicists and LHCb experimenters together, to exchange information about what has been and what can be measured at LHCb.

On this website I’ve mostly written about ATLAS and CMS, partly because LHCb’s measurements are often quite subtle to explain, and partly because the Higgs particle search, the highlight of the early stage of the LHC, was really ATLAS’s and CMS’s task. But this week’s activities give me a nice opportunity to put the focus on this very interesting experiment, which is quite different from ATLAS and CMS both in its design and in its goals, and to explain its important role.

ATLAS and CMS were built as general purpose detectors, whose first goal was to find the Higgs particle and whose second was to find (potentially rare) signs of any other high-energy processes that are not predicted by the Standard Model, the equations we use to describe all the known particles and forces of nature. Crudely speaking, ATLAS and CMS are ideal for looking for new phenomena in the 100 to 5000 GeV energy range (though we won’t reach the upper end of the range until 2015 and beyond).

LHCb, by contrast, was built to study in great detail the bottom and charm quarks, and the hadrons (particles made from quarks, anti-quarks and gluons) that contain them. These quarks and their antiquarks are produced in enormous abundance at the LHC. They and the hadrons that contain them have masses in the 1.5 to 10 GeV/c² range… not much heavier than protons, and much lower than what ATLAS and CMS are geared to study. And this is why LHCb has been making crucial high-precision tests of the Standard Model using bottom- and charm-containing hadrons.  (Crucial, but not, despite repeated claims by the LHCb press office, capable of ruling out supersymmetry, which no single measurement can possibly do.)

Although this is the rough division of labor among these experiments, it’s too simplistic to describe the experiments this way. ATLAS and CMS can do quite a lot of physics at the low mass range, and in some measurements can compete well with LHCb.   Less well-known is that LHCb may be able to do a small but critical set of measurements involving higher energies than is their usual target.

LHCb is very different from ATLAS and CMS in many ways, and the most obvious is its shape. ATLAS and CMS look like giant barrels centered on the location of the proton-proton collisions, and are designed to measure as many particles as possible that are produced in the collision of two protons. LHCb’s shape is more like a wedge, with one end surrounding the collision point.

Left: Cut-away drawing of CMS, which is shaped like a barrel with proton-proton collisions occurring at its center. ATLAS’s shape is similar. Right: Cut-away drawing of LHCb, which is shaped something like a wedge, with collisions occurring at one end.

This shape allows it to measure only those particles that go in the “forward” direction — close to the direction of one of the proton beams. (“Backward” would be near the other beam; the distinction between forward and backward is arbitrary, because the two proton beams have the same properties. “Central” would be far from either beam.) Unlike ATLAS and CMS, LHCb is not used to reconstruct the whole collision; many of the particles produced in the collision go into backward or central regions which LHCb can’t observe.  This has some disadvantages, and in particular it put LHCb out of the running for the Higgs discovery. But a significant fraction of the bottom and charm quarks produced in proton-proton collisions go “forward” or “backward”, so a forward-looking design is fine if it’s bottom and charm quarks you’re interested in. And such a design is a lot cheaper, too. It also means that LHCb is well positioned to make some other measurements where the forward direction is important. I’ll give you one or two examples later in the week.
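
Physicists quantify “how forward” a particle is with a variable called pseudorapidity, which depends only on the particle’s angle relative to the beam.  Here is a small illustrative sketch (my own, not anything from LHCb’s software; the quoted coverage of roughly 2 to 5 in pseudorapidity is an approximate figure for LHCb):

```python
import math

def pseudorapidity(theta):
    """Pseudorapidity eta = -ln(tan(theta/2)), where theta is the
    polar angle (in radians) relative to the beam direction."""
    return -math.log(math.tan(theta / 2))

# A particle emitted at 90 degrees to the beam ("central") has eta = 0.
# LHCb's coverage is roughly 2 < eta < 5, i.e. angles from about
# 15 degrees down to about 1 degree from the beam.
print(pseudorapidity(math.pi / 2))        # 0.0 (central)
print(pseudorapidity(math.radians(15)))   # ~2.0 (edge of LHCb's coverage)
print(pseudorapidity(math.radians(1)))    # ~4.7 (very forward)
```

Large pseudorapidity means a particle skims along close to a beam; that narrow cone is exactly the region LHCb’s wedge-shaped detector surrounds.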

To make their measurements of bottom and charm quarks, LHCb makes use of the fact that these quarks decay after about a trillionth of a second (a picosecond) [or longer if, as is commonly the case, there is significant time dilation due to Einstein’s relativity effects on very fast particles].  This is long enough for them to travel a measurable distance — typically a millimeter or more. LHCb is designed to measure charged particles with terrific precision, allowing experimenters to infer a slight difference between the proton-proton collision point, from which most low-energy charged particles will emerge, and the location where some other charged particles may have been produced in the decay of a bottom hadron or some other particle that travels a millimeter or more before decaying. The ability to do precision “tracking” of the charged particles makes LHCb sensitive to the presence of any as-yet unknown particles that might be produced and then decay after traveling a small or moderate distance. More on that later in the week.
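
To see where the “millimeter or more” comes from, one can do the arithmetic explicitly.  Here’s a back-of-the-envelope sketch in Python (the lifetime is an approximate measured value; the boost factor is an assumed, purely illustrative number for a fast-moving B hadron at the LHC):

```python
# Back-of-the-envelope: how far does a B hadron travel before decaying?
tau = 1.5e-12              # approximate B-hadron lifetime, seconds (~1.5 ps)
c = 3.0e8                  # speed of light, meters per second
ctau = c * tau             # decay length with no time dilation
boost = 7.0                # assumed gamma*beta, for illustration only

print(ctau * 1e3)          # ~0.45 mm even before time dilation
print(boost * ctau * 1e3)  # ~3 mm for a fast-moving B hadron
```

A few tenths of a millimeter to a few millimeters: small, but comfortably within reach of a tracking detector with precision of hundredths of a millimeter.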

A computer reconstruction of the tracks in a proton-proton collision, as measured by LHCb. Most tracks start at the proton-proton collision point at left, but the two tracks drawn in purple emerge from a different point about 15 millimeters away, the apparent location of the decay of a hadron, whose inferred trajectory is the blue line, and whose mass (measured from the purple tracks) indicates that it contained a bottom quark.

One other thing to know about LHCb: in order to make their precise measurements possible, and to deal with the fact that they don’t observe a whole collision, they can’t afford to have too many collisions going on at once. ATLAS and CMS have been coping with ten to twenty simultaneous proton-proton collisions; this is part of what is known as “pile-up”. But near LHCb the LHC beams are adjusted so that the number of simultaneous collisions at LHCb is often limited to just one, two or three.  This has the downside that the amount of data LHCb collected in 2011 was about 1/5 of what ATLAS and CMS each collected, while for 2012 the number was more like 1/10.  But LHCb can do a number of things to make up for this lower rate; in particular their trigger system is more forgiving than that of ATLAS or CMS, so there are certain things they can measure using data of a sort that ATLAS and CMS have no choice but to throw away.

Did the LHC Just Rule Out String Theory?!

Over the weekend, someone said to me, breathlessly, that they’d read that “Results from the Large Hadron Collider [LHC] have blown string theory out of the water.”

Good Heavens! I replied. Who fed you that line of rubbish?!

Well, I’m not sure how this silliness got started, but it’s completely wrong. Just in case some of you or your friends have heard the same thing, let me explain why it’s wrong.

First, a distinction — one that is rarely made, especially by the more rabid bloggers, both those who are string lovers and those who are string haters. [Both types mystify me.] String theory has several applications, and you need to keep them straight. Let me mention two.

  1. Application number 1: this is the one you’ve heard about. String theory is a candidate (and only a candidate) for a “theory of everything” — a silly term, if you ask me, for what it really means is “a theory of all of nature’s particles, forces and space-time”. It’s not a theory of genetics or a theory of cooking or a theory of how to write a good blog post. But it’s still a pretty cool thing. This is the theory (i.e. a set of consistent equations and methods that describes relativistic quantum strings) that’s supposed to explain quantum gravity and all of particle physics, and if it succeeded, that would be fantastic.
  2. Application number 2: String theory can serve as a tool. You can use its mathematics, and/or the physical insights that you can gain by thinking about and calculating how strings behave, to solve or partially solve problems in other subjects. (Here’s an example.) These subjects include quantum field theory and advanced mathematics, and if you work in these areas, you may really not care much about application number 1. Even if application number 1 were ruled out by data, we’d still continue to use string theory as a tool. Consider this: if you grew up learning that a hammer was a religious idol to be worshipped, and later you decided you didn’t believe that anymore, would you throw out all your hammers? No. They’re still useful even if you don’t worship them.

BUT: today we are talking about Application Number 1: string theory as a candidate theory of all particles, etc.

Some Weird Twists and Turns

In my last post, I promised you some comments on a couple of other news stories you may have seen.  Promise kept! see below.

But before I go there, I should mention (after questions from readers) an important distinction.  Wednesday’s post was about the simple process by which a Bs meson (a hadron containing a bottom quark and a strange anti-quark, or vice versa, along with the usual crowd of gluons and quark/anti-quark pairs) decays to a muon and an anti-muon.  The data currently shows nothing out of the ordinary there.  This is not to be confused with another story, loosely related but with crucially different details. There are some apparent discrepancies (as much as 3.7 standard deviations, but only 2.8 after accounting for the look-elsewhere effect) cropping up in details of the intricate process by which a Bd meson (a hadron containing a bottom quark and a down anti-quark, or vice versa, plus the usual crowd) decays to a muon, an anti-muon, and a spin-one Kaon (a hadron containing a strange quark and a down anti-quark, or vice versa, plus the usual crowd). The measurements made by the LHCb experiment at the Large Hadron Collider disagree, in some but not all features, with the (technically difficult) predictions made using the Standard Model (the equations used to describe the known particles and forces).

Don’t confuse these two processes! (Top) The process B_s –> muon + anti-muon, covered in Wednesday’s post, agrees with Standard Model predictions. (Bottom) The process B_d –> muon + anti-muon + K* is claimed to deviate by nearly 3 standard deviations from the Standard Model, but (as far as I am aware) the prediction and associated claim has not yet been verified by multiple groups of people, nor has the measurement been repeated.

A few theorists have even gone so far as to claim this discrepancy is clearly a new phenomenon — the end of the Standard Model’s hegemony — and have gotten some press people to write (very poorly and inaccurately) about their claim.  Well, aside from the fact that every year we see several 3 standard deviation discrepancies turn out to be nothing, let’s remember to be cautious when a few scientists try to convince journalists before they’ve convinced their colleagues… (remember this example that went nowhere? …) And in this case we have them serving as judge and jury as well as press office: these same theorists did the calculation which disagrees with the data.  So maybe the Standard Model is wrong, or maybe their calculation is wrong.  In any case, you certainly mustn’t believe the news article as currently written, because it has so many misleading statements and overstatements as to be completely beyond repair. [For one thing, it’s a case study in how to misuse the word “prove”.] I’ll try to get you the real story, but I have to study the data and the various Standard Model predictions more carefully first before I can do that with complete confidence.
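
For readers curious what “3.7 standard deviations” means as a probability: for a Gaussian distribution, the chance of a fluctuation at least that large in the direction of the signal follows from the standard tail formula.  A quick sketch of the arithmetic (my own illustration, not the experiments’ full statistical treatment, which is considerably more involved):

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a significance of
    `sigma` standard deviations: the chance of a random fluctuation
    at least that large in one particular direction."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(one_sided_p(2.8))   # ~2.6e-3: not so rare, given how many measurements are made
print(one_sided_p(3.7))   # ~1.1e-4
print(one_sided_p(5.0))   # ~2.9e-7: the traditional "discovery" threshold
```

This is why a 2.8 sigma effect, after the look-elsewhere effect is accounted for, is interesting but far from convincing.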

Ok, back to the promised comments: on twists and turns for neutrinos and for muons…

A Couple of Rare Events

Did you know that another name for Minneapolis, Minnesota is “Snowmass”?  Just ask a large number of my colleagues, who are in the midst of a once-every-few-years exercise aimed at figuring out what should be the direction of the U.S. particle physics program.  I quote:

  • The American Physical Society’s Division of Particles and Fields is pursuing a long-term planning exercise for the high-energy physics community. Its goal is to develop the community’s long-term physics aspirations. Its narrative will communicate the opportunities for discovery in high-energy physics to the broader scientific community and to the government.

They are doing so in perhaps the worst of times, when political attacks on science are growing, government cuts to science research are severe, budgets to fund the research programs of particle physicists like me have been chopped by jaw-dropping amounts (think 25% or worse, from last year’s budget to this year’s — you can thank the sequester), and all this at a moment when the data from the Large Hadron Collider and other experiments are not yet able to point us in an obvious direction for our future research program.  Intelligent particle physicists disagree on what to do next, there’s no easy way to come to consensus, and in any case Congress is likely to ignore anything we suggest.  But at least I hear Minneapolis is lovely in July and August!  This is the first Snowmass workshop that I have missed in a very long time, which is especially embarrassing since my Ph.D. thesis advisor is one of the conveners.  What can I say?  I wish my colleagues well…!

Meanwhile, I’d like to comment briefly on a few particle physics stories that you’ve perhaps seen in the press over recent days. I’ll cover one of them today — a measurement of a rare process which has now been officially “discovered”, though evidence for it was quite strong already last fall — and address a couple of others later in the week.  After that I’ll tell you about a couple of other stories that haven’t made the popular press…

Higgs Workshop in Princeton

Today I’m attending the first day of a short workshop of particle theorists and experimentalists at the Princeton Center for Theoretical Science, a sort of “Where are we now and where are we going?” meeting. It’s entitled “Higgs Physics After Discovery”, but discussion will surely range more widely.

What, indeed, are the big questions facing particle physics in the short-term, meaning the next few months? Well, here are a few key ones:

  • A Higgs particle of some type has been discovered by the ATLAS and CMS experiments at the Large Hadron Collider [LHC] (with some contributions from the Tevatron experiments DZero and CDF); is it the simplest possible type of Higgs particle (the “Standard Model Higgs”) or is it more complex? What data analysis can be done on the LHC’s data from 2011-2012 to shed more light on this question?
  • More generally, from the LHC’s huge data set from 2011-2012 — specifically, from the data analysis that has been done so far — what precisely have we learned? (It’s increasingly important to go beyond the rougher estimates that were appropriate last year when the data was still pouring in.) What types of new phenomena have been excluded, and to what extent?
  • What other types of data analysis should be done on the 2011-2012 data, in order to look for other new phenomena that could still be lurking there? (There’s still a lot to be done on this question!) And what types of work should theoretical particle physicists do to help the experimentalists address this issue?
  • Several experiments from the Tevatron and the LHC, notably the LHCb experiment, have learned that newly measured decays of certain mesons (hadrons with equal numbers of quarks and anti-quarks) that contain heavy quarks are roughly consistent with the Standard Model (the equations we use to describe the known elementary particles and forces, and the simplest type of Higgs field and Higgs particle.) How do these findings constrain the possibility of other new phenomena?
  • Looking ahead to 2015, when the LHC will begin running again at a higher energy per proton-proton collision, what preparations need to be made? Especially, what needs to be done to refine the triggering systems at ATLAS, CMS and LHCb, so that the maximum information can be extracted from the new data, and no important information is unnecessarily discarded?
  • Which, if any, of the multiple (but mostly mutually inconsistent) experimental hints of dark matter should be taken seriously? Which possibilities do the various dark matter experiments, and the LHC’s data, actually exclude or favor?

That might be it for the very near term. There are lots of other questions in the medium- to long-term, among which is the big question of what types of experiments should be done over the next 10 – 20 years. One challenge is that the LHC’s data hasn’t yet given us a clear target other than the Higgs particle itself. An obvious possible experiment to do is to study the Higgs in more detail, using an electron/anti-electron collider — historically this has been a successful strategy that has been used on almost every new apparently-elementary particle. But there are a lot of other possibilities, including raising the LHC’s collisions to even higher energy than we’ll see in 2015, using more powerful magnets currently under development.

If there are other near-term questions I’ve forgotten about, I’m sure I’ll be reminded at the workshop, and I’ll add them in.

Conclusion of the Higgs Symposium

By almost all measures, the Higgs Symposium at the University of Edinburgh, as part of the new Higgs Centre for Theoretical Physics, was a great success.  The only negative was that Professor Peter Higgs himself had a bad cold this week, and had to cancel his talk, as well as missing the majority of the talks by others.  Obviously all of us in attendance were very disappointed not to hear directly from him, and we wish him a speedy recovery.

Other than this big hole in the schedule, the talks given at the symposium seemed to me to form a coherent summary of where we are right now in our understanding of the Higgs field and particle.  They were full of interesting material, and wonderfully complementary to one another.  This motivates me to try to provide, for non-experts, some future articles on what the conference attendees had to say.  But to write such articles well takes time.  So for now, here’s the quick version summarizing the last few talks, along the lines of the summaries I wrote (here and here) of the earlier talks.  The slides from all the talks are posted here.