Category Archives: Other Collider News

An Interesting Result from CMS, and its Implications

UPDATE 10/26: In the original version of this post, I stupidly forgot to include an effect, causing an error of a factor of about 5 in one of my estimates below. I had originally suggested that a recent result using ALEPH data was probably more powerful than a recent CMS result.  But once the error is corrected, the two experiments appear to have comparable sensitivity. However, I was very conservative in my analysis of ALEPH, and my guess concerning CMS has a big uncertainty band — so it might go either way.  It’s up to ALEPH experts and CMS experts to show us who really wins the day.  Added reasoning and discussion marked in green below.

In Friday’s post, I highlighted the importance of looking for low-mass particles whose interactions with known particles are very weak. I referred to a recent preprint in which an experimental physicist, Dr. Arno Heister, reanalyzed ALEPH data in such a search.

A few hours later, Harvard Professor Matt Reece pointed me to a paper that appeared just two weeks ago: a very interesting CMS analysis of 2011-2012 data that did a search of this type — although it appears that CMS [one of the two general purpose detectors at the Large Hadron Collider (LHC)] didn’t think of it that way.

The title of the paper is obscure:  “Search for a light pseudo–scalar Higgs boson produced in association with bottom quarks in pp collisions at 8 TeV“.  Such spin-zero “pseudo-scalar” particles, which often arise in speculative models with more than one Higgs particle, usually decay to bottom quark/anti-quark pairs or tau/anti-tau pairs.  But they can have a very rare decay to muon/anti-muon, which is much easier to measure. The title of the paper gives no indication that the muon/anti-muon channel is the target of the search; you have to read the abstract. Shouldn’t the words “in the dimuon channel” or “dimuon resonance” appear in the title?  That would help researchers who are interested in dimuons, but not in pseudo-scalars, find the paper.

Here’s the main result of the paper:

At left is shown a plot of the number of events as a function of the invariant mass of the muon/anti-muon pairs.  CMS data is in black dots; estimated background is shown in the upper curve (with top quark backgrounds in the lower curve); and the peak at bottom shows what a simulated particle decaying to muon/anti-muon with a mass of 30 GeV/c² would look like. (Imagine sticking the peak on top of the upper curve to see how a signal would affect the data points).  At right are the resulting limits on the rate for such a resonance to be produced and then decay to muon/anti-muon, if it is radiated off of a bottom quark. [A limit of 100 femtobarns means that at most two thousand collisions of this type could have occurred during the year 2012.  But note that only about 1 in 100 of these collisions would have been observed, due to the difficulty of triggering on these collisions and some other challenges.]
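(For those who want to check the arithmetic in the brackets, here is a minimal back-of-the-envelope sketch in Python. The dataset size of roughly 20 inverse femtobarns and the 1-in-100 efficiency are my own round numbers for illustration, not official CMS figures.)

    # Back-of-the-envelope check of the bracketed statement above (my rough numbers, not CMS's).
    # A cross-section limit times the integrated luminosity gives the maximum number of
    # collisions of that type; multiplying by an efficiency gives the events actually observed.

    sigma_limit_fb = 100.0     # example limit from the right-hand plot, in femtobarns
    luminosity_fb_inv = 20.0   # rough size of the 2012 CMS dataset at 8 TeV, in inverse femtobarns
    efficiency = 0.01          # rough guess: ~1 in 100 such collisions is triggered and selected

    produced = sigma_limit_fb * luminosity_fb_inv   # maximum collisions produced in 2012
    observed = produced * efficiency                # of which roughly this many end up in the plot

    print(f"produced <= {produced:.0f} collisions, of which roughly {observed:.0f} observed")
    # produced <= 2000 collisions, of which roughly 20 observed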

[Note also the restriction of the mass of the dimuon pair to the range 25 GeV to 60 GeV. This may have been done purely for technical reasons, but if it was due to theoretical assumptions, that restriction should be lifted.]

While this plot places moderate limits on spin-zero particles produced with a bottom quark, it’s equally interesting, at least to me, in other contexts. Specifically, it puts limits on any light spin-one particle (call it V) that mixes (either via kinetic or mass mixing) with the photon and Z and often comes along with at least one bottom quark… because for such particles the rate to decay to muons is not rare.  This is very interesting for hidden valley models specifically; as I mentioned on Friday, new spin-one and spin-zero particles often are produced together, giving a muon/anti-muon pair along with one or more bottom quark/anti-quark pairs.

But CMS interpreted its measurement only in terms of radiation of a new particle off a bottom quark.  Now, what if a V particle decaying sometimes to muon/anti-muon were produced in a Z particle decay (a possibility alluded to already in 2006)?  For a different production process, the angles and energies of the particles would be different, and since many events would be lost (due to triggering, transverse momentum cuts, and b-tagging inefficiencies at low transverse momentum) the limits would have to be fully recalculated by the experimenters.  It would be great if CMS could add such an analysis before they publish this paper.

Still, we can make a rough back-of-the-envelope estimate, with big caveats. The LHC produced about 600 million Z particles at CMS in 2012. The plot at right tells us that if the V were radiated off a bottom quark, the maximum number of produced V’s decaying to muons would be about 2000 to 8000, depending on the V mass.  Now if we could take those numbers directly, we’d conclude that the fraction of Z’s that could decay to muon/anti-muon plus bottom quarks in this way would be 3 to 12 per million. But the sensitivity of this search to a Z decay to V is probably much less than for a V radiated off bottom quarks [because (depending on the V mass) either the bottom quarks in the Z decay would be less energetic and more difficult to tag, or the muons would be less energetic on average, or both.] So I’m guessing that the limits on Z decays to V are always worse than one per hundred thousand, for any V mass.  (Thanks to Wei Xue for catching an error as I was finalizing my estimate.)
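(Here is the same rough estimate as a few lines of Python, using only the numbers quoted above; the small difference from the “3 to 12 per million” quoted there is just rounding, and the “penalty” factor at the end is a placeholder guess standing in for the real recalculation that only the experimenters could do.)

    # Rough numerical version of the estimate above, using only the numbers quoted in the text.
    n_z = 600e6                    # Z particles produced at CMS in 2012
    max_v_produced = (2000, 8000)  # maximum produced V -> muon/anti-muon, from the limit plot

    # Naive limit on the fraction of Z's decaying this way, if the CMS limit applied directly:
    naive = [n / n_z for n in max_v_produced]
    print([f"{x*1e6:.0f} per million" for x in naive])           # roughly 3 to 13 per million

    # Z -> V events would have softer muons and softer bottom quarks, so the search is
    # presumably several times less sensitive to them; the factor below is a placeholder guess.
    penalty = 3
    print([f"{x*penalty*1e6:.0f} per million" for x in naive])   # of order 10 to 40 per million,
    # i.e. weaker than one per hundred thousand, as guessed above.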

If that guess/estimate is correct, then the CMS search does not rule out the possibility of a hundred or so Z decays to V particles at each of the various LEP experiments.  That said, old LEP searches might rule this possibility out; if anyone knows of such a search, please comment or contact me.

As for whether Heister’s analysis of the ALEPH experiment’s data shows signs of such a signal, I think it unlikely (though some people seemed to read my post as saying the opposite.)  As I pointed out in Friday’s post, not only is the excess too small for excitement on its own, it also is somewhat too wide and its angular correlations look like the background (which comes, of course, from bottom quarks that decay to charm quarks plus a muon and neutrino.)  The point of Friday’s post, and of today’s, is that we should be looking.

In fact, because of Heister’s work (which, by the way, is his own, not endorsed by the ALEPH collaboration), we can draw interesting if rough conclusions.  Ignore for now the bump at 30 GeV/c²; that’s more controversial.  What about the absence of a bump between 35 and 50 GeV/c²? Unless there are subtleties with his analysis that I don’t understand, we learn that at ALEPH there were fewer than ten Z decays to a V particle (plus a source of bottom quarks) for V in this mass range.  That limits such Z decays to about 2 to 3 per million.  OOPS: Dumb mistake!! At this step, I forgot to include the fact that requiring bottom quarks in the ALEPH events only works about 20% of the time (thanks to Imperial College Professor Oliver Buchmuller for questioning my reasoning!) The real number is therefore about 5 times larger, more like 10 to 15 per million. If that rough estimate is correct, it would provide a constraint roughly comparable to the current CMS analysis.
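(The corrected arithmetic, in the same back-of-the-envelope spirit; the numbers are the ones quoted above, with the 20% b-tagging efficiency being the rough figure just discussed.)

    # Sketch of the corrected ALEPH estimate above, using the numbers quoted in the text.
    n_z_aleph = 4e6          # Z particles recorded by ALEPH (see the post below)
    max_signal_events = 10   # largest excess allowed for a V with mass between 35 and 50 GeV/c^2
    b_tag_efficiency = 0.20  # rough fraction of events in which the bottom quarks are tagged

    naive_limit = max_signal_events / n_z_aleph        # ~2.5 per million (the original estimate)
    corrected_limit = naive_limit / b_tag_efficiency   # ~12.5 per million after the correction

    print(f"naive: {naive_limit*1e6:.1f} per million; corrected: {corrected_limit*1e6:.1f} per million")
    # naive: 2.5 per million; corrected: 12.5 per million ("more like 10 to 15 per million")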

[[BUT: In my original argument I was very conservative.  When I said “fewer than 10”, I was trying to be brief; really, looking at the invariant mass plot, the allowed number of excess events for a V with mass above 36 GeV is typically fewer than 7 or even 5.  And that doesn’t include any angular information, which for many signals would reduce the number to 3.   Including these effects properly brings the ALEPH bound back down to something close to my initial estimate.  Anyway, it’s clear that CMS is nipping at ALEPH’s heels, but I’m still betting they haven’t passed ALEPH — yet.]]

So my advice would be to set Heister’s bump aside and instead focus on the constraints that one can obtain, and the potential discoveries that one could make, with this type of analysis, either at LEP or at LHC. That’s where I think the real lesson lies.

A Hidden Gem At An Old Experiment?

This summer there was a blog post claiming that “The LHC `nightmare scenario’ has come true” — implying that the Large Hadron Collider [LHC] has found nothing but a Standard Model Higgs particle (the simplest possible type), and will find nothing more of great importance. With all due respect for the considerable intelligence and technical ability of the author of that post, I could not disagree more; not only are we not in a nightmare, it isn’t even night-time yet, and hardly time for sleep or even daydreaming. There’s a tremendous amount of work to do, and there may be many hidden discoveries yet to be made, lurking in existing LHC data.  Or elsewhere.

I can defend this claim (and have done so as recently as this month; here are my slides). But there’s evidence from another quarter that it is far too early for such pessimism.  It has appeared in a new paper (a preprint, so not yet peer-reviewed) by an experimentalist named Arno Heister, who is evaluating 20-year-old data from the experiment known as ALEPH.

In the early 1990s the Large Electron-Positron (LEP) collider at CERN, in the same tunnel that now houses the LHC, produced nearly 4 million Z particles at the center of ALEPH; the Z’s decayed immediately into other particles, and ALEPH was used to observe those decays.  Of course the data was studied in great detail, and you might think there couldn’t possibly be anything still left to find in that data, after over 20 years. But a hidden gem wouldn’t surprise those of us who have worked in this subject for a long time — especially those of us who have worked on hidden valleys. (Hidden Valleys are theories with a set of new forces and low-mass particles, which, because they aren’t affected by the known forces excepting gravity, interact very weakly with the known particles.  They are also often called “dark sectors” if they have something to do with dark matter.)

For some reason most experimenters in particle physics don’t tend to look for things just because they can; they stick to signals that theorists have already predicted. Since hidden valleys only hit the market in a 2006 paper I wrote with then-student Kathryn Zurek, long after the experimenters at ALEPH had moved on to other experiments, nobody went back to look in ALEPH or other LEP data for hidden valley phenomena (with one exception.) I didn’t expect anyone to ever do so; it’s a lot of work to dig up and recommission old computer files.

This wouldn’t have been a problem if the big LHC experiments (ATLAS, CMS and LHCb) had looked extensively for the sorts of particles expected in hidden valleys. ATLAS and CMS especially have many advantages; for instance, the LHC has made over a hundred times more Z particles than LEP ever did. But despite specific proposals for what to look for (and a decade of pleading), only a few limited searches have been carried out, mostly for very long-lived particles, for particles with mass of a few GeV/c² or less, and for particles produced in unexpected Higgs decays. And that means that, yes, hidden physics could certainly still be found in old ALEPH data, and in other old experiments. Kudos to Dr. Heister for taking a look.

What if the Large Hadron Collider Finds Nothing Else?

In my last post, I expressed the view that a particle accelerator with proton-proton collisions of (roughly) 100 TeV of energy, significantly more powerful than the currently operational Large Hadron Collider [LHC] that helped scientists discover the Higgs particle, is an obvious and important next step in our process of learning about the elementary workings of nature. And I described how we don’t yet know whether it will be an exploratory machine or a machine with a clear scientific target; it will depend on what the LHC does or does not discover over the coming few years.

What will it mean, for the 100 TeV collider project and more generally, if the LHC, having made possible the discovery of the Higgs particle, provides us with no more clues?  Specifically, over the next few years, hundreds of tests of the Standard Model (the equations that govern the known particles and forces) will be carried out in measurements made by the ATLAS, CMS and LHCb experiments at the LHC. Suppose that, as it has so far, the Standard Model passes every test that the experiments carry out? In particular, suppose the Higgs particle discovered in 2012 appears, after a few more years of intensive study, to be, as far as the LHC can reveal, a Standard Model Higgs — the simplest possible type of Higgs particle?

Before we go any further, let’s keep in mind that we already know that the Standard Model isn’t all there is to nature. The Standard Model does not provide a consistent theory of gravity, nor does it explain neutrino masses, dark matter or “dark energy” (also known as the cosmological constant). Moreover, many of its features are just things we have to accept without explanation, such as the strengths of the forces, the existence of “three generations” (i.e., that there are two heavier cousins of the electron, two for the up quark and two for the down quark), the values of the masses of the various particles, etc. However, even though the Standard Model has its limitations, it is possible that everything that can actually be measured at the LHC — which cannot measure neutrino masses or directly observe dark matter or dark energy — will be well-described by the Standard Model. What if this is the case?

Michelson and Morley, and What They Discovered

In science, giving strong evidence that something isn’t there can be as important as discovering something that is there — and it’s often harder to do, because you have to thoroughly exclude all possibilities. [It’s very hard to show that your lost keys are nowhere in the house — you have to convince yourself that you looked everywhere.] A famous example is the case of Albert Michelson, in his two experiments (one in 1881, a second with Edward Morley in 1887) trying to detect the “ether wind”.

Light had been shown to be a wave in the 1800s; and like all waves known at the time, it was assumed to be a wave in something material, just as sound waves are waves in air, and ocean waves are waves in water. This material was termed the “luminiferous ether”. As we can detect our motion through air or through water in various ways, it seemed that it should be possible to detect our motion through the ether, specifically by looking for the possibility that light traveling in different directions travels at slightly different speeds.  This is what Michelson and Morley were trying to do: detect the movement of the Earth through the luminiferous ether.

Both of Michelson’s measurements failed to detect any ether wind, and did so expertly and convincingly. And for the convincing method that he invented — an experimental device called an interferometer, which had many other uses too — Michelson won the Nobel Prize in 1907. Meanwhile the failure to detect the ether drove both FitzGerald and Lorentz to consider radical new ideas about how matter might be deformed as it moves through the ether. Although these ideas weren’t right, they were important steps that Einstein was able to re-purpose, even more radically, in his 1905 equations of special relativity.

In Michelson’s case, the failure to discover the ether was itself a discovery, recognized only in retrospect: a discovery that the ether did not exist. (Or, if you’d like to say that it does exist, which some people do, then what was discovered is that the ether is utterly unlike any normal material substance in which waves are observed; no matter how fast or in what direction you are moving relative to me, both of us are at rest relative to the ether.) So one must not be too quick to assume that a lack of discovery is actually a step backwards; it may actually be a huge step forward.

Epicycles or a Revolution?

There were various attempts to make sense of Michelson and Morley’s experiment.   Some interpretations involved  tweaks of the notion of the ether.  Tweaks of this type, in which some original idea (here, the ether) is retained, but adjusted somehow to explain the data, are often referred to as “epicycles” by scientists.   (This is analogous to the way an epicycle was used by Ptolemy to explain the complex motions of the planets in the sky, in order to retain an earth-centered universe; the sun-centered solar system requires no such epicycles.) A tweak of this sort could have been the right direction to explain Michelson and Morley’s data, but as it turned out, it was not. Instead, the non-detection of the ether wind required something more dramatic — for it turned out that waves of light, though at first glance very similar to other types of waves, were in fact extraordinarily different. There simply was no ether wind for Michelson and Morley to detect.

If the LHC discovers nothing beyond the Standard Model, we will face what I see as a similar mystery.  As I explained here, the Standard Model, with no other particles added to it, is a consistent but extraordinarily “unnatural” (i.e. extremely non-generic) example of a quantum field theory.  This is a big deal. Just as nineteenth-century physicists deeply understood both the theory of waves and many specific examples of waves in nature  and had excellent reasons to expect a detectable ether, twenty-first century physicists understand quantum field theory and naturalness both from the theoretical point of view and from many examples in nature, and have very good reasons to expect particle physics to be described by a natural theory.  (Our examples come both from condensed matter physics [e.g. metals, magnets, fluids, etc.] and from particle physics [e.g. the physics of hadrons].) Extremely unnatural systems — that is, physical systems described by quantum field theories that are highly non-generic — simply have not previously turned up in nature… which is just as we would expect from our theoretical understanding.

[Experts: As I emphasized in my Santa Barbara talk last week, appealing to anthropic arguments about the hierarchy between gravity and the other forces does not allow you to escape from the naturalness problem.]

So what might it mean if an unnatural quantum field theory describes all of the measurements at the LHC? It may mean that our understanding of particle physics requires an epicyclic change — a tweak.  The implications of a tweak would potentially be minor. A tweak might only require us to keep doing what we’re doing, exploring in the same direction but a little further, working a little harder — i.e. to keep colliding protons together, but go up in collision energy a bit more, from the LHC to the 100 TeV collider. For instance, perhaps the Standard Model is supplemented by additional particles that, rather than having masses that put them within reach of the LHC, as would inevitably be the case in a natural extension of the Standard Model (here’s an example), are just a little bit heavier than expected. In this case the world would be somewhat unnatural, but not too much, perhaps through some relatively minor accident of nature; and a 100 TeV collider would have enough energy per collision to discover and reveal the nature of these particles.

Or perhaps a tweak is entirely the wrong idea, and instead our understanding is fundamentally amiss. Perhaps another Einstein will be needed to radically reshape the way we think about what we know.  A dramatic rethink is both more exciting and more disturbing. It was an intellectual challenge for 19th century physicists to imagine, from the result of the Michelson-Morley experiment, that key clues to its explanation would be found in seeking violations of Newton’s equations for how energy and momentum depend on velocity. (The first experiments on this issue were carried out in 1901, but definitive experiments took another 15 years.) It was an even greater challenge to envision that the already-known unexplained shift in the orbit of Mercury would also be related to the Michelson-Morley (non)-discovery, as Einstein, in trying to adjust Newton’s gravity to make it consistent with the theory of special relativity, showed in 1915.

My point is that the experiments that were needed to properly interpret Michelson-Morley’s result

  • did not involve trying to detect motion through the ether,
  • did not involve building even more powerful and accurate interferometers,
  • and were not immediately obvious to the practitioners in 1888.

This should give us pause. We might, if we continue as we are, be heading in the wrong direction.

Difficult as it is to do, we have to take seriously the possibility that if (and remember this is still a very big “if”) the LHC finds only what is predicted by the Standard Model, the reason may involve a significant reorganization of our knowledge, perhaps even as great as relativity’s re-making of our concepts of space and time. Were that the case, it is possible that higher-energy colliders would tell us nothing, and give us no clues at all. An exploratory 100 TeV collider is not guaranteed to reveal secrets of nature, any more than a better version of Michelson-Morley’s interferometer would have been guaranteed to do so. It may be that a completely different direction of exploration, including directions that currently would seem silly or pointless, will be necessary.

This is not to say that a 100 TeV collider isn’t needed!  It might be that all we need is a tweak of our current understanding, and then such a machine is exactly what we need, and will be the only way to resolve the current mysteries.  Or it might be that the 100 TeV machine is just what we need to learn something revolutionary.  But we also need to be looking for other lines of investigation, perhaps ones that today would sound unrelated to particle physics, or even unrelated to any known fundamental question about nature.

Let me provide one example from recent history — one which did not lead to a discovery, but still illustrates that this is not all about 19th century history.

An Example

One of the great contributions to science of Nima Arkani-Hamed, Savas Dimopoulos and Gia Dvali was to observe (in a 1998 paper I’ll refer to as ADD, after the authors’ initials) that no one had ever excluded the possibility that we, and all the particles from which we’re made, can move around freely in three spatial dimensions, but are stuck (as it were) as though to the corner edge of a thin rod — a rod as much as one millimeter wide, into which only gravitational fields (but not, for example, electric fields or magnetic fields) may penetrate.  Moreover, they emphasized that the presence of these extra dimensions might explain why gravity is so much weaker than the other known forces.


Fig. 1: ADD’s paper pointed out that no experiment as of 1998 could yet rule out the possibility that our familiar three-dimensional world is a corner of a five-dimensional world, where the two extra dimensions are finite but perhaps as large as a millimeter.

Given the incredible number of experiments over the past two centuries that have probed distances vastly smaller than a millimeter, the claim that there could exist millimeter-sized unknown dimensions was amazing, and came as a tremendous shock — certainly to me. At first, I simply didn’t believe that the ADD paper could be right.  But it was.

One of the most important immediate effects of the ADD paper was to generate a strong motivation for a new class of experiments that could be done, rather inexpensively, on the top of a table. If the world were as they imagined it might be, then Newton’s (and Einstein’s) law for gravity, which states that the force between two stationary objects depends on the distance r between them as 1/r², would increase faster than this at distances shorter than the width of the rod in Figure 1.  This is illustrated in Figure 2.


Fig. 2: If the world were as sketched in Figure 1, then Newton/Einstein’s law of gravity would be violated at distances shorter than the width of the rod in Figure 1. The blue line shows Newton/Einstein’s prediction; the red line shows what a universe like that in Figure 1 would predict instead. Experiments done in the last few years agree with the blue curve down to a small fraction of a millimeter.
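(To make Figure 2 concrete, here is a minimal sketch, in Python, of the two curves being compared. It is a crude caricature in which the two laws are simply matched at the size R of the extra dimensions; it is not the ADD paper’s actual calculation, and the millimeter value of R is just the upper bound they quoted.)

    # Illustrative comparison of the two force laws sketched in Figure 2 (not ADD's calculation).
    # With two extra dimensions of size R, gravity follows Newton's 1/r^2 law for r much larger
    # than R, but strengthens to roughly 1/r^4 for r much smaller than R.

    R_mm = 1.0   # assumed size of the extra dimensions, in millimeters (the ADD upper bound)

    def newton_force(r_mm):
        """Newton/Einstein prediction (blue curve): force proportional to 1/r^2."""
        return 1.0 / r_mm**2

    def add_force(r_mm, n_extra=2):
        """Crude two-regime caricature of the ADD prediction (red curve)."""
        if r_mm > R_mm:
            return 1.0 / r_mm**2                                     # ordinary law at long distances
        return (1.0 / R_mm**2) * (R_mm / r_mm)**(2 + n_extra)        # steeper growth below R

    for r in (10.0, 1.0, 0.1, 0.01):
        print(f"r = {r:5.2f} mm  Newton: {newton_force(r):12.1f}   ADD-like: {add_force(r):12.1f}")
    # At r = 0.01 mm the ADD-like force is 10,000 times larger than the Newtonian one,
    # which is the kind of deviation the tabletop experiments described below were seeking.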

These experiments are not easy — gravity is very, very weak compared to electrical forces, and lots of electrical effects can show up at very short distances and have to be cleverly avoided. But some of the best experimentalists in the world figured out how to do it (see here and here). After the experiments were done, Newton/Einstein’s law was verified down to a few hundredths of a millimeter.  If we live on the corner of a rod, as in Figure 1, it’s much, much smaller than a millimeter in width.

But it could have been true. And if it had, it might not have been discovered by a huge particle accelerator. It might have been discovered in these small inexpensive experiments that could have been performed years earlier. The experiments weren’t carried out earlier mainly because no one had pointed out quite how important they could be.

Ok Fine; What Other Experiments Should We Do?

So what are the non-obvious experiments we should be doing now or in the near future?  Well, if I had a really good suggestion for a new class of experiments, I would tell you — or rather, I would write about it in a scientific paper. (Actually, I do know of an important class of measurements, and I have written a scientific paper about them; but these are measurements to be done at the LHC, and don’t involve an entirely new experiment.)  Although I’m thinking about these things, I do not yet have any good ideas.  Until I do, or someone else does, this is all just talk — and talk does not impress physicists.

Indeed, you might object that my remarks in this post have been almost without content, and possibly without merit.  I agree with that objection.

Still, I have some reasons for making these points. In part, I want to highlight, for a wide audience, the possible historic importance of what might now be happening in particle physics. And I especially want to draw the attention of young people. There have been experts in my field who have written that non-discoveries at the LHC constitute a “nightmare scenario” for particle physics… that there might be nothing for particle physicists to do for a long time. But I want to point out that on the contrary, not only may it not be a nightmare, it might actually represent an extraordinary opportunity. Not discovering the ether opened people’s minds, and eventually opened the door for Einstein to walk through. And if the LHC shows us that particle physics is not described by a natural quantum field theory, it may, similarly, open the door for a young person to show us that our understanding of quantum field theory and naturalness, while as intelligent and sensible and precise as the 19th century understanding of waves, does not apply unaltered to particle physics, and must be significantly revised.

Of course the LHC is still a young machine, and it may still permit additional major discoveries, rendering everything I’ve said here moot. But young people entering the field, or soon to enter it, should not assume that the experts necessarily understand where the field’s future lies. Like FitzGerald and Lorentz, even the most brilliant and creative among us might be suffering from our own hard-won and well-established assumptions, and we might soon need the vision of a brilliant young genius — perhaps a theorist with a clever set of equations, or perhaps an experimentalist with a clever new question and a clever measurement to answer it — to set us straight, and put us onto the right path.

A 100 TeV Proton-Proton Collider?

During the gap between the first run of the Large Hadron Collider [LHC], which ended in 2012 and included the discovery of the Higgs particle (and the exclusion of quite a few other things), and its second run, which starts a year from now, there’s been a lot of talk about the future direction for particle physics. By far the most prominent option, both in China and in Europe, involves the long-term possibility of a (roughly) 100 TeV proton-proton collider — that is, a particle accelerator like the LHC, but with 5 to 15 times more energy per collision.

Do we need such a machine?

It’s (not) The End of the World

The December solstice has come and gone at 11:11 a.m. London time (6:11 a.m New York time). That’s the moment when the north pole of the Earth points most away from the sun, and the south pole points most toward it. Because it’s followed by a weekend and then Christmas Eve, it marks the end of the 2012 blogging season, barring a major event between now and year’s end. But although 11:11 London time is the only moment of astronomical significance during this day (clearly the universe does not care where humans set our international date line and exactly how we set our time zones, so destruction was never going to be at local midnight — something the media doesn’t seem to get) it obviously wasn’t the end of the world.

A lot of people do put a lot of stock in prophecy, including prophecies of the end of the world that nobody ever made (such as the one not made for today by the Mayans, through their calendar) and others that people made but were wrong (such as those made by Harold Camping last year and by many throughout history who preceded him.) If anyone were any good at prophecy they’d be able to use their special knowledge to become billionaires, so maybe we should be watching Bill Gates and Michael Bloomberg and the Koch brothers and people like that. I haven’t heard any rumors of them building bunkers or spaceships yet. Of course at the end of the year they may get a small tax hike, but that wouldn’t be the end of the world.

The Large Hadron Collider [LHC], meanwhile, has triumphantly reached the end of its first run of proton-proton collisions. Goal #1 of the LHC was to allow physicists at the ATLAS and CMS experiments to discover the Higgs particle, or particles, or whatever took their place in nature; and it would appear that, in a smashing success, they have co-discovered one.  But no Higgs particles, or anything like them, will be produced again until 2015. Although the LHC will run for a short while in early 2013, it will do so in a different mode, smashing not protons but the nuclei of lead atoms together, in order to study the properties of extremely hot and dense matter, under conditions the universe hasn’t seen since the earliest stages of the Big Bang that launched the current era of our universe.  Then it will be closed down for repairs and upgrades.  So until 2015, any additional information we’re going to learn about the Higgs particle, or any other unknown particle that might have been produced at the LHC, is going to be obtained by analyzing the data that has been collected in 2011 and 2012. The total amount of data is huge; what was collected in 2012 was about 4.5 times as much as in 2011, and it was taken at 8 TeV of energy per proton-proton collision rather than 7 TeV as in 2011. I can assure you there will be many new things learned from analyzing that data throughout 2013 and 2014.

Of course a lot of people prophesied confidently that we’d discover supersymmetry, or something else dramatic, very early on at the LHC. Boy, were they wrong! Those of us who were cautioning against such optimistic statements are not sure whether to laugh or cry, because of course it would have been great to have such a discovery early in the LHC program. But there was ample reason to believe (despite what other bloggers sometimes say) that even if supersymmetry exists and is accessible to the LHC experiments, discovering it could take a lot longer than just two years!  For instance, see this paper written in 2006 pointing out that the search strategies being planned for seeking supersymmetry might fail in the presence of a few extra lightweight particles not predicted in the minimal variants of supersymmetry. As far as I can tell at present, this very big loophole has only partly been closed by the LHC studies done up to now. The same loophole applies for other speculative ideas, including certain variants of LHC-accessible extra dimensions. I am hopeful that these loopholes can be closed in 2013 and 2014, with additional analysis on the current data, but until they are, you should be very cautious believing those who claim that reasonable variants of LHC-accessible supersymmetry (meaning “natural variants of supersymmetry that resolve the hierarchy problem”) are ruled out by the LHC experiments. It’s just not true. Not yet. The only classes of theories that have been almost thoroughly ruled out by LHC data are those that predict on general grounds that there should be no observable Higgs particle at all (e.g. classic technicolor).

While we’re on the subject, I’ve been looking back at how I did on prophecy this year. It’s been a remarkably good year, probably my best ever — though admittedly I only made very easy (though not necessarily common) predictions. First, the really easy one:  I assured you, as did most of my colleagues, that 2012 would be the Year of the Higgs — at least, the Year of the Simplest Possible Higgs particle, called the “Standard Model Higgs”. It would be the year when Phase 1 of the Higgs Search would end — when we’d either find a Higgs particle of Standard Model type (or something looking vaguely like it), or, if not, we’d know we’d have to move to a more aggressive search in Phase 2, in which we’d look for more complicated versions of the Higgs particle that would have been much harder to find. We started the year with ambiguous hints of the Higgs particle, too flimsy to be sure of, but certainly tantalizing, at around a mass of 125 GeV/c2. In July the hints turned into a discovery — somewhat faster than expected for a Standard Model Higgs particle, because the rate for this particle to appear in collisions that produce two photons was higher than anticipated. The excess in the photon signal means either the probability for the Higgs particle to decay to photons is larger than predicted for a Higgs of Standard Model type, or both CMS and ATLAS experienced a fortunate statistical fluctuation that made the discovery easier. We still don’t know which it was; though we’ll know more by March, this ambiguity may remain with us until 2015.

One prophecy I made all the way back at the beginning of this blog, July 2011, was that the earliest search strategy for the Higgs, through its decays to a lepton, anti-lepton, neutrino and anti-neutrino, wouldn’t end up being crucial in the discovery; it was just too difficult. (In this experimental context, “lepton” refers only to “electron” or “muon”; taus don’t count, for technical reasons.) In the end, I said, it would be decays of the Higgs to two photons and to two lepton/anti-lepton pairs that would be the critical ones, because they would provide a clean signal that would be uncontroversial. And that prophecy was correct; the photon-based and lepton-based searches were the signals that led to discovery.

Now we’ve reached December, and the data seems to imply that except possibly for this overabundance of photons, which still tantalizes us, the various measurements of how the Higgs-like particle is produced and decays are starting to agree, to a precision which is still only moderate, with the predictions of the Standard Model for a Higgs of this mass. Fewer and fewer experts are still suggesting that this is not a Higgs particle. But it will be some years yet — 2018 or later — before measurements are precise enough to start convincing people that this Higgs particle is really of Standard Model type. Many variants of the Standard Model, with new particles and forces, predict that the difference of the real Higgs from a Standard Model Higgs may be subtle, with deviations at the ten percent level or even less. Meanwhile, other Higgs-like particles, with different masses and different properties, might be hiding in the data, and it may take quite a while to track them down. Many years of data collecting and data analysis lie ahead, in Phase 2 of the Higgs search.

Another prophecy I made at the beginning of the year was that Exotic Decays of the Higgs would be a high priority for 2012. You might think this prophecy was wrong, because in fact, so far, there have been very few searches at ATLAS, CMS and LHCb for such decays. But the challenge that required prioritizing these decays wasn’t data analysis; it was the problem of even collecting the data. The problem is that many exotic decays of the Higgs would lead to events that might not be selected by the all-important trigger system that determines which tiny fraction of the LHC’s collisions to store permanently for analysis! At the beginning of 2012 there was a risk that some of these processes would have been dumped by the trigger and irretrievably lost from the 2012 data, making future searches for such decays impossible or greatly degraded. At a hadron collider like the LHC, you have to think ahead! If you don’t consider carefully the analyses you’ll want to do a year or two from now, you may not set the trigger properly today. So although the priority for data analysis in 2012 was to find the Higgs particle and measure its bread-and-butter properties, the fact that the Higgs has come out looking more or less Standard Model-like in 2012 means that focusing on exotic possibilities, including exotic decays, will be one of the obvious places to look for something new, and thus a very high priority for data analysis, in 2013 and 2014. And that’s why, for the trigger — for the collection of the data — exotic decays were a very high priority for 2012. Indeed, one significant use of the new strategy of delayed data streaming at ATLAS and of data parking at CMS (two names for the same thing) was to address this priority. [My participation in this effort, working with experimentalists and with several young theorists, was my most rewarding project of 2012.]  As I explained to you, a Higgs particle with a low mass, such as 125 GeV/c², is very sensitive to the presence of new particles and forces that are otherwise very difficult to detect, and it easily could exhibit one or more types of exotic decays.  So there will be a lot of effort put into looking for signs of exotic decays in 2013 and 2014! I’m very excited about all the work that lies ahead of us.

Now, the prophecy I’d like to make, but cannot — because I do not have any special insight into the answer — is on the question of whether the LHC will make great new discoveries in the future, or whether the LHC has already made its last discovery: a Higgs particle of Standard Model type. Even if the latter is the case, we will need years of data from the LHC in order to distinguish these two possibilities; there’s no way for us to guess. It’s clear that Nature’s holding secrets from us.  We know the Standard Model (the equations we use to describe all the known particles and forces) is not a complete theory of nature, because it doesn’t explain things like dark matter (hey, were dark matter particles perhaps discovered in 2012?), and it doesn’t tell us why, for example, there are six types of quarks, or why the heaviest quark has a mass that is more than 10,000 times larger than the mass of the lightest quarks, etc. What we don’t know is whether the answers to those secrets are accessible to the LHC; does it have enough energy per collision, and enough collisions, for the job?  The only way to find out is to run the LHC, and to dig thoroughly through its data for any sign of anything amiss with the predictions of the Standard Model. This is very hard work, and it will take the rest of the decade (but not until the end of the world.)

In the meantime, please do not fret about the quiet in the tunnel outside Geneva, Switzerland. The LHC will be back, bigger and better (well, at least with more energy per collision) in 2015. And while we wait during the two year shutdown, the experimentalists at ATLAS, CMS, and LHCb will be hard at work, producing many new results from the 2011 and 2012 proton collision data! Even the experiments CDF and DZero from the terminated Tevatron are still writing new papers. In short, fear not: not only isn’t the December solstice of 2012 the end of the world, it doesn’t even signal a temporary stop to the news about the Higgs particle!

—-

One last personal note (just for those with some interest in my future.)

End of the OPERA Story

In case you haven’t yet heard (check my previous post from this morning), neutrinos traveling 730 kilometers from the CERN laboratory to the Gran Sasso laboratory do arrive at the time Einstein’s special relativity predicts they would.

Of course (as the press mostly seems to forget) we knew that.  We knew it because

So the news from the Neutrino 2012 conference in Kyoto, on new data from May 2012 taken by OPERA and three nearby experiments, is no surprise to anyone who was paying attention back in March and early April; it’s exactly what we were expecting.

One thing that almost no one is reporting, as far as I can tell, is that CERN’s research director Sergio Bertolucci did not give the first talk on neutrino speeds in Kyoto.  That talk was given by Marcos Dracos, of OPERA.  Dracos presented both OPERA’s corrected 2011 results (with corrections based on the detailed investigation shown in March of the problems reported back in February) and also the new 2012 results, which were taken with a kind of short-pulse beam similar to that used in OPERA-2.  (A short-pulse beam allows for a neutrino speed measurement to be made rather easily and quickly, at the expense of OPERA’s neutrino oscillation studies, which were the main purpose of building the OPERA experiment.)

Following Dracos’ talk, Bertolucci reported the results of the neighboring Borexino, LVD and ICARUS experiments on the May 2012 data, which along with OPERA are all bathed in the same CERN-to-Gran Sasso neutrino beam, and collected their data simultaneously.  All of the results are preliminary so the numbers below will change in detail.  But they are not going to change very much.  Here they are: neutrinos arrive at a time that differs from expectation by:

  • Borexino: δt = 2.7 ± 1.2 (stat) ± 3 (sys) ns
  • ICARUS: δt = 5.1 ± 1.1 (stat) ± 5.5 (sys) ns
  • LVD: δt = 2.9 ± 0.6 (stat) ± 3 (sys) ns
  • OPERA: δt = 1.6 ± 1.1 (stat) [+ 6.1, -3.7] (sys) ns

(Here “ns” means nanoseconds, and “stat” and “sys” mean statistical and systematic uncertainty.)  The original OPERA result was an early arrival of about 60 nanoseconds, about six standard deviations away from expectations.  You see that all the experiments are consistent with zero early/late arrival to about 1 standard deviation — almost too consistent, in fact, for four experiments.
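(If you want to check the “about 1 standard deviation” statement yourself, here is a quick sketch that combines the statistical and systematic uncertainties in quadrature; that is a common rough treatment, and for OPERA’s asymmetric systematic uncertainty I simply take the larger side.)

    # Quick check of the "consistent with zero to about 1 standard deviation" statement above.
    results_ns = {                      # (delta_t, stat, sys) in nanoseconds, from the list above
        "Borexino": (2.7, 1.2, 3.0),
        "ICARUS":   (5.1, 1.1, 5.5),
        "LVD":      (2.9, 0.6, 3.0),
        "OPERA":    (1.6, 1.1, 6.1),    # taking the larger side of the asymmetric systematic
    }

    for name, (dt, stat, sys) in results_ns.items():
        total = (stat**2 + sys**2) ** 0.5               # stat and sys added in quadrature
        print(f"{name:9s} delta_t = {dt:4.1f} +- {total:.1f} ns  ->  {dt/total:.1f} sigma from zero")
    # Each experiment sits within about one sigma of zero, versus the ~6 sigma of OPERA's original claim.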

So there is no longer any hint of any evidence whatsoever of a problem with the predictions of special relativity, and in particular with the existence of a universal speed limit.

A summing up is called for, but I want to write that carefully.  So unless something else comes up, that’s all for today.

Guess What?! Neutrinos Travel Just Below the Speed of Light

Five out of five experiments agree: neutrinos do not travel faster than the speed limit.

Or more precisely: to within the uncertainties of current measurements, neutrino speed, for neutrinos with energies far larger than their masses, is experimentally indistinguishable from the speed of light in vacuum.  This is just as expected in standard Einsteinian special relativity, which would predict they move just below light speed, by an amount too small to measure with current experiments.
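(To see just how small “too small to measure” is: for a neutrino of energy E far above its rest energy mc², special relativity gives 1 - v/c of roughly (mc²)²/(2E²). The sketch below plugs in an assumed mass of 0.1 eV and an assumed beam energy of 17 GeV; both are illustrative numbers, of the right order for this beam, not measured values.)

    # Minimal numerical illustration of the statement above; the mass and energy are assumptions.
    m_c2_eV = 0.1          # assumed neutrino rest energy, ~0.1 eV (known to be below ~1 eV)
    E_eV = 17e9            # assumed beam energy of order tens of GeV, here 17 GeV
    baseline_m = 730e3     # CERN to Gran Sasso distance, about 730 km
    c = 3.0e8              # speed of light in meters per second

    fractional_deficit = (m_c2_eV / E_eV) ** 2 / 2.0    # 1 - v/c for E much larger than m c^2
    travel_time_s = baseline_m / c
    delay_s = travel_time_s * fractional_deficit

    print(f"1 - v/c ~ {fractional_deficit:.1e}")        # ~2e-23
    print(f"expected late arrival ~ {delay_s:.1e} s")   # ~4e-26 s, versus nanosecond-level precision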

http://press.web.cern.ch/press/PressReleases/Releases2011/PR19.11E.html

Based on data taken in May 2012 using a beam of neutrinos sent from the CERN laboratory to the Gran Sasso lab, the four experiments ICARUS, LVD, Borexino and even OPERA (the source of  all the excitement) find results consistent with the speed of light, with uncertainties (at one-standard-deviation) about 10 times smaller than OPERA’s original measured deviation of neutrino speed from the speed of light.  The new results are consistent with ICARUS’s result from 2011 data.  Moreover, OPERA’s mistaken result from September and November 2011 — a claimed six standard deviations away from the expected speed — has now been corrected, following their detective work presented in March.  Even MINOS, a U.S. experiment, has revised their older result, which was previously slightly discrepant from the speed of light by a small amount (two standard deviations), and they find now that their data too are quite consistent with neutrinos traveling with light speed, though with much less precision in the measurement.

And so with a final quintet, sung in unison, this melodramatic comic OPERA buffa comes to a close.  As with all classic operatic comedies, there’s been crisis, chaos, and a good bit of hilarity, all the while with wise voices speaking reason to no avail, but in the end the overzealous are chastened, the righteous are made whole, everyone (even the miscreant) is happy, and all is well with the world.

Curtain!! Applause!!  Science Triumphant!!

Favorable review to follow when time permits.

A Violation of Lepton Universality?

A brief mention today of a new measurement from the BABAR experimental collaboration, which tests lepton universality (to be explained below) and finds it wanting.

The Basic Story (Slightly Oversimplified)

Within the Standard Model of particle physics (the equations that describe and predict the behavior of all of the known particles and forces), the effects of the weak nuclear force on the three leptons — the electron, the muon and the tau — are all expected to be identical. This is called “lepton universality”.