Supersymmetry is one of the most popular of the speculative ideas that theorists have proposed to understand the puzzle known as the hierarchy problem. It has many wonderful features, ranging from mathematical beauty to its potential ability to explain other puzzles in particle physics, such as the nature of dark matter. It has a number of uncomfortable features too, so you shouldn’t get the impression that it’s all roses. But still, supersymmetry is a very, very good idea. In this article I’ll review the main search for supersymmetry at the LHC, and describe what we know (and don’t know) so far.
Supersymmetry predicts that for each type of known particle (and each Higgs particle), there must be another particle, its superpartner, that shares many of the particle’s features. But because supersymmetry is not exact, and is instead hidden from view (“broken”), the superpartners are all rather heavy. And as there are many ways this hiding (“breaking”) can occur, supersymmetry in a realistic form does not predict what the superparticle masses are. So in principle the experimentalists have to consider all possible variants of supersymmetry, where in each variant the superparticles have different masses. Fortunately, they don’t have to look everywhere. First, we know the superpartners aren’t very light, or we would have found them already. And second, if supersymmetry is in fact the solution of the hierarchy problem, the superpartners of the known particles must not be too much heavier than 1 TeV = 1000 GeV! Keep this number in mind, for later reference!
In the new data from the first half of 2011, scientists with the ATLAS and CMS experiments have mainly been seeking supersymmetry using a technique detailed in this earlier article on the standard way to look for supersymmetry at the LHC. To summarize that article, the experiments look for a surprising number of proton-proton collisions which show
- signs of multiple quarks, antiquarks, or gluons carrying very high energy (and appearing in a detector as high-energy jets, or sprays of particles)
- signs of undetected particles recoiling against these jets (an effect often mis-termed “missing energy”, for historical reasons)
They may also require that the events have charged leptons and/or anti-leptons — 1, 2, or more than 2.
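To see why “missing energy” is really missing *momentum*, note that the colliding protons carry essentially no momentum transverse to the beam, so the transverse momenta of all the visible particles must add up to (nearly) zero; any imbalance is attributed to particles that escaped undetected. Here is a minimal sketch of that bookkeeping, using made-up jet values purely for illustration (these are not taken from any real event):

```python
import math

# Each visible object (e.g. a jet) is described here by its transverse
# momentum pt (in GeV) and azimuthal angle phi around the beam axis.
# Hypothetical values, for illustration only.
visible = [
    (420.0, 0.8),   # jet 1
    (380.0, 1.1),   # jet 2
    (35.0, -2.0),   # a softer jet
]

# Transverse momentum must balance, so the negative vector sum of the
# visible transverse momenta is ascribed to invisible particles.
px = -sum(pt * math.cos(phi) for pt, phi in visible)
py = -sum(pt * math.sin(phi) for pt, phi in visible)
met = math.hypot(px, py)  # magnitude of the "missing" transverse momentum

print(f"missing transverse momentum: {met:.0f} GeV")
```

In this toy example both hard jets point to roughly the same side of the detector, so the imbalance (several hundred GeV) is large, much as in the event sketched in Fig. 1.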
An example of what they are looking for in their detectors is sketched in Fig. 1; see this article, Figs. 2–5. You see two jets heading up and to the right, while nothing visible recoils against them. A signal of supersymmetry might involve finding more events of this type in the data than were expected from known processes.
This isn’t easy, because many collisions look just like Fig. 1 but have nothing to do with supersymmetry. The experimentalists are looking for a small signal within a very large background. (As discussed in some detail in this earlier article [on current hints of the Higgs particle]: A “signal” is the thing you’re looking for. “Background” is everything else that resembles your signal and makes it difficult for you to find it. )
An example of a background that would look similar to the signal in Fig. 1 would be a collision that makes high-energy quarks, anti-quarks or gluons, along with a Z particle that decays to a neutrino and an antineutrino, both of which pass unscathed through the detector; an example is shown on the right of Fig. 2. This and the signal process on the left both produce two jets recoiling against something undetectable, and show up looking like Fig. 1. (There is no reliable way to distinguish jets that come from quarks from those that come from antiquarks or gluons; nor can anything distinguish one particle that leaves no trace from a different one.)
Unfortunately the background on the right is far more common than the signal on the left, so one can’t easily tell whether the signal is present just by looking for collisions that look like Fig. 1. But fortunately the Standard Model is well-known and its equations can be used to predict the background. Though the calculations are hard and the proton is complicated, we can do a pretty good job of predicting the rate for the background and fixing any leftover uncertainties using the data. There are other sources of background too — mainly detector imperfections — that are tougher to control. The experimentalists have to work very hard to make sure they have identified all sources of potential problems.
To get a sense of how hard this problem is, look at Fig. 3, taken from a study by CMS on the 2010 data. In grey is an estimate of the effect of detector mis-measurements, in red the estimated effect of the background shown in Fig. 2. The black triangular dots are the data, showing good agreement with the background estimate — the experimentalists have done their job well. Notice the scale on the vertical axis — it is logarithmic [each numbered tick-mark represents a number 10 times larger than the previous]. 10⁵ is 100,000, so many of the bins at left have tens of thousands of events, while others in the middle have just 1. [Where the numbers are less than 1, an average is implied; thus 0.1 events in a bin means that in ten identical experiments, nine would see 0 events and one would see 1 event in that bin.] In pink is the size and shape of a hypothetical signal that a particular variant of supersymmetry, with a particular choice of superpartner masses, would have generated if it were present. Notice it would have added 1 or 2 events in almost every bin from around 250 GeV to about 500 GeV — in other words, out in this part of the plot the background is small and the signal could have been detected. But since nothing like that is seen in Fig. 3 — most of those bins have zero events — this variant of supersymmetry, along with many others which would show up similarly, is excluded by this graph.
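The bracketed remark about fractional event counts is just Poisson statistics: if a bin’s expected mean is 0.1 events, the chance of seeing zero events is e⁻⁰·¹ ≈ 0.9. A quick simulation illustrates this (a self-contained sketch using Python’s standard library, unrelated to the actual CMS analysis):

```python
import math
import random

random.seed(1)

def poisson(mean):
    """Draw one Poisson-distributed count via Knuth's method:
    multiply uniform draws together until the product drops below exp(-mean)."""
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Simulate many identical "experiments", each observing a bin
# whose expected (average) content is 0.1 events.
trials = 100_000
counts = [poisson(0.1) for _ in range(trials)]

frac_zero = counts.count(0) / trials
print(f"fraction of experiments seeing 0 events: {frac_zero:.2f}")
print(f"average events per experiment: {sum(counts) / trials:.3f}")
```

Roughly nine of every ten simulated experiments see nothing in that bin, while the long-run average comes out near 0.1 — exactly the sense in which a plotted bin can contain “0.1 events”.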
Results presented at the Euro-Physics conference of July 2011 (we will see what happens at the conferences of late August) indicate that neither ATLAS nor CMS yet sees signs of excess events of this type. Nor do they see anything in their other searches for supersymmetry. The analysis of the 2011 data means that many variants of supersymmetry that were not excluded previously are now ruled out.
In particular, if a variant of supersymmetry has either the gluinos or squarks — the partners of the gluons and quarks, placed at upper right in Fig. 4 (from this article on supersymmetry) — with masses around or below 800 to 1000 GeV, then that variant has now probably been excluded. (Many caveats, see below.) This is a major advance over what we knew in March from the 2010 data set. As you may remember from earlier in this article (3rd paragraph), this mass range is viewed by most theorists as the sweet spot for supersymmetry! The 2011 LHC results are putting this sweet spot under pressure.
So supersymmetry might seem to be in considerable trouble with LHC data. Yes… well… this statement isn’t false, but it requires a big caveat. It is true, as long as the three assumptions detailed in this article on how to look for supersymmetry are correct (see in particular Figure 6 and the surrounding discussion). The assumptions are
- in any process, the number of superpartners can only change by an even number;
- the lightest superpartner [which is stable, by the first assumption] is a superpartner of a particle we know (and therefore, to avoid conflict with other data, an undetectable neutralino or sneutrino);
- the superpartners that are affected by the strong nuclear force are significantly heavier than the other superpartners of known particles.
While these assumptions are true in the variants of supersymmetry most popular among particle physicists (especially in Europe), and these variants have seen a significant tightening of the noose, we should resist the temptation to make an unjustifiable logical jump to a statement that “supersymmetry, as a theory that solves the hierarchy problem, is now pretty much ruled out”. One should always be cautious in moving from a restricted empirical observation to a general, existential claim about nature.
Instead, we ought to say what we mean: “Data now significantly constrains certain variants of supersymmetry.” Should one of the assumptions listed above prove false, a broad conclusion now about the absence of supersymmetry will look pretty naive in hindsight. Unfortunately, such conclusions have been drawn by very intelligent and influential experimentalists, back in March and again more recently (though the more recent version was partially repaired after I objected). Well. It seems to me that we should be patient and avoid drawing conclusions, and inadvertently creating headlines, that are not warranted by the data.
You may well ask about variants of supersymmetry that violate one or more of the three assumptions above. How seriously should we take such variants? How constrained are they by current searches at the LHC? The LHC has certainly made a dent in some of them… But that’s yet another article, to come soon. For now you can look here to learn about what can happen when one of those three assumptions is relaxed.
6 thoughts on “What do current LHC results (mid-August 2011) imply about supersymmetry?”
Dr. Strassler, I hope you are in good health. I would like to ask a question that keeps coming to my mind quite frequently, and I hope you will answer it for me: since we can receive all types of signals from the universe, why can’t we receive anything that confirms exactly what happened at the moment this universe was born? And why is there nothing coming from the universe that speaks of any possible past universes? Please pardon my language.
Best regards.
Your question is a good one. The issues are (a) whether anything that we can actually measure carries information about the distant past, and (b) whether any process that occurred between the early universe and the present might have erased or scrambled the information beyond recovery.
One problem for us is that the universe was so hot and dense during the Big Bang that it erased much of the information that we might have obtained from that extremely early period. The period of inflation that may have preceded the hot dense period (see below) erased information even more effectively. However, that said, we’re actually doing pretty well:
1) The cosmic microwave background radiation (the cold photons left over from the hot Big Bang) tells us a lot about what the universe was doing 100,000 years after the Big Bang.
2) If we could measure the cosmic microwave neutrinos (which is a problem of practice, not one of principle — no one has figured out how to do it) that would tell us more details about what the universe was doing just a few minutes after the Big Bang.
3) By studying the bumps and wiggles in the distribution of the microwave radiation across the sky, and studying a few other properties of the current universe, we can infer, by running Einstein’s equations for the universe backward in time, something about what was happening from 100,000 years back to a tiny fraction of a second after the Big Bang began. This brings us back to the end of inflation, a period that probably occurred in the universe’s history (though this is not yet certain), during which the universe was cold and expanding at a spectacular rate, and after which the universe became hot and dense while continuing to expand more slowly. Not surprisingly, however, the further we go back, the more assumptions we have to make, and the less sure we are of our understanding.
4) Eventually we may understand the inflationary period well enough to be confident that we have the right equations for it (or perhaps, we’ll replace inflation with a more correct history.) Maybe that understanding will permit us to go back further, but until we have it, there’s no way to know.
As I understand it, information can be very twisted or mangled, but it cannot be destroyed, which happens to be a consequence of information entropy (a particular form that the general law of entropy takes with regard to information).
It is clear that information from events deep in the past of our universe is badly twisted and mangled by the high energies of the initial moments of the Big Bang, and the process of inflation did not help with this either. At the same time, we cannot assume that the information is not “out there”, even if it is distorted in such a way that we do not comprehend it well (yet).
After all, this very same argument is what proved Stephen Hawking wrong regarding entropy within black holes.
Kind regards, GEN