Category Archives: LHC News

CMS and ATLAS present their results

CMS results are being presented by Jim Olsen of Princeton University.

CMS had magnet trouble this year, due to problems with its cooling system, but was able to record about 3/4 of the data with the magnet on.

The diboson excess widely discussed this summer is, perhaps not surprisingly, not confirmed.  Same for the old dilepton excesses.

With certain assumptions, limits on gluinos jump from 1.3–1.4 TeV to 1.6–1.7 TeV.

Big improvement in limits on “Black Holes” or anything else dramatic at very high energy (as we saw also in my post yesterday about ATLAS multijet events.)

Top-primes — limits jump from about 800 GeV to about 950 GeV, again with assumptions.

Some new limits on invisible particles.  W’ resonances ruled out up to 4.2 TeV if they decay to leptons, to 2.4 TeV if they decay to top quark + bottom antiquark (with assumptions.)  No dijet bumps or other unusual dijet behavior.  No dilepton bumps up to 2.5 – 3.1 TeV for simple assumptions.

Diphotons (with 2.6 inverse fb of data)! (Olsen shows an event at 745 GeV).  All diphoton events used.  Peak?  Yes!!  BUT: local 2.6 standard deviations, and with the look elsewhere effect, only 1.2 standard deviations. Not impressive.   Such a peak is not inconsistent with previous results, but doesn’t look like a signal.  Still… combining old and new data we see a signal at 3 standard deviations local, 1.7 standard deviations globally after look elsewhere effect.
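For readers wondering about the jargon: the “local” significance is how unlikely a fluctuation is at one particular mass, while the “global” number corrects for the fact that a bump could have appeared at any of many masses (the look-elsewhere effect). The experiments compute this correction carefully; what follows is only a toy version of the idea, in which I assume the search amounts to some number of independent mass windows:

```python
import math

def z_to_p(z):
    """One-sided p-value corresponding to a significance of z standard deviations."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def global_p(p_local, n_trials):
    """Crude look-elsewhere correction: the chance of a fluctuation at least this
    unlikely appearing in ANY of n_trials independent search windows."""
    return 1.0 - (1.0 - p_local) ** n_trials

# CMS quotes 2.6 sigma local and 1.2 sigma global; in this toy picture we can
# back out the effective number of independent search windows that implies.
p_loc, p_glob = z_to_p(2.6), z_to_p(1.2)
n = math.log(1.0 - p_glob) / math.log(1.0 - p_loc)
print(f"implied effective number of search windows: {n:.0f}")
```

The real analyses use more sophisticated methods (the bump can sit anywhere and have any width), but the qualitative lesson is the same: the more places a bump could have appeared, the less impressive any single local fluctuation is.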

Also the peak is rather ragged, though this doesn’t imply anything in particular; it is worth noting.  If you assume the peak comes from a wider bump, the significance goes down.

Now on to ATLAS, with results presented by Marumi Kado (from the French Laboratoire de l’Accélérateur Linéaire in Orsay).

ATLAS has 1.2-1.5 times more usable data than CMS.  This could be important.

Look for Higgs in four leptons.  Big statistical fluke!  They see fewer events than expected! This is, of course, no big deal… if you expect 6 events it is no surprise if you happen to see 2.
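To back up that claim with a sanity check, here is a minimal sketch (my own toy Python, not anything the experiments use) of the Poisson probability of such a downward fluctuation:

```python
import math

def poisson_cdf(k_max, mean):
    """Probability of observing k_max or fewer events when `mean` events
    are expected, for a Poisson-distributed count."""
    return sum(math.exp(-mean) * mean**k / math.factorial(k)
               for k in range(k_max + 1))

# Expecting 6 events, how surprising is it to see 2 or fewer?
p = poisson_cdf(2, 6.0)
print(f"P(N <= 2 | mean = 6) = {p:.3f}")  # about 0.06: uncommon, but hardly shocking
```

A roughly 6% chance is the kind of fluctuation that happens all the time when dozens of measurements are being made at once.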

No peak in two Z’s at higher mass (i.e. no heavy Higgs seen.)  Some improvement in searches for Heavy Higgs particles decaying to taus at higher mass.

Limits on gluinos (with assumptions) go from 1.2-1.4 TeV to 1.4-1.8 TeV. (Got an improvement by looking for boosted top quarks in the case where gluinos decay to top quarks.)  Bottom squarks (with assumptions) — limits go from 650 GeV to 850 GeV.

The excess in Z + jets + invisible particles in high energy events remains in Run 2, a little smaller than in Run 1 but still there.  [Run 1: 10 expected, 29 observed; Run 2: 10 expected, 21 observed.] CMS still doesn’t see it.  What’s the story here?
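A rough way to see why this excess keeps drawing attention: the chance of a Poisson upward fluctuation this large is quite small. The toy calculation below (mine, not the experiments’ actual statistical treatment) deliberately ignores the important uncertainty on the background estimate, so the true significance is lower than these numbers suggest:

```python
import math

def poisson_tail(n_obs, mean):
    """Probability of observing n_obs or MORE events when `mean` are expected."""
    return 1.0 - sum(math.exp(-mean) * mean**k / math.factorial(k)
                     for k in range(n_obs))

# Run 1: 29 observed vs 10 expected; Run 2: 21 observed vs 10 expected.
for run, n in (("Run 1", 29), ("Run 2", 21)):
    print(f"{run}: P(N >= {n} | mean = 10) = {poisson_tail(n, 10.0):.1e}")
```

Even with generous systematic uncertainties folded in, fluctuations like these are rare — which is exactly why the fact that CMS doesn’t see the effect is so puzzling.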

Dijets (as I wrote about yesterday.)  Kado shows the highest-energy dijet event ever observed by humans.  Nothing unusual in photon + jet. Nothing in dileptons — limits on typical Z’ bosons in the 3-3.4 TeV range, W’ decaying to leptons limited up to 4.1 TeV.

DIPHOTONS. Here we go.

A completely generic search for photon pairs; nothing special or unusual.  Looking for a bump with widths ranging from narrow to large.  3.6 standard deviations local; global significance is 1.9 standard deviations.  Looks amusingly similar to the first hint of a Higgs bump from four years ago!  Large width preferred, as much as 45 GeV; local significance goes up to 3.9 standard deviations, 2.3 after look elsewhere.  Mass about 750 GeV.  Hmm.  No indication as to why they should have been more efficient than in Run 1, or why such an excess wouldn’t have been seen in Run 1.

WW or WZ or ZZ, where there was an excess in Run 1.  As with CMS, no excess seen in Run 2.  WH, ZH: Nothing unusual.

Ok, now for the questions. The diphoton bump seen, with moderate significance in ATLAS and low significance at CMS, is very interesting, but without more information and more thought and discussion, it’s premature to say anything definitive.

Kado says: Run 1 two-photon data was reanalyzed by ATLAS and it is compatible with the Run 2 bump for large width at 1.4 standard deviations, less compatible for narrow width at more than 2 standard deviations.  They have not combined Run 1 and Run 2 data yet.

Kado says: the diphoton excess events look like the background, with no sign of extra energetic jets, invisible particles, etc; nothing that indicates a signal with widely different properties sitting over the standard two-photon background.  (Obviously — if it had been otherwise they could have used this to reduce background and claim something more significant.)  There are about 40 events in the peak region (but how wide is he taking it to be?) Olsen: CMS has 10 events in the same region, too little to say much.

Conclusion?  The Standard Model isn’t dead yet… but we need to watch this closely… or think of another question.




Exciting Day Ahead at LHC

Four years ago, almost to the day, at CERN, the laboratory that hosts the Large Hadron Collider [LHC], Fabiola Gianotti, spokesperson for the ATLAS experiment, delivered the first talk in a presentation on 2011 LHC data. Speaking to the assembled scientists and dignitaries, she presented the message that energized the physics community: a little bump had shown up on a plot. Continue reading

First Big Results from LHC at 13 TeV

A few weeks ago, the Large Hadron Collider [LHC] ended its 2015 data taking of 13 TeV proton-proton collisions.  This month we’re getting our first look at the data.

Already the ATLAS experiment has put out two results which are a significant and impressive contribution to human knowledge.  CMS has one as well (sorry to have overlooked it the first time, but it isn’t posted on the usual Twiki page for some reason.) Continue reading

LHC Starts Collisions; and a Radio Interview Tonight

In the long and careful process of restarting the Large Hadron Collider [LHC] after its two-year nap for upgrades and repairs, another milestone has been reached: protons have once again collided inside the LHC’s experimental detectors (named ATLAS, CMS, LHCb and ALICE). This is good news, but don’t get excited yet. It’s just one small step. These are collisions at the lowest energy at which the LHC operates (450 GeV per proton, to be compared with the 4000 GeV per proton in 2012 and the 6500 GeV per proton they’ve already achieved in the last month, though in non-colliding beams.) Also the number of protons in the beams, and the number of collisions per second, are still very, very small compared to what will be needed. So discoveries are not imminent!  Yesterday’s milestone was just one of the many little tests that are made to assure that the LHC is properly set up and ready for the first full-energy collisions, which should start in about a month.

But since full-energy collisions are on the horizon, why not listen to a radio show about what the LHC will be doing after its restart is complete? Today (Wednesday May 6th), Virtually Speaking Science, on which I have appeared a couple of times before, will run a program at 5 pm Pacific time (8 pm Eastern). Science writer Alan Boyle will be interviewing me about the LHC’s plans for the next few months and the coming years. You can listen live, or listen later once they post it.  Here’s the link for the program.

Dark Matter: How Could the Large Hadron Collider Discover It?

Dark Matter. Its existence is still not 100% certain, but if it exists, it is exceedingly dark, both in the usual sense — it doesn’t emit light or reflect light or scatter light — and in a more general sense — it doesn’t interact much, in any way, with ordinary stuff, like tables or floors or planets or  humans. So not only is it invisible (air is too, after all, so that’s not so remarkable), it’s actually extremely difficult to detect, even with the best scientific instruments. How difficult? We don’t even know, but certainly more difficult than neutrinos, the most elusive of the known particles. The only way we’ve been able to detect dark matter so far is through the pull it exerts via gravity, which is big only because there’s so much dark matter out there, and because it has slow but inexorable and remarkable effects on things that we can see, such as stars, interstellar gas, and even light itself.

About a week ago, the mainstream press was reporting, inaccurately, that the leading aim of the Large Hadron Collider [LHC], after its two-year upgrade, is to discover dark matter. [By the way, on Friday the LHC operators made the first beams with energy-per-proton of 6.5 TeV, a new record and a major milestone in the LHC’s restart.]  There are many problems with such a statement, as I commented in my last post, but let’s leave all that aside today… because it is true that the LHC can look for dark matter.   How?

When people suggest that the LHC can discover dark matter, they are implicitly assuming

  • that dark matter exists (very likely, but perhaps still with some loopholes),
  • that dark matter is made from particles (which isn’t established yet) and
  • that dark matter particles can be commonly produced by the LHC’s proton-proton collisions (which need not be the case).

You can question these assumptions, but let’s accept them for now.  The question for today is this: since dark matter barely interacts with ordinary matter, how can scientists at an LHC experiment like ATLAS or CMS, which is made from ordinary matter of course, have any hope of figuring out that they’ve made dark matter particles?  What would have to happen before we could see a BBC or New York Times headline that reads, “Large Hadron Collider Scientists Claim Discovery of Dark Matter”?

Well, to address this issue, I’m writing an article in three stages. Each stage answers one of the following questions:

  1. How can scientists working at ATLAS or CMS be confident that an LHC proton-proton collision has produced an undetected particle — whether this be simply a neutrino or something unfamiliar?
  2. How can ATLAS or CMS scientists tell whether they are making something new and Nobel-Prizeworthy, such as dark matter particles, as opposed to making neutrinos, which they do every day, many times a second?
  3. How can we be sure, if ATLAS or CMS discovers they are making undetected particles through a new and unknown process, that they are actually making dark matter particles?

My answer to the first question is finished; you can read it now if you like.  The second and third answers will be posted later during the week.

But if you’re impatient, here are highly compressed versions of the answers, in a form which is accurate, but admittedly not very clear or precise.

  1. Dark matter particles, like neutrinos, would not be observed directly. Instead their presence would be indirectly inferred, by observing the behavior of other particles that are produced alongside them.
  2. It is impossible to directly distinguish dark matter particles from neutrinos or from any other new, equally undetectable particle. But the equations used to describe the known elementary particles (the “Standard Model”) predict how often neutrinos are produced at the LHC. If the number of neutrino-like objects is larger than the predictions, that will mean something new is being produced.
  3. To confirm that dark matter is made from LHC’s new undetectable particles will require many steps and possibly many decades. Detailed study of LHC data can allow properties of the new particles to be inferred. Then, if other types of experiments (e.g. LUX or CoGeNT or Fermi) detect dark matter itself, they can check whether it shares the same properties as LHC’s new particles. Only then can we know if LHC discovered dark matter.
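The inference in the first point is, at heart, just momentum conservation in the plane transverse to the beams: if the detected particles’ transverse momenta don’t add up to zero, something invisible must be carrying the difference. A schematic illustration, with toy numbers of my own invention rather than real LHC code:

```python
import math

def missing_transverse_momentum(visible_particles):
    """Given the (px, py) of every detected particle in a collision, return the
    magnitude of the momentum imbalance transverse to the beam.  A large
    imbalance signals undetected particles recoiling against the visible ones."""
    px = sum(p[0] for p in visible_particles)
    py = sum(p[1] for p in visible_particles)
    return math.hypot(px, py)

# Toy event (momenta in GeV): two visible objects that don't balance each other,
# so something invisible must carry the remaining transverse momentum.
event = [(120.0, 10.0), (-60.0, -5.0)]
print(f"missing transverse momentum: {missing_transverse_momentum(event):.1f} GeV")  # 60.2 GeV here
```

Of course, neutrinos produce exactly this signature too, which is why point 2 — comparing the rate of such events against the Standard Model prediction — is essential.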

I realize these brief answers are cryptic at best, so if you want to learn more, please check out my new article.

The LHC restarts — in a manner of speaking

As many of you will have already read, the Large Hadron Collider [LHC], located at the CERN laboratory in Geneva, Switzerland, has “restarted”. Well, a restart of such a machine, after two years of upgrades, is not a simple matter, and perhaps we should say that the LHC has “begun to restart”. The process of bringing the machine up to speed begins with one weak beam of protons at a time — with no collisions, and with energy per proton at less than 15% of where the beams were back in 2012. That’s all that has happened so far.

If that all checks out, then the LHC operators will start trying to accelerate a beam to higher energy — eventually to record energy, 40% more than in 2012, when the LHC last was operating.  This is the real test of the upgrade; the thousands of magnets all have to work perfectly. If that all checks out, then two beams will be put in at the same time, one going clockwise and the other counterclockwise. Only then, if that all works, will the beams be made to collide — and the first few collisions of protons will result. After that, the number of collisions per second will increase, gradually. If everything continues to work, we could see the number of collisions become large enough — approaching 1 billion per second — to be scientifically interesting within a couple of months. I would not expect important scientific results before late summer, at the earliest.

This isn’t to say that the current milestone isn’t important. There could easily have been (and there almost were) magnet problems that could have delayed this event by a couple of months. But delays could also occur over the coming weeks… so let’s not expect too much in 2015. Still, the good news is that once the machine gets rolling, be it in May, June, July or beyond, we have three to four years of data ahead of us, which will offer us many new opportunities for discoveries, anticipated and otherwise.

One thing I find interesting and odd is that many of the news articles reported that finding dark matter is the main goal of the newly upgraded LHC. If this is truly the case, then I, and most theoretical physicists I know, didn’t get the memo. After all,

  • dark matter could easily be of a form that the LHC cannot produce (for example, axions, or particles that interact only gravitationally, or non-particle-like objects),
  • and even if the LHC finds signs of something that behaves like dark matter (i.e. something that, like neutrinos, cannot be directly detected by LHC’s experiments), it will be impossible for the LHC to prove that it actually is dark matter.  Proof will require input from other experiments, and could take decades to obtain.

What’s my own understanding of LHC’s current purpose? Well, based on 25 years of particle physics research and ten years working almost full time on LHC physics, I would say (and I do say, in my public talks) that the coming several-year run of the LHC is for the purpose of

  1. studying the newly discovered Higgs particle in great detail, checking its properties very carefully against the predictions of the “Standard Model” (the equations that describe the known apparently-elementary particles and forces)  to see whether our current understanding of the Higgs field is complete and correct, and
  2. trying to find particles or other phenomena that might resolve the naturalness puzzle of the Standard Model, a puzzle which makes many particle physicists suspicious that we are missing an important part of the story, and
  3. seeking either dark matter particles or particles that may be shown someday to be “associated” with dark matter.

Finding dark matter itself is a worthy goal, but the LHC may simply not be the right machine for the job, and certainly can’t do the job alone.

Why the discrepancy between these two views of LHC’s purpose? One possibility is that since everybody has heard of dark matter, the goal of finding it is easier for scientists to explain to journalists, even though it’s not central.  And in turn, it is easier for journalists to explain this goal to readers who don’t care to know the real situation.  By the time the story goes to press, all the modifiers and nuances uttered by the scientists are gone, and all that remains is “LHC looking for dark matter”.  Well, stay tuned to this blog, and you’ll get a much more accurate story.

Fortunately a much more balanced story did appear in the BBC, due to Pallab Ghosh…, though as usual in Europe, with rather too much supersymmetry and not enough of other approaches to the naturalness problem.   Ghosh also does mention what I described in the italicized part of point 3 above — the possibility of what he calls the “wonderfully evocatively named `dark sector’ ”.  [Mr. Ghosh: back in 2006, well before these ideas were popular, Kathryn Zurek and I named this a “hidden valley”, potentially relevant either for dark matter or the naturalness problem. We like to think this is a much more evocative name.]  A dark sector/hidden valley would involve several types of particles that interact with one another, but interact hardly at all with anything that we and our surroundings are made from.  Typically, one of these types of particles could make up dark matter, but the others would be unsuitable for making dark matter.  So why are these others important?  Because if they are produced at the LHC, they may decay in a fashion that is easy to observe — easier than dark matter itself, which simply exits the LHC experiments without a trace, and can only be inferred from something recoiling against it.   In other words, if such a dark sector [or more generally, a hidden valley of any type] exists, the best targets for LHC’s experiments (and other experiments, such as APEX or SHiP) are often not the stable particles that could form dark matter but their unstable friends and associates.

But this will all be irrelevant if the collider doesn’t work, so… first things first.  Let’s all wish the accelerator physicists success as they gradually bring the newly powerful LHC back into full operation, at a record energy per collision and eventually a record collision rate.

Giving Public Talk Jan. 20th in Cambridge, MA

Hope all of you had a good holiday and a good start to the New Year!

I myself continue to be extraordinarily busy as we move into 2015, but I am glad to say that some of that activity involves communicating science to the public.  In fact, a week from today I will be giving a public talk — really a short talk and a longer question/answer period — in Cambridge, just outside of Boston and not far from MIT. This event is a part of the monthly “CafeSci” series, which is affiliated with the famous NOVA science television programs produced for decades by public TV/Radio station WGBH in Boston.

Note for those of you who have gone to CafeSci events before: it will be in a new venue, not far from Kendall Square. Here’s the announcement:

Tuesday, January 20th at 7pm (about 1 hour long)
Le Laboratoire Cambridge (NEW LOCATION)
650 East Kendall St, Cambridge, MA

“The Large Hadron Collider Restarts Soon! What Lies Ahead?”

Speaker: Matthew Strassler

“After a long nap, the Large Hadron Collider [LHC], where the Higgs particle was discovered in 2012, will begin operating again in 2015, with more powerful collisions than before. Now that we know Higgs particles exist, what do we want to know about them? What methods can we use to answer our questions? And what is the most important puzzle that we are hoping the LHC will help us solve?”

Public Transit: Red line to Kendall Square, walk straight down 3rd Street, turn right onto Athenaeum Street, and left onto East Kendall

Parking: There is a parking deck – the 650 East Kendall Street Garage – accessible by Linskey Way.

At the Naturalness 2014 Conference

Greetings from the last day of the conference “Naturalness 2014”, where theorists and experimentalists involved with the Large Hadron Collider [LHC] are discussing one of the most widely-discussed questions in high-energy physics: are the laws of nature in our universe “natural” (= “generic”), and if not, why not? It’s so widely discussed that one of my concerns coming in to the conference was whether anyone would have anything new to say that hadn’t already been said many times.

What makes the Standard Model’s equations (which are the equations governing the known particles, including the simplest possible Higgs particle) so “unnatural” (i.e. “non-generic”) is that when one combines the Standard Model with, say, Einstein’s gravity equations, or indeed with any other equations involving additional particles and fields, one finds that the parameters in the equations (such as the strength of the electromagnetic force or the interaction of the electron with the Higgs field) must be chosen so that certain effects almost perfectly cancel, to one part in a gazillion* (something like 10³²). If this cancellation fails, the universe described by these equations looks nothing like the one we know. I’ve discussed this non-genericity in some detail here.
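To make the “one part in a gazillion” concrete, here is a back-of-envelope version of the arithmetic. The cutoff scale is my illustrative assumption (I’ve taken it near the Planck scale), and the real calculation is far more subtle, but the order of magnitude is the point:

```python
# Back-of-envelope illustration of the fine-tuning described above (a cartoon,
# not a real calculation): if quantum corrections to the Higgs mass-squared are
# naturally of order the cutoff scale squared, then two enormous terms must
# cancel to leave the observed value, and the precision of that cancellation
# is roughly (m_Higgs / cutoff)**2.
m_higgs = 125.0    # GeV: the observed Higgs mass
cutoff  = 1.0e18   # GeV: an assumed cutoff scale, near the Planck scale
tuning = (m_higgs / cutoff) ** 2
print(f"required cancellation: one part in {1.0 / tuning:.1e}")  # of order 10^32
```

Shift the assumed cutoff and the required precision shifts with it — which is why theorists hoped for new particles near 1 TeV, where essentially no cancellation would be needed.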

*A gazillion, as defined on this website, is a number so big that it even makes particle physicists and cosmologists flinch. [From Old English, gajillion.]

Most theorists who have tried to address the naturalness problem have tried adding new principles, and consequently new particles, to the Standard Model’s equations, so that this extreme cancellation is no longer necessary, or so that the cancellation is automatic, or something to this effect. Their suggestions have included supersymmetry, warped extra dimensions, little Higgs, etc…. but importantly, these examples are only natural if the lightest of the new particles that they predict have masses that are around or below 1 TeV/c², and must therefore be directly observable at the LHC (with a few very interesting exceptions, which I’ll talk about some other time). The details are far too complex to go into here, but the constraints from what was not discovered at LHC in 2011-2012 imply that most of these examples don’t work perfectly. Some partial non-automatic cancellation, not at one part in a gazillion but at one part in 100, seems to be necessary for almost all of the suggestions made up to now.

So what are we to think of this? Continue reading