Of Particular Significance

As many of you will have already read, the Large Hadron Collider [LHC], located at the CERN laboratory in Geneva, Switzerland, has “restarted”. Well, a restart of such a machine, after two years of upgrades, is not a simple matter, and perhaps we should say that the LHC has “begun to restart”. The process of bringing the machine up to speed begins with one weak beam of protons at a time — with no collisions, and with energy per proton at less than 15% of where the beams were back in 2012. That’s all that has happened so far.

If that all checks out, then the LHC operators will start trying to accelerate a beam to higher energy — eventually to record energy, about 60% more than in 2012, when the LHC was last operating.  This is the real test of the upgrade; the thousands of magnets all have to work perfectly. If that all checks out, then two beams will be put in at the same time, one going clockwise and the other counterclockwise. Only then, if that all works, will the beams be made to collide — and the first few collisions of protons will result. After that, the number of collisions per second will increase, gradually. If everything continues to work, we could see the number of collisions become large enough — approaching 1 billion per second — to be scientifically interesting within a couple of months. I would not expect important scientific results before late summer, at the earliest.

This isn’t to say that the current milestone isn’t important. There could easily have been (and there almost were) magnet problems that could have delayed this event by a couple of months. But delays could also occur over the coming weeks… so let’s not expect too much in 2015. Still, the good news is that once the machine gets rolling, be it in May, June, July or beyond, we have three to four years of data ahead of us, which will offer us many new opportunities for discoveries, anticipated and otherwise.

One thing I find interesting and odd is that many of the news articles reported that finding dark matter is the main goal of the newly upgraded LHC. If this is truly the case, then I, and most theoretical physicists I know, didn’t get the memo. After all,

  • dark matter could easily be of a form that the LHC cannot produce (for example, axions, particles that interact only gravitationally, or non-particle-like objects),
  • and even if the LHC finds signs of something that behaves like dark matter (i.e. something that, like neutrinos, cannot be directly detected by LHC’s experiments), it will be impossible for the LHC to prove that it actually is dark matter.  Proof will require input from other experiments, and could take decades to obtain.

What’s my own understanding of LHC’s current purpose? Well, based on 25 years of particle physics research and ten years working almost full time on LHC physics, I would say (and I do say, in my public talks) that the coming several-year run of the LHC is for the purpose of

  1. studying the newly discovered Higgs particle in great detail, checking its properties very carefully against the predictions of the “Standard Model” (the equations that describe the known apparently-elementary particles and forces)  to see whether our current understanding of the Higgs field is complete and correct, and
  2. trying to find particles or other phenomena that might resolve the naturalness puzzle of the Standard Model, a puzzle which makes many particle physicists suspicious that we are missing an important part of the story, and
  3. seeking either dark matter particles or particles that may be shown someday to be “associated” with dark matter.

Finding dark matter itself is a worthy goal, but the LHC may simply not be the right machine for the job, and certainly can’t do the job alone.

Why the discrepancy between these two views of LHC’s purpose? One possibility is that since everybody has heard of dark matter, the goal of finding it is easier for scientists to explain to journalists, even though it’s not central.  And in turn, it is easier for journalists to explain this goal to readers who don’t care to know the real situation.  By the time the story goes to press, all the modifiers and nuances uttered by the scientists are gone, and all that remains is “LHC looking for dark matter”.  Well, stay tuned to this blog, and you’ll get a much more accurate story.

Fortunately a much more balanced story did appear in the BBC, due to Pallab Ghosh…, though as usual in Europe, with rather too much supersymmetry and not enough of other approaches to the naturalness problem.  Ghosh also does mention what I described in the latter part of point 3 above — the possibility of what he calls the “wonderfully evocatively named `dark sector’ ”.  [Mr. Ghosh: back in 2006, well before these ideas were popular, Kathryn Zurek and I named this a “hidden valley”, potentially relevant either for dark matter or the naturalness problem. We like to think this is a much more evocative name.]  A dark sector/hidden valley would involve several types of particles that interact with one another, but interact hardly at all with anything that we and our surroundings are made from.  Typically, one of these types of particles could make up dark matter, but the others would be unsuitable for making dark matter.  So why are these others important?  Because if they are produced at the LHC, they may decay in a fashion that is easy to observe — easier than dark matter itself, which simply exits the LHC experiments without a trace, and can only be inferred from something recoiling against it.  In other words, if such a dark sector [or more generally, a hidden valley of any type] exists, the best targets for LHC’s experiments (and other experiments, such as APEX or SHiP) are often not the stable particles that could form dark matter but their unstable friends and associates.
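As an aside, here is a minimal sketch of what “inferred from something recoiling against it” means in practice. The numbers are invented and this is not any experiment’s actual software: since the colliding protons carry essentially no momentum transverse to the beam, the visible particles’ transverse momenta should add up to zero, so an imbalance points to something invisible.

```python
# Toy illustration (invented numbers, not experiment code): an invisible
# particle is inferred from the transverse-momentum imbalance it leaves behind.

# (px, py) in GeV for the visible particles reconstructed in one collision
visible = [(45.0, 10.0), (-20.0, 35.0), (-5.0, -15.0)]

# Transverse momentum should balance, so whatever went undetected carries
# (approximately) minus the vector sum of everything that was seen.
missing_px = -sum(px for px, _ in visible)
missing_py = -sum(py for _, py in visible)
missing_pt = (missing_px**2 + missing_py**2) ** 0.5

print(f"missing transverse momentum: {missing_pt:.1f} GeV")
# A large, well-measured imbalance suggests that something undetectable,
# such as a neutrino or perhaps a dark-matter-like particle, recoiled
# against the visible particles.
```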

But this will all be irrelevant if the collider doesn’t work, so… first things first.  Let’s all wish the accelerator physicists success as they gradually bring the newly powerful LHC back into full operation, at a record energy per collision and eventually a record collision rate.

POSTED BY Matt Strassler

ON April 6, 2015

Many of you will have read in the last week that unfortunately (though to no one’s surprise after seeing the data from the Planck satellite in the last few months) the BICEP2 experiment’s claim of a discovery of gravitational waves from cosmic inflation has blown away in the interstellar wind. [For my previous posts on BICEP2, including a great deal of background information, click here.] The BICEP2 scientists and the Planck satellite scientists have worked together to come to this conclusion, and written a joint paper on the subject.  Their conclusion is that the potentially exciting effect that BICEP2 observed (“B-mode polarization of the cosmic microwave background on large scales”; these terms are explained here) was due, completely or in large part, to polarized dust in our galaxy (the Milky Way). The story of how they came to this conclusion is interesting, and my goal today is to explain it to non-experts.  Click here to read more.

POSTED BY Matt Strassler

ON February 6, 2015

Hope all of you had a good holiday and a good start to the New Year!

I myself continue to be extraordinarily busy as we move into 2015, but I am glad to say that some of that activity involves communicating science to the public.  In fact, a week from today I will be giving a public talk — really a short talk and a longer question/answer period — in Cambridge, just outside of Boston and not far from MIT. This event is a part of the monthly “CafeSci” series, which is affiliated with the famous NOVA science television programs produced for decades by public TV/Radio station WGBH in Boston.

Note for those of you who have gone to CafeSci events before: it will be in a new venue, not far from Kendall Square. Here’s the announcement:

Tuesday, January 20th at 7pm (about 1 hour long)
Le Laboratoire Cambridge (NEW LOCATION)
http://www.lelaboratoirecambridge.com/
650 East Kendall St, Cambridge, MA

“The Large Hadron Collider Restarts Soon! What Lies Ahead?”

Speaker: Matthew Strassler

“After a long nap, the Large Hadron Collider [LHC], where the Higgs particle was discovered in 2012, will begin operating again in 2015, with more powerful collisions than before. Now that we know Higgs particles exist, what do we want to know about them? What methods can we use to answer our questions? And what is the most important puzzle that we are hoping the LHC will help us solve?”

Public Transit: Red line to Kendall Square, walk straight down 3rd Street, turn right onto Athenaeum Street, and left onto East Kendall

Parking: There is a parking deck – the 650 East Kendall Street Garage – accessible by Linskey Way.

POSTED BY Matt Strassler

ON January 13, 2015

Triggering is an essential part of the Large Hadron Collider [LHC]; there are so many collisions happening each second at the LHC, compared to the number that the experiments can afford to store for later study, that the data about most of the collisions (99.999%) have to be thrown away, completely and permanently, within a second after the collisions occur.  The automated filter, partly hardware and partly software, that is programmed to make the decision as to what to keep and what to discard is called “the trigger”.  This all sounds crazy, but it’s necessary, and it works.   Usually.
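To make the trigger’s role concrete, here is a toy sketch of a per-collision keep-or-discard decision. The structure, threshold, and numbers are invented for illustration; real ATLAS and CMS triggers apply many criteria, partly in custom hardware, and keep only about one collision in 100,000.

```python
import random

THRESHOLD_GEV = 100.0  # invented threshold, purely for illustration

def passes_trigger(event):
    # Keep the event only if it contains at least one energetic object.
    return any(pt > THRESHOLD_GEV for pt in event)

def toy_event():
    # A fake "event": a handful of objects with random transverse momenta (GeV).
    return [random.expovariate(1 / 20.0) for _ in range(random.randint(1, 5))]

n_total = 1_000_000
n_kept = sum(passes_trigger(toy_event()) for _ in range(n_total))
print(f"kept {n_kept} of {n_total} events; the rest are gone for good")
```

The point of the sketch is that the decision is made for each collision in real time, and whatever is discarded can never be recovered.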

Let me give you one very simple example of how things can go wrong, and how the ATLAS and CMS experiments [the two general purpose experiments at the LHC] attempted to address the problem.  Before you read this, you may want to read my last post, which gives an overview of what I’ll be talking about in this one.

Click here to read the rest of the article…

POSTED BY Matt Strassler

ON December 4, 2014

I’m a few days behind (thanks to an NSF grant proposal that had to be finished last week) but I wanted to write a bit more about my visit to CERN, which concluded Nov. 21st in a whirlwind of activity. I was working full tilt on timely issues related to Run 2 of the Large Hadron Collider [LHC], currently scheduled to start early next May.   (You may recall the LHC has been shut down for repairs and upgrades since the end of 2012.)

A certain fraction of my time for the last decade has been taken up by concerns about the LHC experiments’ ability to observe new long-lived particles, specifically ones that aren’t affected by the electromagnetic or strong nuclear forces. (Long-lived particles that are affected by those forces are easier to search for, and are much more constrained by the LHC experiments.  More about them some other time.)

This subject is important to me because it is a classic example of how the trigger systems at LHC experiments could fail us — whereby a spectacular signal of a new phenomenon could be discarded and lost in the very process of taking and storing the data! If no one thinks carefully about the challenges of finding long-lived particles in advance of running the LHC, we can end up losing a huge opportunity, unnecessarily. Fortunately some of us are thinking about it, but we are small in number. It is an uphill battle for those experimenters within ATLAS and CMS [the two general purpose experiments at the LHC] who are working hard to make sure they have the required triggers available. I can’t tell you how many times people within the experiments — even at the Naturalness conference I wrote about recently — have told me “such efforts are hopeless”… despite the fact that their own experiments have already shown, in public, and in some cases published, measurements (including this, this, this, this, this, and this), that it is not. Conversely, many completely practical searches for long-lived particles have not been carried out, often because there was no trigger strategy able to capture them, or because, despite the events having been recorded, no one at ATLAS or CMS has had time or energy to actually search through their data for this signal.

Now what is meant by “long-lived particles”? (more…)

POSTED BY Matt Strassler

ON December 2, 2014

Greetings from the last day of the conference “Naturalness 2014”, where theorists and experimentalists involved with the Large Hadron Collider [LHC] are discussing one of the most widely discussed questions in high-energy physics: are the laws of nature in our universe “natural” (= “generic”), and if not, why not? It’s so widely discussed that one of my concerns coming in to the conference was whether anyone would have anything new to say that hadn’t already been said many times.

What makes the Standard Model’s equations (which are the equations governing the known particles, including the simplest possible Higgs particle) so “unnatural” (i.e. “non-generic”) is that when one combines the Standard Model with, say, Einstein’s gravity equations, or indeed with any other equations involving additional particles and fields, one finds that the parameters in the equations (such as the strength of the electromagnetic force or the interaction of the electron with the Higgs field) must be chosen so that certain effects almost perfectly cancel, to one part in a gazillion* (something like 10³²). If this cancellation fails, the universe described by these equations looks nothing like the one we know. I’ve discussed this non-genericity in some detail here.

*A gazillion, as defined on this website, is a number so big that it even makes particle physicists and cosmologists flinch. [From Old English, gajillion.]
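To give a rough sense of where a number like 10³² comes from, here is my own illustrative reconstruction (not something stated above), assuming the corrections to the Higgs field’s mass-squared parameter are as large as the square of a very high energy scale Λ, taken here to be near the Planck scale, roughly 10¹⁸ GeV:

```latex
% Schematic fine-tuning estimate (illustrative assumptions, see text above)
m_H^2\big|_{\text{observed}} = m_H^2\big|_{\text{bare}} + \delta m_H^2 ,
\qquad \delta m_H^2 \sim \Lambda^2 ,
\qquad
\frac{m_H^2\big|_{\text{observed}}}{\Lambda^2}
\sim \frac{(125\ \text{GeV})^2}{(10^{18}\ \text{GeV})^2} \sim 10^{-32} .
```

In words: the bare term and the corrections would have to cancel against each other to about one part in 10³² to leave behind the small observed value.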

Most theorists who have tried to address the naturalness problem have tried adding new principles, and consequently new particles, to the Standard Model’s equations, so that this extreme cancellation is no longer necessary, or so that the cancellation is automatic, or something to this effect. Their suggestions have included supersymmetry, warped extra dimensions, little Higgs, etc…. but importantly, these examples are only natural if the lightest of the new particles that they predict have masses that are around or below 1 TeV/c², and must therefore be directly observable at the LHC (with a few very interesting exceptions, which I’ll talk about some other time). The details are far too complex to go into here, but the constraints from what was not discovered at LHC in 2011-2012 imply that most of these examples don’t work perfectly. Some partial non-automatic cancellation, not at one part in a gazillion but at one part in 100, seems to be necessary for almost all of the suggestions made up to now.

So what are we to think of this? (more…)

POSTED BY Matt Strassler

ON November 17, 2014
