Category Archives: Particle Physics

How a Trigger Can Potentially Make or Break an LHC Discovery

Triggering is an essential part of the Large Hadron Collider [LHC]; there are so many collisions happening each second at the LHC, compared to the number that the experiments can afford to store for later study, that the data about most of the collisions (99.999%) have to be thrown away immediately, completely and permanently within a second after the collisions occur.  The automated filter, partly hardware and partly software, that is programmed to make the decision as to what to keep and what to discard is called “the trigger”.  This all sounds crazy, but it’s necessary, and it works.   Usually.
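To see why the discarded fraction is so extreme, here is a back-of-envelope sketch (with illustrative round numbers of my own, not official ATLAS or CMS rates): bunches of protons cross tens of millions of times per second, while only a few hundred events per second can be written to permanent storage.

```python
# Back-of-envelope trigger budget with illustrative (not official) numbers:
# bunch crossings at roughly 40 MHz versus a few hundred events per second
# that can be permanently stored for later study.
crossings_per_second = 40_000_000   # ~40 MHz bunch-crossing rate (approximate)
stored_per_second = 400             # order-of-magnitude storage rate (assumed)

kept_fraction = stored_per_second / crossings_per_second
discarded_percent = 100 * (1 - kept_fraction)

print(f"fraction kept: {kept_fraction:.1e}")       # ~1.0e-05
print(f"discarded:     {discarded_percent:.3f}%")  # ~99.999%
```

Whatever the trigger does not select is gone for good, which is what makes its design so consequential.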

Let me give you one very simple example of how things can go wrong, and how the ATLAS and CMS experiments [the two general purpose experiments at the LHC] attempted to address the problem.  Before you read this, you may want to read my last post, which gives an overview of what I’ll be talking about in this one.

Click here to read the rest of the article…

Final Days of Busy Visit to CERN

I’m a few days behind (thanks to an NSF grant proposal that had to be finished last week) but I wanted to write a bit more about my visit to CERN, which concluded Nov. 21st in a whirlwind of activity. I was working full tilt on timely issues related to Run 2 of the Large Hadron Collider [LHC], currently scheduled to start early next May.   (You may recall the LHC has been shut down for repairs and upgrades since the end of 2012.)

A certain fraction of my time for the last decade has been taken up by concerns about the LHC experiments’ ability to observe new long-lived particles, specifically ones that aren’t affected by the electromagnetic or strong nuclear forces. (Long-lived particles that are affected by those forces are easier to search for, and are much more constrained by the LHC experiments.  More about them some other time.)

This subject is important to me because it is a classic example of how the trigger systems at LHC experiments could fail us — whereby a spectacular signal of a new phenomenon could be discarded and lost in the very process of taking and storing the data! If no one thinks carefully about the challenges of finding long-lived particles in advance of running the LHC, we can end up losing a huge opportunity, unnecessarily. Fortunately some of us are thinking about it, but we are small in number. It is an uphill battle for those experimenters within ATLAS and CMS [the two general purpose experiments at the LHC] who are working hard to make sure they have the required triggers available. I can’t tell you how many times people within the experiments — even at the Naturalness conference I wrote about recently — have told me “such efforts are hopeless”… despite the fact that their own experiments have already shown, in public and in some cases published measurements (including this, this, this, this, this, and this), that it is not. Conversely, many completely practical searches for long-lived particles have not been carried out, often because there was no trigger strategy able to capture them, or because, despite the events having been recorded, no one at ATLAS or CMS has had time or energy to actually search through their data for this signal.

Now what is meant by “long-lived particles”? Continue reading

At the Naturalness 2014 Conference

Greetings from the last day of the conference “Naturalness 2014”, where theorists and experimentalists involved with the Large Hadron Collider [LHC] are discussing one of the most widely-discussed questions in high-energy physics: are the laws of nature in our universe “natural” (= “generic”), and if not, why not? It’s so widely discussed that one of my concerns coming in to the conference was whether anyone would have anything new to say that hadn’t already been said many times.

What makes the Standard Model’s equations (which are the equations governing the known particles, including the simplest possible Higgs particle) so “unnatural” (i.e. “non-generic”) is that when one combines the Standard Model with, say, Einstein’s gravity equations, or indeed with any other equations involving additional particles and fields, one finds that the parameters in the equations (such as the strength of the electromagnetic force or the interaction of the electron with the Higgs field) must be chosen so that certain effects almost perfectly cancel, to one part in a gazillion* (something like 10³²). If this cancellation fails, the universe described by these equations looks nothing like the one we know. I’ve discussed this non-genericity in some detail here.

*A gazillion, as defined on this website, is a number so big that it even makes particle physicists and cosmologists flinch. [From Old English, gajillion.]
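To see where a number like 10³² comes from, here is the standard back-of-envelope estimate (a rough sketch, assuming the Standard Model’s equations hold all the way up to the Planck scale, Λ ≈ 10¹⁹ GeV): quantum effects contribute to the Higgs mass-squared an amount of order Λ²/(16π²), so the mismatch between the natural size of these contributions and the observed value is roughly

$$
\frac{\Lambda^2}{16\pi^2\, m_H^2} \;\approx\; \frac{\left(10^{19}\ \mathrm{GeV}\right)^2}{16\pi^2 \times \left(125\ \mathrm{GeV}\right)^2} \;\sim\; 10^{32}.
$$

The parameters must therefore be chosen so that the contributions cancel to one part in 10³². If instead new particles cut the contributions off not far above 1 TeV, the required cancellation drops to something far milder, in the ballpark of the one part in 100 discussed below.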

Most theorists who have tried to address the naturalness problem have tried adding new principles, and consequently new particles, to the Standard Model’s equations, so that this extreme cancellation is no longer necessary, or so that the cancellation is automatic, or something to this effect. Their suggestions have included supersymmetry, warped extra dimensions, little Higgs, etc…. but importantly, these examples are only natural if the lightest of the new particles that they predict have masses around or below 1 TeV/c², making them directly observable at the LHC (with a few very interesting exceptions, which I’ll talk about some other time). The details are far too complex to go into here, but the constraints from what was not discovered at the LHC in 2011-2012 imply that most of these examples don’t work perfectly. Some partial non-automatic cancellation, not at one part in a gazillion but at one part in 100, seems to be necessary for almost all of the suggestions made up to now.

So what are we to think of this? Continue reading

Day 2 At CERN

Day 2 of my visit to CERN (host laboratory of the Large Hadron Collider [LHC]) was a pretty typical CERN day for me. Here’s a rough sketch of how it panned out:

  • 1000: After a few chores, arrived at CERN by tram. Worked on my ongoing research project #1. Answered an email about my ongoing research project #2.
  • 1100: Attended a one-hour talk, much of it historical, by Chris Quigg, one of the famous experts on “quarkonium” (atom-like objects made from a quark and its antiquark, the term generally referring specifically to charm and bottom quarks). Charmonium (charm quark/antiquark atoms) was discovered 40 years ago this week, in two very different experiments.
  • 1200: Started work on the talk that I am giving on the afternoon of Day 3 to some experimentalists who work at ATLAS. [ATLAS and CMS are the two general-purpose experimental detectors at the LHC; they were used to discover the Higgs particle.] It involves some new insights concerning the search for long-lived particles (hypothesized new particles that would typically travel a distance of at least a millimeter, and possibly a meter or more, before decaying to other particles; a rough decay-length estimate appears just after this list).
  • 1230: Working lunch with an experimentalist from ATLAS and another theorist, mainly discussing triggering and related issues concerning long-lived particles. Learned a lot about the new opportunities that ATLAS will have starting in 2015.
  • 1400: In an extended discussion with two other theorists, got a partial answer to a subtle question that arose in my research project #2.
  • 1415: Sent an email to my collaborators on research project #2.
  • 1430: Back to work on my talk for Day 3. Reading some relevant papers, drawing some illustrations, etc.
  • 1600: Two-hour conversation over coffee with an experimentalist from CMS, yet again about triggering, regarding long-lived particles, exotic decays of the Higgs particle, and both at once. Learned a lot of important things about CMS’s plans for the near-term and medium-term future, as well as some of the subtle issues with collecting and analyzing data that are likely to arise in 2015, when the LHC begins running again.

[Why triggering, triggering, triggering? Because if you don’t collect the data in the first place, you can’t analyze it later!  We have to be working on triggering in 2014-2015, before the LHC takes data again in 2015-2018.]

  • 1800: An hour to work on the talk again.
  • 1915: Skype conversation with two of my collaborators in research project #1, about a difficult challenge that had been troubling me for over a week. Subtle theoretical issues and heavy-duty discussion, but worth it in the end; most of the issues look like they may be resolvable.
  • 2100: Noticed the time and that I hadn’t eaten dinner yet. Went to the CERN cafeteria and ate dinner while answering emails.
  • 2130: More work on the talk for Day 3.
  • 2230: Left CERN. Wrote blog post on the tram to the hotel.
  • 2300: Went back to work in my hotel room.
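A rough decay-length estimate, as promised in the 1200 entry above (a generic formula with illustrative numbers of my own choosing, not tied to any particular model): a particle with rest-frame lifetime τ and relativistic boost βγ travels an average distance

$$
L \;=\; \beta\gamma\, c\, \tau
$$

before decaying. For example, τ = 10⁻¹⁰ s and βγ ≈ 3 give L ≈ 3 × (3×10⁸ m/s) × 10⁻¹⁰ s ≈ 9 cm, well within an LHC detector; lifetimes a few orders of magnitude shorter or longer cover the full millimeter-to-meters range mentioned above.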

Day 1 was similarly busy and informative, but had the added feature that I hadn’t slept since the previous day. (I never seem to sleep on overnight flights.) Day 3 is likely to be as busy as Day 2. I’ll be leaving Geneva before dawn on Day 4, heading to a conference.

It’s a hectic schedule, but I’m learning many things!  And if I can help make these huge and crucial experiments more powerful, and give my colleagues a greater chance of a discovery and a reduced chance of missing one, it will all be worth it.

Off to CERN

After a couple of months of hard work on grant writing, career plans and scientific research, I’ve made it back to my blogging keyboard.  I’m on my way to Switzerland for a couple of weeks in Europe, spending much of the time at the CERN laboratory. CERN, of course, is the host of the Large Hadron Collider [LHC], where the Higgs particle was discovered in 2012. I’ll be consulting with my experimentalist and theorist colleagues there… I have many questions for them. And I hope they’ll have many questions for me too, both ones I can answer and others that will force me to go off and think for a while.

You may recall that the LHC was turned off (as planned) in early 2013 for repairs and an upgrade. Run 2 of the LHC will start next year, with protons colliding at an energy of around 13 TeV per collision. This is larger than in Run 1, which saw 7 TeV per collision in 2011 and 8 TeV in 2012.  This increases the probability that a proton-proton collision will make a Higgs particle, which has a mass of 125 GeV/c², by about a factor of 2 ½.  (Don’t try to figure that out in your head; the calculation requires detailed knowledge of what’s inside a proton.) The number of proton-proton collisions per second will also be larger in Run 2 than in Run 1, though not immediately. In fact I would not be surprised if 2015 is mostly spent addressing unexpected challenges. But Run 1 was a classic: a small pilot run in 2010 led to rapid advances in 2011 and performance beyond expectations in 2012. It’s quite common for these machines to underperform at first, because of unforeseen issues, and outperform in the long run, as those issues are solved and human ingenuity has time to play a role. All of which is merely to say that I would view any really useful results in 2015 as a bonus; my focus is on 2016-2018.

Isn’t it a bit early to be thinking about 2016? No, now is the time to be thinking about 2016 triggering challenges for certain types of difficult-to-observe phenomena. These include exotic, unexpected decays of the Higgs particle; other hard-to-observe types of Higgs particles that might exist and be lurking in the LHC’s data; rare decays of the W and Z particles; and, more generally, anything that involves a particle whose (rest) mass is in the 100 GeV/c² range, and whose mass-energy is therefore less than a percent of the overall proton-proton collision energy. The higher the collision energy grows, the harder it becomes to study relatively low-energy processes, even though we make more of them. To be able to examine them thoroughly and potentially discover something out of place — something that could reveal a secret worth even more than the Higgs particle itself — we have to become more and more clever, open-minded and vigilant.
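The “less than a percent” statement is simple arithmetic: for a particle of rest mass around 100 GeV/c² produced in a 13 TeV collision,

$$
\frac{m\,c^2}{E_{\text{collision}}} \;\approx\; \frac{100\ \mathrm{GeV}}{13\,000\ \mathrm{GeV}} \;\approx\; 0.8\%,
$$

so the process of interest carries under one percent of the available collision energy.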

Auroras — Quantum Physics in the Sky — Tonight?

Maybe. If we collectively, and you personally, are lucky, then you might see auroras — quantum physics in the sky — tonight.

Before I tell you about the science, I’m going to tell you where to get accurate information, and where not to get it; and then I’m going to give you a rough idea of what auroras are. It will be rough both because the full story is complicated and would take more time than I have today, and because auroras are still only partly understood.

Bad Information

First though — as usual, do NOT get your information from the mainstream media, or even the media that ought to be scientifically literate but isn’t. I’ve seen a ton of misinformation already about timing, location, and where to look. For instance, here’s a map from AccuWeather, telling you who is likely to be able to see the auroras.

Don’t believe this map by AccuWeather. Oh, sure, they know something about clouds. But auroras, not much.

See that line, below which it says “not visible”? This implies that there’s a nice sharp geographical line between those who can’t possibly see it and those who will definitely see it if the sky is clear. Nothing could be further from the truth. No one knows where that line will lie tonight, and besides, it won’t be a nice smooth curve. There could be auroras visible in New Mexico, and none in Maine… not because it’s cloudy, but because the start time of the aurora can’t be predicted, and because its strength and location will change over time. If you’re north of that line, you may see nothing, and if you’re south of it you still might see something.  (AccuWeather also says that you’ll see it first in the northeast and then in the midwest.  Not necessarily.  It may become visible across the U.S. all at the same time.  Or it may be seen out west but not in the east, or vice versa.)

Auroras aren’t like solar or lunar eclipses, absolutely predictable as to when they’ll happen and who can see them. They aren’t even like comets, which behave unpredictably but at least have predictable orbits. (Remember Comet ISON? It arrived exactly when expected, but evaporated and disintegrated under the Sun’s intense stare.) Auroras are more like weather — and predictions of auroras are more like predictions of rain, only in some ways worse. An aurora is a dynamic, ever-changing phenomenon, and to predict where and when it can be seen is not much more than educated guesswork. No prediction of an aurora sighting is EVER a guarantee. Nor is the absence of an aurora prediction a guarantee one can’t be seen; occasionally they appear unexpectedly.  That said, the best chance of seeing one further away from the poles than usual is a couple of days after a major solar flare — and we had one a couple of days ago.

Good Information and How to Use it

If you want accurate information about auroras, you want to get it from the Space Weather Prediction Center; click here for their main webpage. Look at the colorful graph on the lower left of that webpage, the “Satellite Environment Plot”. Here’s an example of that plot, taken from earlier today:

The “Satellite Environment Plot” from earlier today; focus your attention on the two lower charts, the one with the red and blue wiggly lines (GOES Hp) and on the one with the bars (Kp Index). How to use them is explained in the text.

There’s a LOT of data on that plot, but for lack of time let me cut to the chase. The most important information is on the bottom two charts. Continue reading
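For those who prefer to pull the numbers directly rather than read them off the plot, here is a minimal sketch (my own illustration, not an SWPC tool; it assumes NOAA’s public JSON feed at the URL below is live, and that its first row is a header naming a “time_tag” column and a “Kp” column) that fetches the latest planetary Kp index, the quantity shown in the bar chart at the bottom of the plot:

```python
# Fetch the most recent planetary Kp index from NOAA's Space Weather
# Prediction Center. (Assumed feed layout: a JSON array whose first row
# is a header such as ["time_tag", "Kp", ...] and whose later rows are data.)
import json
import urllib.request

URL = "https://services.swpc.noaa.gov/products/noaa-planetary-k-index.json"

def latest_kp():
    with urllib.request.urlopen(URL, timeout=10) as resp:
        rows = json.load(resp)
    header, data = rows[0], rows[1:]
    t_col = header.index("time_tag")
    kp_col = next(i for i, name in enumerate(header) if name.lower().startswith("kp"))
    latest = data[-1]
    return latest[t_col], float(latest[kp_col])

if __name__ == "__main__":
    when, kp = latest_kp()
    # Rule of thumb: Kp of 5 or more signals a geomagnetic storm, and the
    # larger Kp is, the farther from the poles auroras may become visible.
    print(f"{when}: Kp = {kp:.1f}")
```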

BICEP2’s Cosmic Polarization: Published, Reduced in Strength

I’m busy dealing with the challenges of being in a quantum superposition, but you’ve probably heard: BICEP2’s paper is now published, with some of its implicit and explicit claims watered down after external and internal review. The bottom line is as I discussed a few weeks ago when I described the criticism of the interpretation of their work (see also here).

  • There is relatively little doubt (but it still requires confirmation by another experiment!) that BICEP2 has observed interesting polarization of the cosmic microwave background (specifically: B-mode polarization that is not from gravitational lensing of E-mode polarization; see here for more about what BICEP2 measured)
  • But no one, including BICEP2, can say for sure whether it is due to ancient gravitational waves from cosmic inflation, or to polarized dust in the galaxy, or to a mix of the two; and the BICEP2 folks are explicitly less certain about this, in the current version of their paper, than in their original implicit and explicit statements.

And we won’t know whether it’s all just dust until there’s more data, which should start to show up in coming months, from BICEP2 itself, from Planck, and from other sources. However, be warned: the measurements of the very faint dust that might be present in BICEP2’s region of the sky are extremely difficult, and the new data might not be immediately convincing. To come to a consensus might take a few years rather than a few months.  Be patient; the process of science, being self-correcting, will eventually get it straight, but not if you rush it.

Sorry I haven’t time to say more right now.

Modern Physics: Increasingly Vacuous

One of the concepts that’s playing a big role in contemporary discussions of the laws of nature is the notion of “vacua”, the plural of the word “vacuum”. I’ve just completed an article about what vacua are, and what it means for a universe to have multiple vacua, or for a theory that purports to describe a universe to predict that it has multiple vacua. In case you don’t want to plunge right in to that article, here’s a brief summary of why this is interesting and important.

Outside of physics, most people think of a vacuum as being the absence of air. For physicists thinking about the laws of nature, “vacuum” means space that has been emptied of everything — at least, emptied of everything that can actually be removed. That certainly means removing all particles from it. But even though vacuum implies emptiness, it turns out that empty space isn’t really that empty. There are always fields in that space, fields like the electric and magnetic fields, the electron field, the quark field, the Higgs field. And those fields are always up to something.

First, all of the fields are subject to “quantum fluctuations” — a sort of unstoppable jitter that nothing in our quantum world can avoid.  [Sometimes these fluctuations are referred to as “virtual particles”; but despite the name, those aren’t particles.  Real particles are well-behaved, long-lived ripples in those fields; fluctuations are much more random.] These fluctuations are always present, in any form of empty space.

Second, and more important for our current discussion, some of the fields may have average values that aren’t zero. [In our own familiar form of empty space, the Higgs field has a non-zero average value, one that causes many of the known elementary particles to acquire a mass (i.e. a rest mass).] And it’s because of this that the notion of vacuum can have a plural: forms of empty space can differ, even for a single universe, if the fields of that universe can take different possible average values in empty space. If a given universe can have more than one form of empty space, we say that “it has more than one vacuum”.
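A toy illustration of what “more than one vacuum” means (a cartoon of my own, not the Standard Model’s actual equations): consider a universe with a single field φ whose potential energy is

$$
V(\phi) \;=\; \lambda\left(\phi^2 - v^2\right)^2 + \epsilon\,\phi, \qquad 0 < \epsilon \ll \lambda v^3 .
$$

This potential has two local minima, near φ ≈ +v and φ ≈ −v. In each one the field sits at a constant, nonzero average value, so each is a possible form of empty space for this universe: two vacua. The small tilt ε makes their energies differ, so the vacuum near φ ≈ −v is the true, lowest-energy one, while the other is merely metastable. The Standard Model calculation described next is the realistic version of this cartoon, with the Higgs field in the role of φ.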

There are reasons to think our own universe might have more than one form of vacuum — more than just the one we’re familiar with. It is possible that the Standard Model (the equations used to describe all of the known elementary particles, and all the known forces except gravity) is a good description of our world, even up to much higher energies than our current particle physics experiments can probe. Physicists can predict, using those equations, how many forms of empty space our world would have. And their calculations show that our world would have (at least) two vacua: the one we know, along with a second, exotic one, with a much larger average value for the Higgs field. (Remember, this prediction is based on the assumption that the Standard Model’s equations apply in the first place.)  An electron in that exotic form of empty space would have a much larger mass than the electrons we know and love (and need!).

The future of the universe, and our understanding of how the universe came to be, might crucially depend on this second, exotic vacuum. Today’s article sets the stage for future articles, which will provide an explanation of why the vacua of the universe play such a central role in our understanding of nature at its most elemental.