Greetings from the last day of the conference “Naturalness 2014”, where theorists and experimentalists involved with the Large Hadron Collider [LHC] are discussing one of the most widely-discussed questions in high-energy physics: are the laws of nature in our universe “natural” (= “generic”), and if not, why not? It’s so widely discussed that one of my concerns coming into the conference was whether anyone would have anything new to say that hadn’t already been said many times.
What makes the Standard Model’s equations (which are the equations governing the known particles, including the simplest possible Higgs particle) so “unnatural” (i.e. “non-generic”) is that when one combines the Standard Model with, say, Einstein’s gravity equations, or indeed with any other equations involving additional particles and fields, one finds that the parameters in the equations (such as the strength of the electromagnetic force or the interaction of the electron with the Higgs field) must be chosen so that certain effects almost perfectly cancel, to one part in a gazillion* (something like 10³²). If this cancellation fails, the universe described by these equations looks nothing like the one we know. I’ve discussed this non-genericity in some detail here.
*A gazillion, as defined on this website, is a number so big that it even makes particle physicists and cosmologists flinch. [From Old English, gajillion.]
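For readers who want a rough sense of where a number like 10³² comes from, here is a schematic version of the standard estimate (the symbols and numbers are illustrative choices, not a calculation from the conference): quantum effects tend to push the Higgs field’s mass-energy scale up toward the highest energy scale Λ appearing in the combined equations, so the observed value has to emerge from a near-perfect cancellation between a “bare” term and the quantum corrections,

\[
m_H^2 \;=\; m_{H,\,\mathrm{bare}}^2 \;+\; \delta m_H^2 ,
\qquad
\delta m_H^2 \;\sim\; \frac{\Lambda^2}{16\pi^2} .
\]

If Λ is near the Planck scale, around 10¹⁹ GeV, then δm_H² is of order (10¹⁸ GeV)², while the observed Higgs mass is about 125 GeV; keeping m_H at its observed value then requires the two terms to cancel to roughly (125 GeV)² / (10¹⁸ GeV)², in other words to about one part in 10³² (one part in a gazillion).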
Most theorists who have tried to address the naturalness problem have tried adding new principles, and consequently new particles, to the Standard Model’s equations, so that this extreme cancellation is no longer necessary, or so that the cancellation is automatic, or something to this effect. Their suggestions have included supersymmetry, warped extra dimensions, little Higgs, and so on… but importantly, these examples are only natural if the lightest of the new particles that each predicts has a mass around or below 1 TeV/c², and is therefore directly observable at the LHC (with a few very interesting exceptions, which I’ll talk about some other time). The details are far too complex to go into here, but the constraints from what was not discovered at the LHC in 2011-2012 imply that most of these examples don’t work perfectly. Some partial non-automatic cancellation, not at one part in a gazillion but at one part in 100, seems to be necessary for almost all of the suggestions made up to now.
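To see roughly where “one part in 100” comes from, here is an illustrative estimate in the supersymmetric case (the numbers and symbols are my own illustrative choices, not figures quoted at the conference): the largest correction to the Higgs mass-squared comes from the top quark’s superpartner, the stop, and is of order

\[
\delta m_H^2 \;\sim\; \frac{3\, y_t^2}{4\pi^2}\; m_{\tilde t}^{\,2}\, \ln\!\frac{\Lambda}{m_{\tilde t}} ,
\]

where y_t ≈ 1 is the top quark’s interaction strength with the Higgs field and m_t̃ is the stop mass. Taking m_t̃ around 1 TeV for illustration, and a logarithm of order 10, this correction is of order (900 GeV)², which must cancel against other contributions down to the observed (125 GeV)²: a residual cancellation at the level of one part in 50-100, in the ballpark of the “one part in 100” just mentioned.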
So what are we to think of this?
- Maybe one of the few examples that is entirely natural and is still consistent with current data is correct, and will turn up at the LHC in 2015 or 2016 or so, when the LHC begins running at higher energy per collision than was available in 2011-2012.
- Maybe one of the examples that isn’t entirely natural is correct. After all, one part in 100 isn’t awful to contemplate, unlike one part in a gazillion. We do know of other weird, improbable things about the world, such as the fact that the Sun and the Moon appear almost exactly the same size in the Earth’s sky (the Sun is roughly 400 times wider than the Moon, but also roughly 400 times farther away). So maybe our universe is slightly non-generic, and therefore discoveries of new particles that we might have expected to see in 2011-2012 are going to be delayed until 2015 or beyond.
- Maybe naturalness is simply not a good guide to guessing our universe’s laws, perhaps because the universe’s history, or its structure, forced it to be extremely non-generic, or perhaps because the universe as a whole is generic but huge and variegated (this is often called a “multiverse”, but be careful, because that word is used in several very different ways — see here for discussion) and we can only live in an extremely non-generic part of it.
- Maybe naturalness is not a good guide because there’s something wrong with the naturalness argument itself, perhaps because quantum field theory (on which the argument rests) or some other essential assumption is breaking down.
Among the most important questions at this conference are these: How can we determine experimentally which of these possibilities is correct (or whether another we haven’t thought of is correct)? In this regard, what measurements do we need to make at the LHC in 2015 and beyond? What theoretical directions concerning naturalness have been underexplored, and might any of them suggest new measurements at the LHC (or elsewhere) that have not yet been attempted?
I am afraid my time is too limited to report on highlights. Most of the progress reported at this conference has been incremental rather than a major step forward; no big new solutions to the naturalness problem were proposed. But it has been a good opportunity for an exchange of ideas among theorists and experimentalists, with a number of new approaches to LHC measurements being presented and discussed, and with some interesting conversation regarding the theoretical and conceptual issues surrounding naturalness, selection bias (sometimes called “anthropics”), and the behavior of quantum field theory.