Day 2 of the Higgs Symposium is flying by, with very interesting presentations… and with little time for me to finish the last details of my own talk for tomorrow. (Tomorrow’s program includes a talk by Professor Peter Higgs himself!) But here’s a quick summary.
In my last post I mentioned a couple of the early talks; here’s a bit more about the later talks from yesterday, and then a bit about the first part of today. Caveat: all descriptions below are brief and necessarily incomplete!
Joe Incandela, spokesman for the CMS experiment at the Large Hadron Collider [LHC], presented the Higgs results for CMS. The data shown had all been seen before, but Incandela provided some public insight into why CMS has not updated its search for Higgs particles decaying to two photons. If I understand correctly what he said, the situation is roughly the following. In the run-up to the November conference in Kyoto, after re-blinding their data and attempting to improve their techniques (including recalibrating their measurements of photon energies), they unblinded and found changes from their July results that were somewhat larger than they naively expected. Concerned they might have made an error, or at least underestimated the uncertainties on their results, they have been going through their methods with a fine-tooth comb, doing many studies. These efforts have convinced them that no mistake was made, and that their new results are indeed consistent with the older ones, given the size of the uncertainties they quoted in July. They're now applying what they've learned to the full 2011–2012 data set, and they aim to present an updated result, along with the studies checking it, in the spring: perhaps as soon as March, perhaps a bit later.
Eilam Gross, co-convener of the Higgs search group at ATLAS, similarly presented results for the ATLAS experiment. ATLAS, you may recall, also had a little surprise in their recent data, which is why they did not update their results in Kyoto for Higgs decaying either to photons or to two lepton/anti-lepton pairs (often called "four leptons" for short). We learned about this surprise in December; the two measurements give values for the Higgs mass that differ by a somewhat surprising amount. ATLAS, like CMS, chose to do many studies to make sure they had made no errors before releasing their data, and similarly they've concluded that their results are error-free. At this point it is widely believed that their mass discrepancy involves an unlikely (but possible) statistical fluke, probably combined with some uncertainties in energy measurements.
An aside: every time in my career that a new particle has been studied for the first time, there has been some funny business in the data early on. This is just something that happens when you have small amounts of data; there are so many weird things that could happen that the probability that at least one of them will happen is higher than you'd think. There's no evidence that there's anything truly odd going on.
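This "at least one weird thing" effect is easy to see with a little arithmetic. Here is a minimal, purely illustrative Python sketch (not from any of the talks; the 1% fluke probability is an assumed number, just for illustration): even if each individual measurement has only a small chance of showing a strange fluctuation, the chance that *some* measurement among many does so grows surprisingly fast.

```python
def prob_at_least_one(p, n):
    """Probability that at least one of n independent measurements
    shows a fluke, if each has probability p of doing so."""
    return 1.0 - (1.0 - p) ** n

# Assumed, illustrative numbers: a 1% fluke chance per measurement.
p = 0.01
for n in (1, 10, 50, 100):
    print(f"{n:3d} measurements: {prob_at_least_one(p, n):.3f}")
```

With 100 independent measurements, each carrying only a 1% chance of a fluke, the probability that at least one of them looks odd is well over 60%, which is why a few early oddities in a new particle's data are to be expected rather than feared.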
Riccardo Rattazzi (professor at EPFL in Lausanne, who has shown up on this blog a couple of times before, here and here) then gave a beautiful talk about the possibility that the Higgs particle is a composite object, the way the proton is a composite object made from smaller things. This possibility is now highly constrained, but not yet ruled out; for it to work presumably requires that the matter particles of the world (the quarks and leptons) are partly composite (meaning they are mixtures of elementary and composite particles). A generic feature of these theories is that there is at least one "top partner", a particle resembling the top quark but heavier, with a mass below about 1 TeV. Searches at the LHC are now pushing the masses of such particles into the 600–700 GeV range; the limits will climb a bit higher using the 2012 data, but ruling out the full reasonable range of possibilities will require waiting for the next proton-proton collision run of the LHC, beginning in 2015.
Sir Michael Atiyah, one of the world’s great mathematicians, whose work has had enormous influence in physics, gave a talk about the relationships between Higgs phenomena and solitons — in particular, magnetic monopoles, instantons and Skyrmions. I won’t go into these interesting objects here, as it would require a long set of articles, but the talk highlighted the role of fields like the Higgs field in both physics and mathematics of the past forty years.
That was yesterday. As for today…
Howie Haber (professor at the University of California Santa Cruz) gave a fantastic talk about the possibilities, which he has studied actively over his career, that there is more than one Higgs field in nature. His talk had so many interesting features that I think I'll devote an entire post to it next week. But the most important element of his talk has to do with what is known as the "decoupling limit", whereby there can easily and naturally be a Higgs particle that resembles, but is not, a Higgs of the simplest type, a so-called "Standard Model Higgs". The existence of this limit, which he and Yossi Nir described in 1989, is why everyone needs to keep in mind that the new particle's resemblance to a Standard Model Higgs is not, by itself, strong evidence that the Standard Model is the correct and complete description of all physical phenomena at the LHC.
Next, Misha Shaposhnikov, one of Rattazzi's colleagues at EPFL, who with Christof Wetterich suggested a scenario that predicts a Standard Model Higgs with a mass roughly in the 123–135 GeV/c² range, gave arguments in favor of his prediction, discussed its implications, and talked about whether it would allow the Higgs to serve as the driver of cosmological inflation (the rapid expansion of the early universe thought to explain why the universe is so uniform and geometrically flat). This also deserves a longer discussion, which I'll try to provide soon; suffice it to say that the theoretical arguments underlying the prediction are open to question, though the calculations that give the prediction are solid. Unfortunately, testing whether this prediction gives the right answer, and whether it does so for a deep reason or just by coincidence, is going to be very difficult for the foreseeable future. I'll go into this later.
All of these talks are being, or will be, posted on-line, so you'll be able to read them on your own if you're sufficiently expert. I'll try to boil some of them down for a wider readership next week.
We’re heading into the afternoon talks, so I’ll stop here.