Some news on the Higgs particle from the ATLAS and CMS experiments, the two general purpose experiments at the Large Hadron Collider. I just mention a few highlights.
First, you may recall a tempest in a teapot that erupted in late 2012, when ATLAS’s two measurements of the Higgs particle’s mass disagreed with each other by more than one would normally expect. This generated some discussion over coffee breaks, and some silly articles in online science magazines, even in Scientific American. But many reasonable scientists presumed that this was likely a combination of a statistical fluke and a small measurement problem of some kind at ATLAS. The new results support this interpretation. Through some hard work that will be described in papers appearing within the next couple of days, ATLAS has greatly improved both measurements; the discrepancy between them, now dominated by statistical uncertainties, has gone down from nearly 3 standard deviations to 2 standard deviations, which certainly isn’t anything to get excited about. Experts will be very impressed by the reduction in the ATLAS systematic uncertainties, achieved through significantly improved energy calibrations for electrons, photons and muons.
Experts: More specifically, the measured mass of the Higgs in its decay to two photons decreased by 0.8 GeV/c², and the systematic uncertainty on the measurement dropped from 0.7 GeV/c² to 0.28 GeV/c². And by the way, the rate for this process is now only 1 standard deviation higher than predicted for the simplest possible type of Higgs (a “Standard Model Higgs”); it was once 2 standard deviations high, which got a lot of attention, but was apparently just a fluke.
Meanwhile, for the decays to two lepton/anti-lepton pairs, the systematic uncertainty has dropped by a factor of ten — truly remarkable — from 0.5 GeV/c² to 0.05 GeV/c². The Higgs mass measurement itself has increased by 0.2 GeV/c².
Second, as reported by the CMS experiment a couple of months ago, the lifetime of the Higgs particle has been constrained, using a clever method developed in papers by Kauer and Passarino, by Campbell, Ellis and Williams, and by Caola and Melnikov. For a “Standard Model Higgs” (the simplest possible type of Higgs particle) that has a mass around 125 GeV/c², the lifetime of the Higgs is predicted to be about 150 trillionths of a trillionth of a second. According to CMS, the lifetime has now been measured to be at
most 6 times that large [oops! at least 1/6th as long a lifetime as predicted — sorry] (though there’s still some debate about how precise that measurement really is). [No, we don’t measure this lifetime with a stopwatch. The clever method involves noting that a Higgs particle with an unexpectedly short lifetime can lead, via a quantum uncertainty principle, to a big increase in the rate for the production of pairs of real Z particles. (Recall that the Higgs particle itself can only decay to one real and one virtual Z particle.)]
Third, both experiments are trying to make measurements of the Higgs particle decaying to tau lepton/anti-lepton pairs and to bottom quark/anti-quark pairs. The measurements aren’t very precise yet, but there’s now strong evidence for the tau decays. And a rough measurement of the Higgs particle’s mass in its tau decays is within a few GeV/c² of the measurements made via the more precise methods mentioned above… so all seems consistent.
51 thoughts on “Some Higgs News from the LHCP Conference”
Is it fair to say that all of these results are solidly consistent with the Standard Model, with no chinks or cracks (yet) showing the possible light of new physics?
No chinks yet.
Did you mean measured lifetime or measured line width? If the measured line width is 6 times that predicted by standard model, there is lot of room for exotic decays of Higgs.
Oops! Thanks for catching that. I did mean: the lifetime is measured to be at least 1/6th as long as predicted, and that the “line width” is at most 6 times as large as predicted.
There’s less room than it looks from this result, based on other measurements. But there’s still quite a bit of room.
So, bottom line, what are the two ATLAS Higgs mass numbers now and what are their total margins of error respectively? (And how do they compare to the other experiment’s numbers?)
The operation is success (no chinks), but the patient (unnaturalness) …. ?
The patient is alive and well, which has the doctors somewhat annoyed because they were hoping there’d be some unexpected complication that would end up killing the patient.
Which I’d think would warrant a malpractice suit, but fortunately strained anthropomorphizing metaphors for abstract concepts don’t have standing in our legal systems. 🙂
“… decays to two lepton-dilepton pairs…” — that should be “two lepton-antilepton pairs”.
As we know, the Higgs decays very quickly. How can we measure its lifetime, and what concept is used to do that?
Qu … to Kauer and Passarino
@zooalnoon: It looks like Matt has not gotten around to answering your question. Here is my answer. Experimentalists do not usually measure lifetime directly. They measure the width of the (resonance) peak associated with the unstable particle. From the uncertainty principle, the lifetime is inversely proportional to the width.
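To make the width/lifetime relation Γ = ħ/τ concrete, here is a minimal sketch in Python. The value of ħ in GeV·s is the standard one; the lifetime is the Standard-Model figure quoted in the post (about 1.5×10⁻²² s); the function name is mine, chosen for illustration.

```python
# Width from lifetime via the uncertainty principle: Gamma = hbar / tau.
HBAR_GEV_S = 6.582119569e-25  # reduced Planck constant in GeV*s

def width_from_lifetime(tau_seconds):
    """Resonance width in GeV for a mean lifetime given in seconds."""
    return HBAR_GEV_S / tau_seconds

# Standard-Model Higgs lifetime quoted in the post:
# roughly 150 trillionths of a trillionth of a second = 1.5e-22 s
tau_higgs = 1.5e-22
gamma_gev = width_from_lifetime(tau_higgs)
print(f"predicted width ~ {gamma_gev * 1e3:.1f} MeV")
```

A width of a few MeV is far too narrow to resolve directly in the detectors, whose mass resolution is of order 1 GeV, which is why the indirect method discussed in the post is needed; note also that a shorter lifetime means a larger width.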
Why are particles like the electron restricted to 0.511 MeV (the energy needed to make one electron)? And why are particles like photons seemingly unrestricted in terms of how much energy it takes to make one?
The electromagnetic spectrum tells me that photons can have any amount of energy, with no quantization in how much energy it takes to make a photon. More specifically, a photon can be made with an arbitrary amount of energy, while an electron takes 0.511 MeV to make; please explain why this is. Also, does this have something to do with the Higgs field? For example, if the Higgs field interacted with the electron field more strongly than it does now, then electrons would have a higher mass, and it would therefore take more energy to make an electron. Again, I would appreciate it if someone would explain this to me.
It’s the rest mass of the electron that is always 0.5 MeV, just like the rest mass of the photon is always 0 eV. You can make an electron with more total energy than that, and it will simply have more kinetic energy/momentum, just like a more energetic photon has more kinetic energy/momentum. The difference is that you always need *at least* 0.5 MeV to make an electron, since an electron can’t have less energy than that, whereas a photon can be made with arbitrarily small energies.
Photon energy is quantized for a given frequency.
However, in physics there is a minimum and maximum possible value for any physical quantity.
The minimum frequency (or equivalently the minimum possible energy) of a photon is not known, but one may speculate that it is not higher than the neutrino rest mass.
The maximum photon frequency (or energy) is speculated to be not too far from the Planck mass (energy).
Great update, but one thing that always bugs me is “trillionths of a trillionth” “or millionths of a billionth” etc. Can you please put a bracketed value next to this quantity that makes sense? 150 trilly-trillionths = 1.5*10^-22, right?
This gets me especially since I never know whether to use the American or British system of millions, billions, etc.
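A quick sanity check of the commenter’s arithmetic, assuming the short scale (one trillion = 10¹²) that the post uses; this snippet is purely illustrative.

```python
# "150 trillionths of a trillionth of a second", short scale:
trillionth = 1e-12
lifetime_seconds = 150 * trillionth * trillionth
print(lifetime_seconds)  # 1.5e-22
```

So yes: 150 trillionths of a trillionth of a second is 1.5×10⁻²² s on the short scale.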
Surely everyone uses one million = 10^6 and one billion = 10^9?
Behold my friend, the long and short scales: http://en.wikipedia.org/wiki/Long_and_short_scales
Sure, but does anyone actually use the long scale in any scientific or mathematical discussion?
People shouldn’t use the long scale, but they do, and that’s the problem. Especially when doing things like ‘trillionth of a trillionth’ to avoid using scientific notation which is exactly the situation where most misinformation can occur.
No. Both usages are found, even in scientific and technical discussion, which is why the terms should be abandoned in favor of scientifically well-defined terms (e.g. PeV, GeV, MeV, keV, eV, meV) for different numbers of electron volts.
Interesting. Is it mainly authors from specific countries that use the long form in scientific publications?
I understand that using correct notations avoids the problem altogether, but it just seems so strange that anyone would use long form. But presumably that’s just because I grew up using short form.
Thank you for explaining that to me. But why does the electron need to have at least 0.5 MeV? What does this have to do with the Higgs field?
Ah. It’s the electron’s coupling to the Higgs field, which has a non-zero equilibrium value, that gives the electron its rest mass. You can think of it as the electron’s potential energy with respect to the omnipresent Higgs field giving it rest mass (the energy of a particle that is not moving). So an electron can’t exist without having 0.5 MeV of energy, any more than a 1 kg rock on top of a 1 km mountain can be there without having ~10 kJ of gravitational potential energy.
Why the electron field couples to the Higgs field with the strength that it has is one of those things the Standard Model doesn’t answer, and rather is a value that has to be determined experimentally and then plugged in.
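As a numerical sanity check on the rock analogy above (the numbers here are mine, using g ≈ 9.8 m/s²):

```python
# Gravitational potential energy E = m*g*h for a 1 kg rock on a 1 km mountain.
m_kg, g_m_s2, h_m = 1.0, 9.8, 1000.0
energy_joules = m_kg * g_m_s2 * h_m
print(f"{energy_joules / 1e3:.1f} kJ")  # about 9.8 kJ
```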
You are right however there is a possibility to make an image of it.
If electrons are all propeller-like entities with the same shape and pitch, and the oscillating energetic Higgs particle field is the energy source that keeps the electron in motion and transforms Higgs particles into different photonic/gluonic shapes, then you have something of a picture.
see for more details: Alternative proposal for the origin of unexpected large B-Modes found in the BICEP2 measurements. https://tudelft.academia.edu/LeoVuyk/Papers
Thank you I understand now
I so enjoy following your blog that I do not read any others.
It’s extremely educational.
But it looks like you are on vacation now, or preparing to be.
Please correct me, if I am wrong.
Your fan, bob-2
One question: I always had heard that one seal of approval of string or M-theory was the (remarkable) predictive power of those mathematical models for properties of (known) particles. I was googling that and not getting a clear answer. Did I confuse predictive power in this case more with quantum theory in general? Of course now since the LHC experiments so far haven’t yielded any indications for a new physics, this might harm the reputation of M-theory or superstring theory to be really predictive. What is your opinion on this? Thank you!
@Margot: My understanding is that the main strength of string theory is that it is the only theory which gives a unified picture of all four interactions. As far as I know, no other presently known theory does that. It is true that it has not yet seen any experimental verification. But I am willing to wait, unlike some people who want to reject string theory for lack of experimental verification as of today! Let us see what Matt has to say about this if and when he gets time to answer this question.
BTW, previously we had discussions on metaphysics, which some readers hated. If you are interested in such discussions, you might send me an e-mail at email@example.com. Since everyone in Nigeria knows about my e-mail address, I do not mind publicly stating it!!
Kashyap, thank you, yes, I’d like to participate. Kashyap, is it true that one can predict the properties of known and unknown particles from the mathematical equations of the unified field theories associated with M-theory (their Lagrangians, etc.)?
So far, all the experimental predictions which have been verified follow from the Standard Model, which is a subset of string theory. So the opponents say that the verifications have nothing to do with string theory. But personally I still think highly of string theory.
Yes, I am aware of that. I mean that supersymmetric particles have not yet been detected…. Good, thank you….
It is untrue that string or M-theory predict the properties of SM particles. There is no rigorous proof that the SM is a subset of string theory; at best it is only a conjecture.
Commenter, are you sure that string theory does not go over to the SM in some appropriate limit (approximation)? Is there a reference for your belief? Also, can you address the other point, that right now there is no other model which gives a unified picture of all interactions? Right or wrong according to current experimental limits!
String theory may go over to the SM in some approximation, since it has a huge number of possible vacua. But there is little or no convincing evidence that string theory correctly recovers the mass and flavor content of the SM (in particular its 20+ free parameters). A model that consistently unifies all interactions and is fully supported by observations is still missing.
Personally I expect a lot more of a theory “which gives unified picture of all the four interactions.” For example, I expect an answer as to the nature of dark energy. I expect revelations concerning the relationship between the gravitational and electromagnetic force (yes, just as Faraday discovered relationships between the electric force and the magnetic force). REVELATIONS! Not; a theory that describes everything that we knew in 1980 – and the truth is it doesn’t do that! By the way, I don’t blame physicists for all the String Theory hype, and I don’t blame them for the recent String Theory backlash, I blame the popular press.
@Commenter and S. Dino: I think this is just the normal way science progresses, through controversies and intense debates. Presently, because of the internet and arXiv, the pace of attacks is much more rapid! Although personally I still have high respect for string theory, I am not losing any sleep over its difficulties!! I am sure something like this happened in the early part of the 20th century when GR and QM emerged.
I don’t lose sleep over String theory either, I simply do not think it is to the 21st century as GR and QM were to the 20th century…no way, but of course time will tell. Part of the problem I think is that currently there are no crises in physics, no Michelson/Morley experiment, no ultraviolet catastrophe – it is as you say the way of “normal science”. But I do have faith that sooner or later, through further experimentation and new observations the crises will come as well as new theories that will, through their power of prediction, blow String theory away.
Let me remind everyone that the topic of this post has nothing to do with String theory.
As for the comment that we do not have an “ultraviolet catastrophe” in current theoretical physics: think about the many unresolved puzzles of the SM, in particular the fine-tuning problem.
Fine-tuning and naturalness are upsetting philosophical issues that everyone would like to find a more satisfying answer to than “Well, that’s the universe we got”. However if such an explanation is not available, we can cope with that, because while *unlikely* there’s nothing about our universe that is *impossible* according to our current understanding. Physics still works, we just wouldn’t know why everything has the value it does. Whatever we may be missing isn’t fundamental to making it work.
The UV catastrophe was the opposite — the observation that according to then-current understanding of physics, everyone should be dead, except they never would have been born, because life never could have formed. Yet here we are. Our physics was clearly broken. The thing that was missing was fundamental to resolving this direct contradiction between theory and reality.
So we have issues today, but it’s a pretty fundamental difference in kind if you ask me. Which, granted, you didn’t.
Fine-tuning is NOT an upsetting philosophical issue, it is a serious foundational problem in the Standard Model. The mass of the Higgs boson is affected by large quantum corrections that can bring it close to the Planck mass, unless there is an unnatural fine-tuning cancellation between quadratic quantum corrections and the bare mass. Without supersymmetry, the problem has no obvious solution.
And what is the consequence of the lack of a solution?
Is that we don’t know why the Higgs mass has the value it does, aside from resorting to the Anthropic Principle?
Or is it that, even taking the Higgs mass as a given, the SM predicts that our universe could not possibly support life like us? Or some other direct contradiction with observation?
That’s the difference.
These issues have absolutely NOTHING to do with life on earth or philosophical considerations you alluded to.
The UV catastrophe led Max Planck to the discovery of quanta. Likewise, the fine-tuning problem may or may not lead to a major paradigm change in high-energy theoretical physics. Only time will tell and it’s not up to you to decide.
No sense for me to continue this pointless conversation…
Look, if all the comparison to the UV catastrophe means to you is “tantalizing mystery that could possibly be the key to a new discovery”, then sure, obviously mysteries exist today. To me, the significance of the UV catastrophe specifically is that it was a case where a direct prediction of then-current theory, in its area of applicability, was face-palmingly incorrect against basic qualitative observations like “life exists”, much less actual empirical measurement. It was not possible that we had a complete understanding of light while that mystery remained.
There is no such empirical failure of the Standard Model. There is no alarmingly wrong prediction, or for that matter any wrong prediction at all. The theory doesn’t predict why certain values are what we measure them to be, and it is of course perfectly reasonable to think this could be the loose thread to pull to discover something new. But what if there is no such explanation? What if there is, but the information that would let us distinguish is not available at this time in our universe? Nobody knows whether that is the case. We might never figure it out, and we might never find an area where that lack of knowledge results in an incorrect prediction.
History will decide whether naturalness or fine-tuning are signs of a lack of understanding. By contrast, at the time it was plainly obvious that the result the theory was giving — the UV catastrophe — had to be wrong and that there was a better explanation. That is the distinction I was making.
The Standard Model (in its original form) incorrectly predicts that neutrinos are massless and do not mix. It fails to preserve unitarity in the scattering of polarized W bosons at sufficiently high momentum. It fails to correctly account for the anomalous magnetic moment of charged leptons.
For many, these are serious signs of trouble. It’s not up to you, or me or anyone else at this point in time to declare that the magnitude of these conceptual problems is not comparable with the situation created by the UV catastrophe.
End of story.
The SM doesn’t predict the mass of any particle. But if you assume a non-zero neutrino mass, then the rest comes out naturally. You can’t just assume the observed black body spectrum and stick that into the old theory of light and have it make sense at all.
The “magnitude” — by which I mean, hopefully uncontroversially, the amount of change to our way of thinking required by the solution to the conundrum at hand — will be for history to decide, once we know the answer.
The difference in nature of the pre-solution conundrum, however, is apparent now and future-history won’t change.
The particle nature of the gauge boson (photon) itself shows there is a conceptual problem. At high frequencies, this “rest mass” type could not exist. String theory, in this context, may look like a kaleidoscope?
Dear Prof. Strassler,
The 125 GeV particle is now widely believed to be the Higgs.
How much progress has been made in understanding why the Higgs has such a small mass?