Of Particular Significance

The First Human-Created Higgs-Like Particle: 1988 or 89, at the Tevatron

POSTED BY Matt Strassler

ON 07/27/2012

Yesterday’s quiz question, “When was the first Higgs particle produced by humans?” (where admittedly “Higgs” should have read “Higgs-like”), got many answers, but not the one I think is correct. Here’s what I believe is the answer.

——

[UPDATE: After this post was written, but before it went live, commenter bobathon got the right answer — at 6:30 Eastern, just under the wire! Well done!]

The first human-produced Higgs particle [more precisely, the Higgs-like particle with a mass of about 125 GeV/c², whose discovery was reported earlier this month, and which I’ll refer to as “H”; but I’ve told you why I think it is a Higgs of some sort] was almost certainly created in the United States, at the Fermi National Accelerator Laboratory (Fermilab) outside Chicago. Back in 1988 and 1989, Fermilab’s accelerator, called the Tevatron, created collisions within the then-new CDF experiment, during the often forgotten but very important “Run Zero”. The energy per collision, and the total data collected, were just enough to make it nearly certain that an H particle was created during this run.

Run Zero, though short, was important because it allowed CDF to prove that precision mass measurements were possible at a proton collider.  They made a measurement of the Z particle’s mass that almost rivaled the one made simultaneously at the SLC electron-positron collider.  This surprised nearly everyone. [Unfortunately I was out of town and missed the scene of disbelief, back in 1989, when CDF dropped this bombshell during a conference at SLAC, the SLC’s host laboratory.] Nowadays we take it for granted that the best measurement of the W particle’s mass comes from the Tevatron experiments, and that the Large Hadron Collider [LHC] experiments will measure the H particle’s mass to better than half a percent — but up until Run Zero it was widely assumed to be impossible to make measurements of such quality in the messy environment of collisions that involve protons.

Anyway, it is truly astonishing that we have to go back to 1988-1989 for the first artificially produced Higgs(-like) particle!! I was a first-year graduate student, and had just learned what Higgs particles were; precision measurements of the Z particle were just getting started, and the top quark hadn’t been found yet. It took 23 years to make enough of these Higgs(-like) particles to convince ourselves that they were there, using the power of the CERN laboratory’s Large Hadron Collider [LHC]!

[Perhaps this remarkable history will help you understand why I keep saying that although the LHC experiments haven’t yet found something unexpected in their data, that absolutely doesn’t mean that nothing unexpected is there. What’s new just may be hard to see, waiting to be noticed with more sophisticated methods and/or more data.]

What were the other options?

Several commenters suggested that maybe the first H particle was made at CERN’s SppS collider (the Super Proton Synchrotron running as a proton/anti-proton collider), which was the first proton/anti-proton collider, used to discover the W and Z particles in the early 1980s. The SppS, running at an energy of first 546 GeV and later 630 GeV per collision, was the first that, in principle, really had enough energy to make a 125 GeV/c² particle. But a particle carrying 125/630, or about 20%, of a proton/anti-proton collision’s energy would be very rarely produced. That’s because most quarks, anti-quarks and gluons inside a proton or anti-proton carry small fractions of the energy and momentum of the particle in which they are embedded, so very, very few gluon-gluon collisions inside a 630 GeV proton/anti-proton collision would have an energy as large as 125 GeV. On top of that, the vast majority of gluon-gluon collisions won’t make an H particle even if the gluons’ energy is sufficient. Put these things together, and the SppS collider simply didn’t make nearly enough collisions to create an H particle. (More precisely, there’s a very small probability that it did, perhaps a few percent.)
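
If you want a feel for this parton-energy argument, here is a little toy simulation in Python. It is not a real calculation: the gluon momentum-fraction distribution below is a crude made-up stand-in, roughly proportional to (1-x)^5, rather than the measured parton distribution functions a real estimate would use. Still, it illustrates how sharply the chance of a 125 GeV gluon-gluon collision falls as the collision energy drops from the Tevatron’s 1.8 TeV to the SppS’s 630 GeV.

```python
# Toy illustration only: how often does a pair of gluons carry enough combined
# energy to make a 125 GeV particle?  The momentum fractions are drawn from a
# crude stand-in distribution, ~ (1-x)^5; a real estimate would use measured
# parton distribution functions.
import random

def sample_momentum_fraction():
    # Inverse-transform sampling for a density proportional to (1-x)^5:
    # most gluons carry only a small fraction of their proton's momentum.
    return 1.0 - random.random() ** (1.0 / 6.0)

def fraction_energetic_enough(sqrt_s_gev, mass_gev=125.0, trials=500_000):
    hits = 0
    for _ in range(trials):
        x1 = sample_momentum_fraction()
        x2 = sample_momentum_fraction()
        # Energy available in the gluon-gluon subcollision: sqrt(x1 * x2) * sqrt(s)
        if (x1 * x2) ** 0.5 * sqrt_s_gev >= mass_gev:
            hits += 1
    return hits / trials

for sqrt_s in (630.0, 1800.0):   # SppS vs Tevatron collision energies, in GeV
    frac = fraction_energetic_enough(sqrt_s)
    print(f"sqrt(s) = {sqrt_s:6.0f} GeV: fraction of gluon pairs above 125 GeV ~ {frac:.2%}")
```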

When the Tevatron began running in the mid-to-late 1980s, it produced proton/anti-proton collisions at 1.8 TeV per collision. This factor-of-3 increase in the energy per collision compared to the SppS leads to an increase of about a factor of 50 in the probability of making Higgs particles of mass 125 GeV/c². And in 1988-1989, during Run Zero (which preceded Run Ia, Run Ib and Run II over the ensuing two decades), the Tevatron produced roughly double the number of collisions that the SppS produced during its lifetime. The number of collisions during Run Zero was roughly 300,000,000,000; the probability of making a Higgs particle was a bit more than one per 120,000,000,000. [Technically: the cross-section for Higgs production (including all processes) is about 0.6 picobarns, and the integrated luminosity of Run Zero was 4.4 inverse picobarns, meaning the average number of Higgs particles produced in such a run would be about 2.5.] Here and throughout I will assume the H is produced at or slightly above the rate for a Higgs of the simplest type (as LHC and Tevatron data currently suggest). So it is likely that at least one H particle, and probably 2 or 3, were produced during Run Zero; the probability that none were made is below 10%. Congratulations, CDF! You were almost certainly the first… an honor subject to review if the H turns out to be very different in some way from the simplest type of Higgs (the “Standard Model Higgs”).
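
In case you’d like to check the arithmetic in that bracketed remark, here is a minimal sketch using only the numbers quoted above (a cross-section of roughly 0.6 picobarns and an integrated luminosity of 4.4 inverse picobarns); the chance of producing no H particles at all then follows from Poisson statistics.

```python
# Back-of-the-envelope check of the Run Zero estimate quoted in the text.
import math

cross_section_pb = 0.6         # total Higgs production cross-section, ~0.6 pb (as quoted above)
integrated_lumi_per_pb = 4.4   # Run Zero integrated luminosity, 4.4 pb^-1 (as quoted above)

expected = cross_section_pb * integrated_lumi_per_pb  # average number of H particles produced
prob_none = math.exp(-expected)                       # Poisson probability of producing zero

print(f"Expected H particles in Run Zero: {expected:.1f}")      # 0.6 * 4.4 = 2.6, close to the 2.5 quoted above
print(f"Probability that none were produced: {prob_none:.0%}")  # about 7%, i.e. below 10%
```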

The following run of the Tevatron (Run I, from 1992 to 1996) increased the number of collisions over Run Zero by a factor of 25 or so, and did so in two experiments, CDF and the new D0. So it is certain that many tens of H particles were produced in the early-to-mid 1990s. [Robert Garisto, Andrew Foland and Phil Gibbs (second try) got as far as suggesting Run I, but all forgot about Run Zero.] By the time the LHC started producing 7 TeV collisions in 2010, the Tevatron had produced many thousands of H particles. And yet, because most Higgs(-like) particles decay in a way that is mimicked by other non-Higgs processes, these numbers were not enough for the Tevatron to see a clear H signal above the background and make the discovery. Just double that number would probably have been enough for the Tevatron to do it. Instead, the LHC made something like 400,000 H particles within the ATLAS and CMS experiments, mostly in 2011 and so far in 2012, and that was (just barely) enough.
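
For completeness, the same sort of rough bookkeeping for Run I, using only the factors quoted above (about 25 times the Run Zero data, delivered to two experiments); the precise output shouldn’t be taken too seriously.

```python
# Rough scaling of the Run Zero estimate to Run I, using the factors quoted in the text.
run0_expected = 0.6 * 4.4               # ~2.6 H particles expected at CDF during Run Zero
run1_expected = run0_expected * 25 * 2  # ~25x the data, in two experiments (CDF and D0)

print(f"Rough expected H count in Run I (CDF + D0 combined): {run1_expected:.0f}")
# -> of order a hundred, i.e. "many tens" of H particles by the mid-1990s
```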

One other place where I believe H particles were produced was the LEP electron-positron collider, but the numbers were very small. The rate for the process electron + positron -> Higgs is tiny, because electrons and positrons interact so weakly with the Higgs field (that’s why they’re so lightweight). Furthermore, unless you tune the electron and positron beams so that each carries exactly half the Higgs particle’s mass-energy, you won’t make it often. So even though LEP’s electron-positron collisions reached 125 GeV in the mid-90s, the probability of making a Higgs-like particle at or near that energy is minuscule. Instead, the efficient way to make a Higgs(-like) particle at LEP was via the process electron + positron -> Z particle + H particle, where the much stronger interaction of Z and H particles makes the rate much larger. The probability for this process to occur is reasonably large as long as there are a few GeV more than enough energy to make both a Z and an H; but the mass-energy of the Z is 91 GeV, the mass-energy of the H is 125 GeV, and the maximum energy of the LEP collisions was 209 GeV, obtained in 2000, the last year or so of running the collider. Painful! If they’d been able to go to 220 GeV, they’d have found it.
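
The kinematic near-miss is easy to quantify with the numbers in the previous paragraph:

```python
# How close LEP came: producing a real Z plus a real H requires at least m_Z + m_H
# of collision energy (plus a few GeV of margin for a reasonable rate).
m_Z_gev = 91.0        # Z particle mass-energy
m_H_gev = 125.0       # mass-energy of the Higgs-like particle
lep_max_gev = 209.0   # maximum LEP collision energy, reached in 2000

threshold = m_Z_gev + m_H_gev
print(f"e+ e- -> Z + H threshold: {threshold:.0f} GeV")
print(f"LEP maximum: {lep_max_gev:.0f} GeV (shortfall of {threshold - lep_max_gev:.0f} GeV)")
# -> 216 GeV needed versus 209 GeV available; a machine reaching ~220 GeV would have done it.
```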

Now in quantum mechanics, you can make a virtual Z particle instead of a real one, just with a lower probability, so in principle 209 GeV is enough to make a 125 GeV Higgs and a slightly virtual Z, with a very low rate. So I think (needs double-checking) LEP probably made a few of these particles.  But it was not enough for a discovery. And in any case, these weren’t the first; by then the Tevatron had made a hundred or so H particles.

Then there’s the HERA electron/proton and positron/proton collider, whose main goal was to study the internal structure of the proton. At such a machine the dominant production of Higgs particles occurs via the process electron + up quark -> electron + up quark + Higgs, or electron + up quark -> neutrino + down quark + Higgs (also known as “vector boson fusion”). But again, HERA didn’t have nearly enough collision energy to make many H particles, and it didn’t start running until 1992 anyway. That said, it would be amusing to know how many were made there; there’s an unpublished conference proceeding by Bernd Kniehl that worked out the rates, but I haven’t gone through it yet. I doubt it was more than a handful.

What about the LHC? In 2010, collisions at ATLAS, CMS and LHCb resulted in well over 1000 H particles. The first ones were probably made during the middle of the summer.

Discovery honors belong to Europe’s CERN, and the LHC experiments ATLAS and CMS, with a (probable) honorable mention to the Tevatron for finding some (as yet unconfirmed) evidence. But the Tevatron experiment CDF at Fermilab was almost certainly the host of the first 125 GeV/c² Higgs(-like) particle produced artificially by humans; and CDF and DZero were the only ones to host such particles, except perhaps a small handful, for twenty years. This is perhaps a source of extreme frustration, more than anything else, especially since the Tevatron’s run over the past decade did not produce as many collisions as originally hoped, and since the Tevatron is now closed, leaving U.S. particle physics with no high-energy frontier machine in operation or in the planning stages. Yet maybe it is also a source of consolation to know the particles are there, somewhere, in the decades-old data.

29 Responses

  1. What an absolute sickener for the poor old Tevatron. It does seem daft that the States had no accelerator sensitive enough to do the detective work, and dafter that CERN is now on its own as far as front-line experiments like the Higgs particle search are concerned.
    I have to say your account of these events was breathlessly exciting. I learned about the standard model four or five months ago, and I find just having that map pasted in my head gives me a simple canvas on which you paint brilliantly. I got the bit about weak electron-positron interaction with the Higgs field, which gives me a buzz in my gut to think about, and I’m able to stumble around Z and W particles too. I mean for a bit, not entirely. Amazing. Thanks.

  2. FWIW I did get that there’d have been a handful of Higgs with off-shell Z’s at each LEP II experiment, but I didn’t keep the relevant plots so you can count that as an unconfirmed rumor. That was why I said LEP II was too late.

  3. 1. Could EM radiation be created by the gravitational field?

    2. Are the wavelengths of photons proportional to the size of the universe (bounded space-time)? I.e., lambda ~ 1 / (radius of the universe)

    3. Einstein entangled the universal limit, c, of the speed of light into the physics so as not to violate causality; however, why couldn’t the gravitational field propagate at higher velocities? In other words, even though the gravitational field behaves like a wave, why should it be limited to c?

    It could be higher, much higher; Van Flandern estimated it at a lower limit of 2×10^10 c, orders of magnitude higher than c. For this reason I ask if the gravitational field could be creating EMR, since the gravitons (or gravity particles) are well below Planck’s scale and so could create a structured photon construct.

  4. Accepting this as a fundamental axiom, what in your opinion would be some speculative kind of discovery that would in principle solve the problem of the SM values which are introduced by hand? This is a very fundamental question: how can any discovery fix those values? I predict that no discovery can do this.
    Remember that the M-theory fantasy claimed to solve this and ended with a landscape of 10^500 values!

  5. Thanks for your response, but I am talking fundamentals while you are talking practical pragmatics. I just wanted to be sure that, in principle, ANY molecule must have (theoretical) field-configuration equations, even if they are beyond any solving capacity or any human means even to state them. My aim is to show that the physical is far, far below the rank of the formal.

  6. And I shall abandon my theory that it was Pele in the 1966 world cup. Sigh, another one bites the dust.

  7. Well, I will stick with CERN 2010 – 2012, rather than – maybe – if – assumptions…

    1. I don’t think the Tevatron is a “maybe – if – assumption”. You can calculate that higgses were produced there.

  8. Hi Matt,

    a simple question. Currently they have a significant excess at about 125 GeV. In some other channels, they have some interesting fluctuations, but these are not significant enough to claim an excess. Is it possible that any of these will materialize as a further Higgs state as data accumulate?

  9. Given that I was only 12, I also missed CDF’s measurement of the Z mass. 🙂 My guess is they used muon decays only, which have a clear, linear momentum scaling based on curvature in the magnetic field from lower-pT muons (which could be calibrated from the J/psi) to higher momentum. Electron showers would have too large a systematic (a few GeV at least) when using a calorimeter shower tune based on test-beam electron showers, and the electron tracks radiate a lot of photons so are unreliable for measuring momentum. Anyways, I’d love to read the paper – do you have a link? (It’s not on the arXiv!)

      1. The authorlist fit on 1 page back then! Frightening though how little else has changed in the last 20 years (aside from luminosity – we now take 5/pb in 30 minutes!).
        Thanks for the link. My hunch was right that the best measurement came from Z->mumu, using only J/psi (and Ks) masses to calibrate the scale of the tracks. For electrons, they calibrated based on E/p in W->e events, using the known mass of the W, which is more of a cheat (measuring mZ/mW is less interesting than measuring mZ/m(J/psi)…).

  10. But no one was able to prove it to the required level of statistical certainty till 2012. The philosophical equivalent of ‘if a tree falls in the forest and no one is there to hear it does it make a sound?’.

    1. It’s a bit more like: a maple tree falls in a forest where dozens of pine trees are falling constantly. It’s not that nobody’s listening, but that it is extremely difficult to pick out the particular sound amid the din, especially since you don’t know exactly what it sounds like.

  11. Matt, are you saying that, for example, the event described by Dalitz and Goldstein in Physics Letters B 287 (1992) 225-230 as having a mass of 131 +22 -11 GeV might really be a Higgs instead of a dilepton top-quark candidate event?

    1. There’s no connection. The dilepton T quark candidate has a lepton, an anti-lepton, a bottom quark jet and an anti-bottom quark jet (though the two can’t be distinguished experimentally), and something undetected, presumably neutrinos. A Higgs candidate event would have two bottom quark jets and either no leptons, or one lepton and something undetected, or a lepton-anti-lepton pair forming a Z particle candidate with nothing undetected. Furthermore, the 131 GeV mass obtained for a potential top quark is formed by combining a lepton, a bottom quark jet and a guess about a neutrino, whereas a mass for a Higgs particle comes from combining the bottom quark and antiquark jets. So no, there’s not likely any link between these.

  12. Then can we say that all chemical descriptions of molecular interactions through photons are very, very crude compared to the titanic reality of field interactions? Can we ever hope to write the molecular field equation of a simple but huge molecule, say titin?

    1. Yes, but we approach things with crude methods for a good reason: the fine details don’t affect the result. This is why physicists are known to talk about “spherical cows”; a crude description of an object is often enough to capture its behavior. The complete description of the object may lead to results that are only slightly more accurate and precise but may require years of calculation on a giant computer instead of half a minute on a small piece of paper.

  13. According to MY understanding based on what you said, any particle, being a controlled oscillation in its field’s values, cannot be free / independent / separated from its field. Then what about molecules, say a DNA molecule? Do the photon / electron / quark / gluon… fields of all its atoms interact according to an equation, in principle? So every molecule has its own equation? By that I imagine super-complex equations describing the field configurations for every molecule… Does what I am saying describe any real situation?

    1. The question of how one writes fields for complex objects built from simpler ones is one with a long answer. As objects become more and more complex, writing equations in terms of their fields becomes less and less practically useful, unless the complexities are irrelevant. And also, as special relativity becomes less and less important, the language of fields often becomes less useful. So in principle, yes, one could talk about fields for atoms, molecules and even more complex things, but in most cases of practical interest, doing so isn’t useful. Other methods are better for answering most interesting questions about molecules.

  14. Once a human consciousness thought about a so-called “H” particle, it was created, no matter whether it exists or not; the creation is in the “mind”. This “invention” becomes part of our system and will be very difficult to get rid of.
    Wilhelmus

    1. Well, I guess it seems important to me to distinguish between those things that can only be created in the mind (such as humans that have wings and can see through rocks) and those that can be created in nature. Just speaking as a physicist.

      1. In a reductionist way, scientists decided that the system of particles needed the goddamn particle and the field that goes with it to “create” mass, a kind of new aether; we can also accept that we just do not “yet” know it, and Occam’s razor indicates a simpler “emergent” solution. People with wings that you are mentioning are an old thought of mankind, also needed for explanation in times when mankind was not yet purely physically thinking. The H particle and its field were necessary for scientists to account for the enormous sums of energy and money put into the experiments that would lead to its discovery; perhaps the true reason of mass is an emergent one and not a reductionist one. Wilhelmus

  15. “This surprised nearly everyone. [Unfortunately I was out of town and missed the scene of disbelief, back in 1989, when CDF dropped this bombshell during a conference at SLAC, the SLC’s host laboratory.]”

    This would be an interesting blog entry in its own right and possibly still relevant now 🙂
