Tag Archives: particle physics

The New York Times Remembers A Great Physicist

The untimely and sudden deaths of Steve Gubser and Ann Nelson, two of the United States’ greatest talents in the theoretical physics of particles, fields and strings, have cast a pall over my summer and that of many of my colleagues.

I have not been finding it easy to write a proper memorial post for Ann, who was by turns my teacher, mentor, co-author, and faculty colleague.  I would hope to convey to those who never met her what an extraordinary scientist and person she was, but my spotty memory banks aren’t helping. Eventually I’ll get it done, I’m sure.

(Meanwhile I am afraid I cannot write something similar for Steve, as I really didn’t know him all that well. I hope someone who knew him better will write about his astonishing capabilities and his unique personality, and I’d be more than happy to link to it from here.)

In this context, I’m gratified to see that the New York Times has given Ann a substantive obituary (https://www.nytimes.com/2019/08/26/science/ann-nelson-dies.html), which I’m told also appears in the August 28th print edition. It contains a striking (but, to those of us who knew her, not surprising) quotation from Howard Georgi.  Georgi is a professor at Harvard who is justifiably famous as the co-inventor, with Nobel-winner Sheldon Glashow, of Grand Unified Theories (in which the electromagnetic, weak nuclear, and strong nuclear forces all emerge from a single force). He describes Ann, his former student, as being able to best him at his own game.

  • “I have had many fabulous students who are better than I am at many things. Ann was the only student I ever had who was better than I am at what I do best, and I learned more from her than she learned from me.”

He’s being a little modest, perhaps. But not much. There’s no question that Ann was an all-star.

And for that reason, I do have to complain about one thing in the Times obituary. It says “Dr. Nelson stood out in the world of physics not only because she was a woman, but also because of her brilliance.”

Really, NYTimes, really?!?

Any scientist who knew Ann would have said this instead: that Professor Nelson stood out in the world of physics for exceptional brilliance — lightning-fast, sharp, creative and careful, in the same league as humanity’s finest thinkers — and for remarkable character — kind, thoughtful, even-keeled, rigorous, funny, quirky, dogged, supportive, generous. Like most of us, Professor Nelson had a gender, too, which was female. There are dozens of female theoretical physicists in the United States; they are a too-small minority, but they aren’t rare. By contrast, a physicist and person like Ann Nelson, of any gender? They are extremely few in number across the entire planet, and they certainly do stand out.

But with that off my chest, I have no other complaints. (Well, admittedly the physics in the obit is rather garbled, but we can get that straight another time.) Mainly I am grateful that the Times gave Ann fitting public recognition, something that she did not actively seek in life. Her death is an enormous loss for theoretical physics, for many theoretical physicists, and of course for many other people. I join all my colleagues in extending my condolences to her husband, our friend and colleague David B. Kaplan, and to the rest of her family.

A Catastrophic Weekend for Theoretical High Energy Physics

It is beyond belief that not only am I again writing a post about the premature death of a colleague whom I have known for decades, but that I am doing it about two of them.

Over the past weekend, two of the world’s most influential and brilliant theoretical high-energy physicists — Steve Gubser of Princeton University and Ann Nelson of the University of Washington — fell to their deaths in separate mountain accidents, one in the Alps and one in the Cascades.

Theoretical high energy physics is a small community, and within the United States itself the community is tiny.  Ann and Steve were both justifiably famous and highly respected as exceptionally bright lights in their areas of research. Even for those who had not met them personally, this is a stunning and irreplaceable loss of talent and of knowledge.

But most of us did know them personally.  For me, and for others with a personal connection to them, the news is devastating and tragic. I encountered Steve when he was a student and I was a postdoc in the Princeton area, and later helped bring him into a social group where he met his future wife (a great scientist in her own right, and a friend of mine going back decades).  As for Ann, she was one of my teachers at Stanford in graduate school, then my senior colleague on four long scientific papers, and then my colleague (along with her husband David B. Kaplan) for five years at the University of Washington, where she had the office next to mine. I cannot express what a privilege it always was to work with her, learn from her, and laugh with her.

I don’t have the heart or energy right now to write more about this, but I will try to do so at a later time. Right now I join their spouses and families, and my colleagues, in mourning.

A Broad Search for Fast Hidden Particles

A few days ago I wrote a quick summary of a project that we just completed (and you may find it helpful to read that post first). In this project, we looked for new particles at the Large Hadron Collider (LHC) in a novel way, in two senses. Today I’m going to explain what we did, why we did it, and what was unconventional about our search strategy.

The first half of this post will be appropriate for any reader who has been following particle physics as a spectator sport, or in some similar vein. In the second half, I’ll add some comments for my expert colleagues that may be useful in understanding and appreciating some of our results.  [If you just want to read the comments for experts, jump here.]

Why did we do this?

Motivation first. Why, as theorists, would we attempt to take on the role of our experimental colleagues — to try on our own to analyze the extremely complex and challenging data from the LHC? We’re by no means experts in data analysis, and we were very slow at it. And on top of that, we only had access to 1% of the data that CMS has collected. Isn’t it obvious that there is no chance whatsoever of finding something new with just 1% of the data, since the experimenters have had years to look through much larger data sets?

Breaking a Little New Ground at the Large Hadron Collider

Today, a small but intrepid band of theoretical particle physicists (professor Jesse Thaler of MIT, postdocs Yotam Soreq and Wei Xue of CERN, Harvard Ph.D. student Cari Cesarotti, and myself) put out a paper that is unconventional in two senses. First, we looked for new particles at the Large Hadron Collider in a way that hasn’t been done before, at least in public. And second, we looked for new particles at the Large Hadron Collider in a way that hasn’t been done before, at least in public.

And no, there’s no error in the previous paragraph.

1) We used a small amount of actual data from the CMS experiment, even though we’re not ourselves members of the CMS experiment, to do a search for a new particle. Both ATLAS and CMS, the two large multipurpose experimental detectors at the Large Hadron Collider [LHC], have made a small fraction of their proton-proton collision data public, through a website called the CERN Open Data Portal. Some experts, including my co-authors Thaler, Xue and their colleagues, have used this data (and the simulations that accompany it) to do a variety of important studies involving known particles and their properties. [Here’s a blog post by Thaler concerning Open Data and its importance from his perspective.] But our new study is the first to look for signs of a new particle in this public data. While our chances of finding anything were low, we had a larger goal: to see whether Open Data could be used for such searches. We hope our paper provides some evidence that Open Data offers a reasonable path for preserving priceless LHC data, allowing it to be used as an archive by physicists of the post-LHC era.

2) Since only a tiny fraction of CMS’s data was available to us, about 1% by some count, how could we have done anything useful compared to what the LHC experts have already done? Well, that’s why we examined the data in a slightly unconventional way (one of several methods that I’ve advocated for many years, but which has not been used in any public study). This allowed us to explore some ground that no one had yet swept clean, and even gave us a tiny chance of an actual discovery! But the larger scientific goal, absent a discovery, was to prove the value of this unconventional strategy, in hopes that the experts at CMS and ATLAS will use it (and others like it) in the future. Their chance of discovering something new, using their full data set, is vastly greater than ours ever was.
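For readers who want to see why 1% of the data costs so much, the statistics can be sketched in a few lines. This is a toy calculation with invented event counts (none of these numbers come from our paper): in a simple counting experiment, discovery significance grows only like the square root of the dataset size, so shrinking the data by a factor of 100 shrinks the significance by a factor of 10.

```python
import math

def significance(s, b):
    # crude estimate of discovery significance for s signal events
    # on top of b expected background events (valid when b is large)
    return s / math.sqrt(b)

# Invented benchmark: a new particle producing 500 signal events over
# 10,000 background events in the full dataset would be an unmistakable
# 5-sigma discovery...
z_full = significance(500, 10_000)   # 5.0
# ...but with only 1% of the data, both counts shrink 100-fold, and the
# significance drops by sqrt(100) = 10, to a meaningless half a sigma.
z_open = significance(5, 100)        # 0.5
```

This is why our chances of an outright discovery were always low, and why the real payoff lies in sweeping ground that the full-data searches haven’t covered at all.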

Now don’t all go rushing off to download and analyze terabytes of CMS Open Data; you’d better know what you’re getting into first. It’s worthwhile, but it’s not easy going. LHC data is extremely complicated, and until this project I’ve always been skeptical that it could be released in a form that anyone outside the experimental collaborations could use. Downloading the data and turning it into a manageable form is itself a major task. Then, while studying it, there are an enormous number of mistakes that you can make (and we made quite a few of them) and you’d better know how to make lots of cross-checks to find your mistakes (which, fortunately, we did know; we hope we found all of them!) The CMS personnel in charge of the Open Data project were enormously helpful to us, and we’re very grateful to them; but since the project is new, there were inevitable wrinkles which had to be worked around. And you’d better have some friends among the experimentalists who can give you advice when you get stuck, or point out aspects of your results that don’t look quite right. [Our thanks to them!]

All in all, this project took us two years! Well, honestly, it should have taken half that time — but it couldn’t have taken much less than that, with all we had to learn. So trying to use Open Data from an LHC experiment is not something you do in your idle free time.

Nevertheless, I feel it was worth it. At a personal level, I learned a great deal more about how experimental analyses are carried out at CMS, and by extension, at the LHC more generally. And more importantly, we were able to show what we’d hoped to show: that there are still tremendous opportunities for discovery at the LHC, through the use of (even slightly) unconventional model-independent analyses. It’s a big world to explore, and we took only a small step in the easiest direction, but perhaps our efforts will encourage others to take bigger and more challenging ones.

For those readers with greater interest in our work, I’ll put out more details in two blog posts over the next few days: one about what we looked for and how, and one about our views regarding the value of open data from the LHC, not only for our project but for the field of particle physics as a whole.

Pop went the Weasel, but Vroom goes the LHC

At the end of April, as reported hysterically in the press, the Large Hadron Collider was shut down and set back an entire week by a “fouine”, an animal famous for chewing through wires in cars, and apparently in colliders too. What a rotten little weasel! Especially for its skill in managing to get the English-language press to blame the wrong species: a fouine is actually a beech marten, not a weasel, and I’m told it goes Bzzzt, not Pop. But who’s counting?

Particle physicists are counting. Last week the particle accelerator operated so well that it generated almost half as many collisions as were produced in all of 2015 (from July until the end of November), bringing the 2016 total to about three-fourths of 2015’s.

 

The key question is how many of the next few weeks will be like this past one.  We’d be happy with three out of five, even two.  If the amount of 2016 data can significantly exceed that of 2015 by July 15th, as now seems likely, a definitive answer to the question on everyone’s mind (namely, what is the bump on that plot?!? a new particle? or just a statistical fluke?) might be available at the time of the early August ICHEP conference.
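The fluke-versus-particle question is, at bottom, a question about Poisson statistics: how often would the background alone fluctuate up to what was observed? A minimal sketch with invented numbers (not the actual event counts behind the famous bump):

```python
import math

def poisson_pvalue(n_obs, b):
    # probability that a background expectation of b events fluctuates
    # up to n_obs or more: P(N >= n_obs) for N ~ Poisson(b)
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k)
                     for k in range(n_obs))

# Invented example: 12 events observed in a mass bin where 5 are expected.
p_local = poisson_pvalue(12, 5.0)   # about 0.005 for this single bin
```

A half-percent fluke sounds impressive, but a search that scans hundreds of mass bins will produce such local excesses by chance alone; that “look-elsewhere” correction is one reason experimenters wait for more data before celebrating.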

So it’s looking more likely that we’re going to have an interesting August… though it’s not at all clear yet whether we’ll get great news (in which case we get no summer vacation), bad news (in which case we’ll all need a vacation), or ambiguous news (in which case we wait a few additional months for yet more news.)

First Big Results from LHC at 13 TeV

A few weeks ago, the Large Hadron Collider [LHC] ended its 2015 data taking of 13 TeV proton-proton collisions.  This month we’re getting our first look at the data.

Already the ATLAS experiment has put out two results that make significant and impressive contributions to human knowledge.  CMS has one as well (sorry to have overlooked it the first time, but it isn’t posted on the usual Twiki page for some reason.)

The LHC restarts — in a manner of speaking

As many of you will have already read, the Large Hadron Collider [LHC], located at the CERN laboratory in Geneva, Switzerland, has “restarted”. Well, a restart of such a machine, after two years of upgrades, is not a simple matter, and perhaps we should say that the LHC has “begun to restart”. The process of bringing the machine up to speed begins with one weak beam of protons at a time — with no collisions, and with energy per proton at less than 15% of where the beams were back in 2012. That’s all that has happened so far.

If that all checks out, then the LHC operators will start trying to accelerate a beam to higher energy — eventually to record energy, 40% more than in 2012, when the LHC was last operating.  This is the real test of the upgrade; the thousands of magnets all have to work perfectly. If that all checks out, then two beams will be put in at the same time, one going clockwise and the other counterclockwise. Only then, if that all works, will the beams be made to collide — and the first few collisions of protons will result. After that, the number of collisions per second will increase, gradually. If everything continues to work, we could see the number of collisions become large enough — approaching 1 billion per second — to be scientifically interesting within a couple of months. I would not expect important scientific results before late summer, at the earliest.

This isn’t to say that the current milestone isn’t important. There could easily have been (and there almost were) magnet problems that could have delayed this event by a couple of months. But delays could also occur over the coming weeks… so let’s not expect too much in 2015. Still, the good news is that once the machine gets rolling, be it in May, June, July or beyond, we have three to four years of data ahead of us, which will offer us many new opportunities for discoveries, anticipated and otherwise.

One thing I find interesting and odd is that many of the news articles reported that finding dark matter is the main goal of the newly upgraded LHC. If this is truly the case, then I, and most theoretical physicists I know, didn’t get the memo. After all,

  • dark matter could easily be of a form that the LHC cannot produce, (for example, axions, or particles that interact only gravitationally, or non-particle-like objects)
  • and even if the LHC finds signs of something that behaves like dark matter (i.e. something that, like neutrinos, cannot be directly detected by LHC’s experiments), it will be impossible for the LHC to prove that it actually is dark matter.  Proof will require input from other experiments, and could take decades to obtain.

What’s my own understanding of LHC’s current purpose? Well, based on 25 years of particle physics research and ten years working almost full time on LHC physics, I would say (and I do say, in my public talks) that the coming several-year run of the LHC is for the purpose of

  1. studying the newly discovered Higgs particle in great detail, checking its properties very carefully against the predictions of the “Standard Model” (the equations that describe the known apparently-elementary particles and forces)  to see whether our current understanding of the Higgs field is complete and correct, and
  2. trying to find particles or other phenomena that might resolve the naturalness puzzle of the Standard Model, a puzzle which makes many particle physicists suspicious that we are missing an important part of the story, and
  3. seeking either dark matter particles or particles that may be shown someday to be “associated” with dark matter.

Finding dark matter itself is a worthy goal, but the LHC may simply not be the right machine for the job, and certainly can’t do the job alone.

Why the discrepancy between these two views of LHC’s purpose? One possibility is that since everybody has heard of dark matter, the goal of finding it is easier for scientists to explain to journalists, even though it’s not central.  And in turn, it is easier for journalists to explain this goal to readers who don’t care to know the real situation.  By the time the story goes to press, all the modifiers and nuances uttered by the scientists are gone, and all that remains is “LHC looking for dark matter”.  Well, stay tuned to this blog, and you’ll get a much more accurate story.

Fortunately a much more balanced story did appear in the BBC, due to Pallab Ghosh…, though as usual in Europe, with rather too much supersymmetry and not enough of other approaches to the naturalness problem.   Ghosh also does mention what I described in the italicized part of point 3 above — the possibility of what he calls the “wonderfully evocatively named `dark sector’ ”.  [Mr. Ghosh: back in 2006, well before these ideas were popular, Kathryn Zurek and I named this a “hidden valley”, potentially relevant either for dark matter or the naturalness problem. We like to think this is a much more evocative name.]  A dark sector/hidden valley would involve several types of particles that interact with one another, but interact hardly at all with anything that we and our surroundings are made from.  Typically, one of these types of particles could make up dark matter, but the others would be unsuitable for making dark matter.  So why are these others important?  Because if they are produced at the LHC, they may decay in a fashion that is easy to observe — easier than dark matter itself, which simply exits the LHC experiments without a trace, and can only be inferred from something recoiling against it.   In other words, if such a dark sector [or more generally, a hidden valley of any type] exists, the best targets for LHC’s experiments (and other experiments, such as APEX or SHiP) are often not the stable particles that could form dark matter but their unstable friends and associates.

But this will all be irrelevant if the collider doesn’t work, so… first things first.  Let’s all wish the accelerator physicists success as they gradually bring the newly powerful LHC back into full operation, at a record energy per collision and eventually a record collision rate.

Off to CERN

After a couple of months of hard work on grant writing, career plans and scientific research, I’ve made it back to my blogging keyboard.  I’m on my way to Switzerland for a couple of weeks in Europe, spending much of the time at the CERN laboratory. CERN, of course, is the host of the Large Hadron Collider [LHC], where the Higgs particle was discovered in 2012. I’ll be consulting with my experimentalist and theorist colleagues there… I have many questions for them. And I hope they’ll have many questions for me too, both ones I can answer and others that will force me to go off and think for a while.

You may recall that the LHC was turned off (as planned) in early 2013 for repairs and an upgrade. Run 2 of the LHC will start next year, with protons colliding at an energy of around 13 TeV per collision. This is larger than in Run 1, which saw 7 TeV per collision in 2011 and 8 TeV in 2012.  This increases the probability that a proton-proton collision will make a Higgs particle, which has a mass of 125 GeV/c², by about a factor of 2 ½.  (Don’t try to figure that out in your head; the calculation requires detailed knowledge of what’s inside a proton.) The number of proton-proton collisions per second will also be larger in Run 2 than in Run 1, though not immediately. In fact I would not be surprised if 2015 is mostly spent addressing unexpected challenges. But Run 1 was a classic: a small pilot run in 2010 led to rapid advances in 2011 and performance beyond expectations in 2012. It’s quite common for these machines to underperform at first, because of unforeseen issues, and outperform in the long run, as those issues are solved and human ingenuity has time to play a role. All of which is merely to say that I would view any really useful results in 2015 as a bonus; my focus is on 2016-2018.
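The factor of 2 ½ comes from how the proton’s gluon content is sampled at the two energies: a 125 GeV/c² Higgs probes gluons carrying a smaller fraction of each proton’s energy at 13 TeV than at 8 TeV, and small-fraction gluons are far more plentiful. Here is a toy numerical sketch, assuming a made-up gluon density g(x) proportional to (1-x)^5/x rather than a real fitted parton distribution, so it lands in the right ballpark rather than reproducing the exact number:

```python
import math

def g(x):
    # toy gluon density ~ (1-x)^5 / x; NOT a fitted PDF, just illustrative
    return (1 - x) ** 5 / x

def gg_luminosity(m, sqrt_s, steps=100_000):
    # gluon-gluon parton luminosity at tau = m^2 / s:
    #   L(tau) = integral from tau to 1 of (dx/x) g(x) g(tau/x)
    # integrated in t = ln(x) so the 1/x behavior is handled smoothly
    tau = (m / sqrt_s) ** 2
    t_lo = math.log(tau)
    dt = -t_lo / steps
    total = 0.0
    for i in range(steps):
        x = math.exp(t_lo + (i + 0.5) * dt)  # midpoint rule in ln(x)
        total += g(x) * g(tau / x) * dt
    return total

# Higgs (125 GeV) production via gluon fusion: 13 TeV vs 8 TeV collisions
ratio = gg_luminosity(125, 13_000) / gg_luminosity(125, 8_000)
```

With this crude gluon density the ratio comes out around 3, somewhat above the true factor of about 2 ½; a real calculation folds in fitted parton distributions and higher-order corrections, which is exactly the “detailed knowledge of what’s inside a proton” mentioned above.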

Isn’t it a bit early to be thinking about 2016? No, now is the time to be thinking about the 2016 triggering challenges for certain types of difficult-to-observe phenomena. These include exotic, unexpected decays of the Higgs particle, or other hard-to-observe types of Higgs particles that might exist and be lurking in the LHC’s data, or rare decays of the W and Z particles, and more generally, anything that involves a particle whose (rest) mass is in the 100 GeV/c² range, and whose mass-energy is therefore less than a percent of the overall proton-proton collision energy. The higher the collision energy grows, the harder it becomes to study relatively low-energy processes, even though we make more of them. To be able to examine them thoroughly and potentially discover something out of place — something that could reveal a secret worth even more than the Higgs particle itself — we have to become more and more clever, open-minded and vigilant.