An LHC Workshop in London

For me, one of the great pleasures of scientific research is the small, focused “workshop”, where a few scientists (typically 20–40 faculty, postdoctoral researchers and students) assemble for a day or two to discuss a particular topic in detail. Presentations are typically short and often quite informal, and there’s lots of time for discussion built into the schedule. The best workshops, of course, are those where the quality of the science and the knowledge, experience and skill of the participants are exceptional. I was fortunate to enjoy such a workshop yesterday at Imperial College in London, one of Britain’s finest scientific institutions. About half the participants were experimentalists from the CMS experiment at the Large Hadron Collider (LHC), and the other half were theorists like myself (one of whom is also a member of the ATLAS experiment at the LHC).

Another great pleasure in doing science, at least in my field, is its international nature. The participants in yesterday’s workshop included scientists from at least seven countries and three continents, many of whom are now working in a country that is not their native land. Add to this diversity of cultures, appearances and accents some lively personalities, good senses of humor, and high-quality science, and you have the ingredients for a very serious workshop that is also a very serious form of fun, followed by much less serious fun once the workshop is over and everyone who can stick around goes out to a pub, a restaurant, and then another pub.

The workshop was focused on the searches at the CMS experiment for supersymmetry (a speculative but popular suggestion for what might lie beyond the known particles, and what might be the solution to the hierarchy problem). I should add that I haven’t written nearly enough about other possible solutions to the hierarchy problem, mainly for lack of time; they will come eventually. I’ve focused on supersymmetry because it’s been in the news a lot, owing to various overstatements by physicists, appearing on blogs and in various places in the media, claiming that this summer’s results from the Large Hadron Collider show that supersymmetry is basically ruled out. So far, every expert in the subject I have spoken to agrees with me (and I would consider myself only a moderate expert) that this is simply not the case. There are huge loopholes (learn about a few of them here) in the claim that “supersymmetry is virtually ruled out”, loopholes you could drive a supersymmetric truck through, and there’s lots of work left to do if we are to assure ourselves that neither supersymmetry, nor anything like it, is in LHC data.

Discussions at yesterday’s workshop were focused on understanding what has and hasn’t yet been looked for in the search for supersymmetry: where are the largest and most compelling loopholes, and how do we start to close them? Here’s the problem: we have a vast continent of variants of supersymmetry (and of other models that might give similar experimental signals) that we have to search; we have only a few thousand experimentalists, who can form a few dozen search parties; so we need to know: what are the biggest and most fertile unexplored regions, and what are the most efficient search strategies?

The workshop opened with a couple of talks by two experimentalists from CMS, the first addressing the question “what have we done so far?” and the second “what are the issues and concerns for the future?”. The rest of the short talks came from theorists, with very diverse and complementary points of view. I personally found the discussion and exchange of information especially profitable. The challenge now is to convert this exchange of information into a proposed strategy. That requires follow-up and a certain amount of organization, not a simple matter when everyone is so incredibly busy; this will be my focus for today, as I remain in London.

Needless to say, there was some conversation after the workshop about the OPERA experiment and its claim of early-arriving neutrinos. The level of skepticism is extremely high. I’ll post about that separately.

6 responses to “An LHC Workshop in London”

  1. What is the name of the conference, & can you post a link? I couldn’t find it via Google. It’s ~3 a.m. here (2 blocks from Caltech).

    I’ve been “up” for nearly 7 straight days (on the computer, working until I doze off) in search of a (non-obvious) systematic error for OPERA. Between me & a couple of FPGA (Field Programmable Gate Array) specialists (Caltech PhD & xxx) & an Ethernet expert (Honeywell/Advanced Technology Engineering), there could be a time-stamping snafu in the TT (Target Tracker)/FPGA. I.e., more “sigma” (uncertainty) than modeled/stated in the paper, which could invalidate the 6-sigma claim. We’re in touch with OPERA co-authors, trying to determine the accuracy of the 100 MHz clock on the FPGA (10 nsec period, the basis for the time-stamp), synthesized via PLL (Phase-Locked Loop) from the OPERA master clock (derived from a GPS atomic clock). A PLL has these 2 “uncertainties”: static phase offset (what model?) & phase noise (probabilistic model, w/ sigma), NEITHER of which is mentioned in the OPERA paper. This COULD be the source of systematic error: jitter in the PLL-sourced 100 MHz clock (the reference for time-stamping).
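    Note that the two PLL effects act differently: a static phase offset would bias the measured arrival time itself, while random jitter would inflate the quoted uncertainty. To get a feel for how big an unmodeled jitter would have to be to matter, here is a rough back-of-envelope sketch in Python. The numbers are close to those in OPERA’s 2011 preprint (early arrival ~60.7 ns, statistical uncertainty ~6.9 ns, systematic ~7.4 ns); the jitter values are purely hypothetical, and the jitter is treated as a fully correlated, systematic-like term.

    import math

    # Roughly the numbers quoted by OPERA (2011): an early neutrino arrival
    # of ~60.7 ns, with ~6.9 ns statistical and ~7.4 ns systematic
    # uncertainty, which together give close to 6 sigma.
    EARLY_ARRIVAL_NS = 60.7
    STAT_NS = 6.9
    SYS_NS = 7.4

    def significance_with_jitter(jitter_ns):
        """Significance if a hypothetical unmodeled timing jitter of
        jitter_ns were added in quadrature to the quoted uncertainties."""
        total_ns = math.sqrt(STAT_NS**2 + SYS_NS**2 + jitter_ns**2)
        return EARLY_ARRIVAL_NS / total_ns

    for jitter_ns in (0.0, 5.0, 10.0, 20.0):
        print(f"unmodeled jitter {jitter_ns:4.1f} ns -> "
              f"{significance_with_jitter(jitter_ns):.1f} sigma")

    With no extra jitter this reproduces roughly the 6 sigma OPERA quoted; the jitter would have to approach 20 ns (two full clock periods) before the significance fell below 3 sigma, and truly random per-event jitter would largely average out over the thousands of recorded events anyway. That is why the static phase offset, a bias rather than a widening, looks like the more dangerous failure mode.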

    The ECM (Ethernet Controller Module) board contains the FPGA, which is an implementation of the “Network [Ethernet] Architecture for DAQ [Data Acquisition]”. The latter is a new-generation architecture, going beyond the VME (Versa Module Europa, circa the late ’80s!!) based DAQ used in LEP-LHC (1980–2007). I went as far as to check the Japanese Super-Kamiokande neutrino experiment & Fermilab/MINOS; they show a VME bus in the DAQ, so it appears as if they DON’T use the above (Ethernet/network-based) DAQ architecture. This might be correlated with why OPERA is seeing superluminal neutrinos & 2 other neutrino experiments AREN’T. I’ve already emailed the MINOS team (UCL/University College London, Caltech, Harvard collaboration members) about what DAQ they’re using; no response yet. About to send an email query to Super-Kamiokande collaborators.

    Bottom line: the OPERA DAQ system is fairly new (commissioned in summer 2006) & may be buggy. It uses industrial-grade hardware for a high-end physics application (one requiring nsec accuracy). I found a paper by an OPERA co-author about the OPERA DAQ, which states:

    “not much user experience compared to VME [ used in LHC & others ]”
    “main targets are large telecommunication companies, not universities [ OPERA collaborators ]”
    “where do physicists meet industry? Towards physics compatible std”

    I get this feeling it COULD be a case of the wrong tool for the job:

    “I keep telling you. The Right Tool for the RIGHT JOB”
    – Scotty/engineer, “Star Trek”

    Stay tuned…

  2. People making a living in the field are not necessarily objective when it comes to judging when it’s time to put supersymmetry on the back burner. This is aggravated by the fact that there are no real alternatives to turn to; this is why there’s so much interest in OPERA, though most people don’t really believe it. The trouble is that physics has been in the doldrums as far as successful new ideas are concerned since the early seventies, and the likely imminent failure of the Higgs scenario, which dates back to the sixties, shows that even that is too optimistic. It is beginning to look like gravity and QFT are just not unifying (without supersymmetry and a plethora of particles), and with the Higgs in trouble, even what we thought we had unified appears to be coming apart. I suspect the two failures are not unrelated.

    To help demonstrate how difficult the situation is, using something familiar to those acquainted with rudimentary quantum mechanics, consider the early 20th century, before quantum mechanics was invented. One knew light is emitted in quanta; Einstein (a subsequent opponent of QM) even proposed that the electromagnetic field itself is quantized (photons); and one also knew that the Rutherford model of the atom, with electrons orbiting the nucleus, conflicted with Maxwell’s theory that accelerating charges should radiate, so the planetary atom was unstable. Imagine experimental input then suddenly coming to an end, with scientists left for a century, just on the back of that, to explain atomic structure: no emission spectra, no Zeeman effect, no new particles, nothing else. One can imagine all sorts of mechanical and statistical systems being invented, with ever more complicated interactions, because that is all people knew, and those were the only things that had worked before. But nobody in their right mind would have come up with quantum mechanics, to wit, that below this world of order there is a huge lottery going on.

    Today the situation is that to do just one experiment, of predictive value perhaps even less than a simple Zeeman-splitting measurement, one requires a world collaboration such as the LHC, using money that perhaps will never again be available if nothing is found. One can see the scale of the problem there.

  3. I should like to mention one thing in connection with what I said above. The main problem plaguing all major theories devised since the seventies is the plethora of extra particles which these theories predict in addition to those actually known to exist. This applies in equal measure to variants of technicolor, supersymmetry, and superstrings, and it is indicative of the excessive number of degrees of freedom present in those systems. It is relatively easy to fit an experiment with a theory whose number of degrees of freedom exceeds the number present in the experiment; but the fact that a fit is all you have achieved is reflected in the extra degrees of freedom left over, so the extra particles should be welcomed, not loathed, as a warning sign. Nature has been quite sparing in the number of degrees of freedom it uses. So I think we will know we have come up with the right unification theory when, using the fewest parameters, we predict the universe and no more.
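    To see concretely why surplus degrees of freedom make fitting too easy, here is a toy sketch in Python (purely illustrative; a polynomial stands in for a “theory”, and the data are made up). A model with as many free parameters as data points reproduces the measurements exactly whether or not it is the true law; the leftover freedom only betrays itself away from the data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Five noisy "measurements" generated from a simple underlying law (a line).
    x = np.linspace(0.0, 1.0, 5)
    y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=x.size)

    # A degree-4 polynomial has five free parameters, as many as there are
    # data points, so it matches the data essentially exactly regardless of
    # whether it is the true law.
    coeffs = np.polyfit(x, y, deg=4)
    print(np.max(np.abs(np.polyval(coeffs, x) - y)))   # ~1e-14: a "perfect" fit

    # The surplus degrees of freedom typically betray themselves beyond the
    # measured points, where the true law stays tame but the fit runs wild.
    print(np.polyval(coeffs, 2.0), 2.0 * 2.0 + 1.0)

    The perfect fit by itself tells you nothing about which model is right; only the behavior away from the fitted points does.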

    • People used this argument in the past to predict there would be no neutral currents (i.e., no Z particle), and were wrong. They used it to predict that there would be no neutrino masses, and were wrong. Nobody expected the muon. In fact, back in the 1920s there was no reason to expect the neutron. And in the 1800s, someone taking your point of view would have failed to predict all the as-yet-unknown elements in the periodic table. I don’t think history agrees that taking a minimalist approach is always a good idea. Sometimes it is, sometimes it is not.

  4. I don’t know why you say that; I have a good track record, having predicted that the LHC results were not as good as they looked in July, before it became clear in August, as well as questioning the results of OPERA on your blog before everyone else joined in. I think the issue with supersymmetry is not whether it has been ruled out (it is likely it never will be) but whether it is relevant. The latest results from the LHC are a serious blow to supersymmetry in that respect. They indicate that, unlike what had been hoped for, supersymmetry cannot explain electroweak unification.
