During the gap between the first run of the Large Hadron Collider [LHC], which ended in 2012 and included the discovery of the Higgs particle (and the exclusion of quite a few other things), and its second run, which starts a year from now, there’s been a lot of talk about the future direction for particle physics. By far the most prominent option, both in China and in Europe, involves the long-term possibility of a (roughly) 100 TeV proton-proton collider — that is, a particle accelerator like the LHC, but with 5 to 15 times more energy per collision.
Do we need such a machine?
The answer is “Yes, Definitely”. Definitely, if human beings are to continue to explore the inner world of the elementary laws of nature with the same level of commitment with which they explore the outer world of our neighboring planets, the nearby stars and their own planets, and distant galaxies far-flung across the universe. If we can send the Curiosity rover to roam around the surface of the Red Planet and beam back pictures and scientific information — if we can send telescopes like Kepler into space whose sole purpose is to look for signs of planets around distant stars — then surely we can build a machine on Earth whose sole purpose is to help us understand the fundamental principles and elementary objects that underlie the natural world. That’s why we built the LHC, and machines before it; and the justification for a 100 TeV machine remains the same.
Definitely, also, if the exploration of the laws of nature is to continue as a healthy research field. We have a large number of experts who know how to build a big particle accelerator. If we were to postpone building such a machine for a generation, we would suffer some of the same problems suffered by the U.S. space program. All sorts of crucial knowledge of the craft of rocket building was lost when the U.S. failed to follow up on its several trips to the Moon. If we have a hiatus of a generation between the current machine and the next, we will find it much more difficult and expensive to build the next one when we finally decide to do it. So it makes sense to maintain continuity, especially if it can be done at reasonable cost.
One thing that’s interesting to keep in mind is that a roughly 100 TeV machine is hardly a stretch for modern technology; it’s not going to be a machine with a significant risk of failure. The Superconducting SuperCollider (SSC), which was to be the U.S. flagship machine and was due to start running in the year 2000 (in which case it would definitely have discovered the Higgs particle many years ago — sadly, the U.S. congress canceled it, after it was well underway, in 1993), would have been a 40 TeV machine. The technological step from 40 TeV to 80 or 120 is not a big one. Moreover, the SSC would have been an easier machine to run than is the LHC, which has to strain with very high collision rates to make up for the fact that its energy per collision is a third of what the SSC would have been capable of. The main challenge for such an accelerator is that it has to be very large — which requires a very long tunnel (over 50 miles/80 km) and a very large number of powerful magnets.
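The size of such a ring follows from simple magnet physics: the standard bending relation p [GeV/c] ≈ 0.3 · B [T] · R [m] ties the beam momentum to the dipole field and the ring radius. Here is a back-of-the-envelope sketch; the field strengths and the dipole "fill factor" below are my illustrative assumptions, not design values for any proposed machine.

```python
import math

# Rough estimate of the ring size a ~100 TeV proton-proton collider needs,
# using the bending relation p [GeV/c] ≈ 0.3 * B [T] * R [m].
# Dipole fields and fill factor are illustrative assumptions only.

def ring_circumference_km(beam_energy_tev, dipole_field_t, fill_factor=0.8):
    """Approximate circumference (km) of a ring bending protons of this energy.

    fill_factor is the fraction of the ring occupied by bending dipoles;
    the rest holds focusing magnets, RF cavities, and so on.
    """
    p_gev = beam_energy_tev * 1000.0           # momentum ~ energy at these scales
    radius_m = p_gev / (0.3 * dipole_field_t)  # bending radius in meters
    return 2 * math.pi * radius_m / fill_factor / 1000.0

# Each beam carries half the collision energy: 50 TeV per beam for 100 TeV.
print(f"100 TeV with 16 T dipoles: ~{ring_circumference_km(50, 16):.0f} km")
print(f"LHC-like (7 TeV, 8.3 T):  ~{ring_circumference_km(7, 8.3):.0f} km")
```

With fields around 16 tesla this crude estimate lands near the 80 km tunnel mentioned above; the real LHC (26.7 km) comes out somewhat larger than this sketch suggests because its actual dipole fill factor is lower.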
It’s no wonder the Chinese are interested in potentially building this machine. With an economy growing rapidly enough to catch up with the other great nations of the world in the next decade or two, and with scientific prowess rapidly increasing (see here and here), some in China rightly see a 100 TeV proton-proton collider as an opportunity both to gain all sorts of technical and technological knowledge that they have previously lacked and to establish themselves among the few nations that can be viewed as scientific superpowers. Yet it will not require them to go far out on a limb with untried technology, or to invent whole new methods that don’t currently exist. Moreover, some of the things that would be expensive or politically complex in the U.S. or Europe will be easier in China. They may be able to pay for and construct this machine themselves, with technical advice and personnel from other countries, but without being dependent on other nations’ political and financial challenges.
In fact, there’s another huge potential benefit along the way, even before the 100 TeV machine is built: a “Higgs factory”. One can potentially use this same tunnel to first build an accelerator that smashes electrons and positrons [i.e. anti-electrons] together, at an energy which isn’t that high, but is sufficient to make Higgs particles at a high rate — not as many Higgs particles as the LHC will produce, but in an environment where precise measurements are much easier to make. [Protons are messy, and all measurements in proton-proton collisions are very difficult due to huge collision rates and large backgrounds; electrons and positrons are simple objects and measurements tend to be much more straightforward. This comes at a cost: it is harder to get collisions at the highest energies physicists would ideally want.]
The value of a Higgs factory is obvious: a no-brainer. The Higgs particle is our main way of gaining insight into the nature of the all-important Higgs field, and moreover the Higgs particle might also, through its possible rare decays, illuminate a currently veiled world of unknown particles and forces. It’s a research effort whose importance no one can deny, and it serves as a technical stepping stone to a 100 TeV collider, complete with the realistic possibility of Nobel Prize-worthy discoveries in the near term. For China, it’s perfect.
Of course, the Chinese aren’t the only ones interested. My European colleagues, recognizing a good thing when they see it, and with the advantage that they built and ran the LHC, are also considering building such a machine. [Neither the U.S., which is expertly squandering its leadership in many scientific fields (and pushing many of its best scientists toward the Chinese effort), nor Russia, which is busy starting a disastrous invasion of its neighbor, seems able to make any intelligent decisions at the moment, and surely neither will be the leader in such an effort.] For the moment, the scientists involved are all working together. Over recent years, any particle physicist worth his or her salt (including me) would spend some time at Europe’s CERN laboratory, which hosts the LHC. And now, many young U.S. experts in theoretical particle physics are planning to spend extended time at China’s “Center for the Future of High Energy Physics”. There was a time when young Chinese geniuses like T.D. Lee, C.N. Yang and C.S. Wu did Nobel Prize-winning (or -deserving) work in the United States. Soon, perhaps, it will be the other way around.
But what, scientifically, is the justification for this machine?
Why build a 100 TeV collider?
It’s important to distinguish two types of scientific enterprises: exploratory and targeted. Exploratory refers to when you’re doing a search, in a plausible place, for anything unexpected — perhaps for something whose existence you might suspect, but perhaps more broadly. Targeted refers to doing a search or study where you know roughly, or even exactly, what you’re looking for.
Often a targeted enterprise is also exploratory; while looking for one thing, you can always stumble on something else. Many scientific discoveries, such as X-rays, have been made while doing or preparing experiments with a completely different purpose. On the other hand, an exploratory enterprise may not have any targets at all, or at best, only a very vague target. Sometimes we go searching just because we can. When Galileo pointed his first telescopes at the moon and the planets and the stars, he had no idea what he would find; he just knew he had a great opportunity to discover something.
The LHC was built as a clearly targeted machine: its main goal was to find the Higgs particle (or particles) if it (or they) existed, or whatever replaced them if they did not. Well, now we know that one Higgs particle exists, and it resembles the simplest possible type of Higgs particle, which is termed a “Standard Model Higgs”. But much remains to be learned. Is this Higgs particle really Standard Model-like, not just at the 30% level but at the 3% level and better? Are there other Higgs particles? Are there other as-yet unknown particles being produced at the LHC? Are there new forces beyond the ones we’re aware of? Other than the detailed study of the new Higgs particle, these questions are mostly exploratory. In short, though the LHC was built as a targeted machine with a near-guarantee of success, its mission has now shifted toward exploration of the unknown, with no guarantee of further discoveries. But it’s also important to understand that a lack of discoveries will be just as important to our understanding of nature as discoveries would be, for reasons I’ll return to in my next post.
Now what about the 100 TeV machine? Will it be a targeted experimental facility, or an exploratory one?
For the moment, the answer is: we don’t know. Currently, there is no clear target; more precisely, there are lots of possible targets, but none that we know could emerge to be a major, central one. But this machine won’t be built and completed for a couple of decades, and things could change dramatically by then. If the LHC discovers something not predicted by the Standard Model (the equations used to describe the known elementary particles and forces), then clarifying this new discovery will become a major target, and possibly the main target, of the 100 TeV machine.
This highlights one of the challenges with large experimental projects. One has to start thinking about them far in advance, long before it’s entirely clear what their precise use will be. When the SSC and the LHC were first proposed, they did have a proposed target — finding the Higgs particle or particles. But if the recently discovered Higgs particle’s mass had been, say, half of what it actually is, it would have been discovered some years before the SSC or LHC were completed… in which case, the target of the SSC and LHC would have significantly shifted. So we have to start considering, proposing, and perhaps even building the 100 TeV machine before it’s completely clear whether it will have a prominent and definite target, or whether it will be mainly exploratory. That ambiguity is something we just have to live with.
In contrast to the 100 TeV machine, which currently has to be viewed as exploratory, the Higgs factory that would precede it in the same tunnel is much more sharply targeted… targeted at detailed study of the Higgs particle. There are some other targeted and exploratory activities that it can be involved in, including more detailed investigation of the Z particle, W particle and top quark, but its main focus is the Higgs.
However, even if no prominent target for the 100 TeV collider shows up before it is built, its justification as an exploratory machine is clear. In quantum field theory, collisions at higher energy and momentum allow you to probe physics at shorter times and distances — for “particles” are really quanta, i.e., ripples in quantum fields, and a higher-energy quantum has a shorter wavelength and a higher frequency. And we’ve learned time and time again that one way (though not the only one) to discover new things about the world is to examine its behavior on shorter times and shorter distances than we’ve previously been capable of. This enterprise has been going on for generations: first, microscopes revealed bacteria and other cells; then these were found, with more powerful experiments, to be made of molecules, in turn made from atoms; yet more powerful experiments showed first that atoms contain electrons and atomic nuclei, then that the nuclei are made from protons and neutrons, and then that these in turn are made from quarks and gluons. All of this has been discovered by probing the world with ever more powerful particle collisions of one form or another. So building a higher-energy accelerator is to take another step along a well-trodden path.
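The energy-to-distance relation above can be made quantitative with the conversion constant ħc ≈ 0.1973 GeV·fm. The sketch below (the function name is mine, and these are order-of-magnitude estimates only) shows how the probed distance shrinks as collision energy grows:

```python
# Order-of-magnitude sketch: a quantum of energy E resolves distances
# of roughly hbar*c / E.  Conversion constant: hbar*c ≈ 0.1973 GeV·fm.

HBAR_C_GEV_FM = 0.1973  # GeV * femtometers (1 fm = 1e-15 m)

def probed_distance_m(energy_gev):
    """Rough distance scale (meters) probed by a quantum of energy E (in GeV)."""
    return HBAR_C_GEV_FM / energy_gev * 1e-15

print(probed_distance_m(1.0))      # ~2e-16 m, roughly a proton's size
print(probed_distance_m(14_000))   # LHC design collision energy
print(probed_distance_m(100_000))  # a 100 TeV machine: ~7x shorter still
```

The point is not the precise numbers (only a fraction of the collision energy goes into any single subprocess) but the scaling: seven times the energy buys access to distances seven times shorter.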
However, it’s not the only path, nor has it ever been.
Is this the most promising path to explore?
The LHC is still in its adolescence, and we can’t predict its future discoveries. At this point the LHC experiments have collected a few percent of the data they’ll collect over the next decade, and they have done so with proton-proton collisions whose energy is only about 60% of what we expect to see in the next few years. Moreover, even the existing data set, collected in 2011-2012, hasn’t been fully analyzed; this data could still yield discoveries (but only if the experimenters choose to make the relevant measurements). So we certainly can’t know yet whether the LHC will produce a new target for the 100 TeV machine. If it does, then it will be much clearer what to do next and how to use the 100 TeV machine. If it doesn’t… well, that’s something that deserves a bit more discussion.
Suppose that, after the LHC’s last run, nothing other than the Higgs particle’s been found, with properties that are consistent, to a few percent, with a Standard Model Higgs. While this sounds dull at first glance, it’s actually among the most radical possible outcomes of the LHC. That’s because of the “naturalness puzzle”, which I discussed in some detail in this article. Never before in nature, in any generic context, have we come across a low-mass spin-zero particle (i.e. something like the Higgs particle) without other particles associated with it. In this sense, the Standard Model is an extraordinarily non-generic theory, at least from our current point of view and understanding. It will be quite shocking if it completely describes all LHC data.
But maybe it does. If it does, what does this potentially imply about nature? And what would be the implications for our future explorations of nature at its most elementary level? I’ll address this issue in my next post.
57 thoughts on “A 100 TeV Proton-Proton Collider?”
Building such a machine is an utter waste of time and money. Luckily we have a couple of years to come to our senses before any serious money is wasted. I have started a collaboration which tests my antimatter hypothesis, and after that, larger particle colliders are pretty much superfluous. Sorry about that 😉
Whew! It’s a good thing that would-be Einsteins like yourself don’t make decisions about scientific research! Sorry about that. 😉
We’ll see about that…
Could Higgs particles from a Higgs factory prevent aging – by preventing the increase of entropy in chemical reactions? Loss of thermal energy upon conversion into mechanical work means loss of mass. What if the electrons responsible for chemical reactions were fed with “mass” to counter the increase in entropy? 😀
“All sorts of crucial knowledge of the craft of rocket building was lost when the U.S. failed to follow up on its several trips to the Moon.”
It was worse than that. The Apollo generation of NASA engineers were fired en masse (almost literally decimated) towards the end of the Nixon administration. There seemed to be a grim determination that there would be no more space spectaculars to spend money on. I remember going to the Cape for a launch, circa 1975, and having an Apollo PhD pumping my gas; he said it was the only job he could get after the mass lay-offs.
If you go to a big US conference today focused on lunar exploration, the bimodal distribution of the attendees is striking (lots of young people, a handful of old men, few in the middle). At the GLEX 2012 meeting in DC, that became almost darkly comic, as I saw a series of 30-something scientists describe this or that plan for lunar power generation, or astronomical observations, or whatever, and each time an older gentleman would stand up after the talk to describe how they did the same thing in 1969 or 1971, but in more depth, and avoiding this or that problem that the younger guy needed to know about.
Interestingly, scientific meetings in Russia and China also tend to be bimodal in almost exactly the same way, but for drastically different reasons. In China, it was the Cultural Revolution. (All of those Red Guards waving Little Red Books in 1969? They never got a real education, and today are still out on some farm feeding the pigs.) In Russia, it was the insanely disastrous economic advice of Jeffrey Sachs and the “Chicago boys,” which drove a generation of PhDs to places like Phoenix, Dresden, St. Louis and Dwingeloo (to name the destinations of some of my Russian colleagues).
I really do not know what to make of these different systems coming to the same bad end, but I do know that the US space program has never really recovered from the Apollo firings, that senior people in Russia worry the same about their space program, and that if this could be avoided for particle physics, it should be.
I appreciate everything you said except “almost literally decimated.” 🙂
decimated: historical definition: kill one in every ten of (a group of soldiers or others) as a punishment for the whole group
Both better (nobody died) and worse (much more than 10% lost, I suspect.)
Hmm, I always thought decimated meant “remove 9 in 10” but I see you are right, and it’s 1 in 10. Of course, I did not mean in the old sense of executions of those selected. Lives may have been ruined, but they were not destroyed.
And, yes, the shuttle did start ramping up within a relatively short time after the Apollo firings, but not simultaneously, and something was lost that has never been retrieved.
Indeed it was more than 10% of professionals laid off when the Apollo program was cancelled.
Because NASA was really scared of the Soviets spying on the Saturn V rocket, they did not keep blueprints of the whole thing (the only real “info” that existed was the actual rockets), and they decided very early in the project to use a compartmentalized approach (related to the very same worries about spying) to how knowledge of the technology, hardware and software was handled.
When the program was cancelled, very soon the knowledge was lost.
This is one of the reasons why the Orion/Ares program had to start from scratch.
Besides, NASA was focused on other technologies (Skylab, the shuttle, the space station – that is, Earth-orbit technologies), so there was not much reuse of knowledge that would have benefited from keeping the Apollo engineers.
Kind regards, GEN
Actually, a lot of Saturn V technology did survive. See “In Search of Ancient Engines” in: http://cd.textfiles.com/spaceandast/TEXT/SPACEDIG/V13_3/V13_349.TXT
That is usually the case when humans explore/settle a new frontier. It was 80 years or so between Columbus and St. Augustine. It took 30 years or so between Lewis and Clark and the opening of the west in a big way. Space will be no different.
IMHO, the real tragedy was in the 90’s when the SDIO Single Stage Rocket Technology program was canceled. That program produced new innovative solutions and might have had a huge impact had the next phase been funded.
As far as I understand it, CERN wants to do an international, i.e. global, project, and most of the work the project does should be usable regardless of where the “FCC”, or whatever the name will be, is built.
For obvious reasons, CERN considers itself one of the best locations for this new ring. That is the preferred option, but not the only one…
I am not an expert in engineering, but from what I have heard there are several issues that need to be solved first, because some of the technologies needed do not currently exist – namely, 18 T magnets are out of reach of today’s technical capabilities.
Data might also be an issue because it might just be too much to handle with current technologies.
Remember the first machine to be put in the tunnel would be the Higgs factory, so there’s quite a bit of time to develop the magnet technologies and benefit from data storage and computation improvements.
Proton-proton collisions, and even the electron-positron collisions planned for Fermilab, seem archaic unless the math clearly indicates that we can achieve, or release, the lightest non-SM Higgs (or Higgs-like) particle – the actual transition point with the vacuum, potentially “measuring” supersymmetry particles.
Black holes are the key to “completing” the physics, IMHO. I believe the gravitational field to be fundamental, the one from which all other fields are composed, the variables being the geometries of the oscillations and the sizes of the “confined” spaces associated with each type of field. Black holes are regions in space where the gravitational field achieves its maximum (or some threshold), which could be breaking down (cancelling) the coupling forces inside hadrons, the g-factors. Hence, it could be scrambling matter down a notch to “dark”, where the frequencies cannot be seen or measured by us or by the sensors we have thus far.
So, IMHO, if we are to make any progress, we need to create and contain a black hole, [and I mean CONFINE!, :-)]. Achieving the limits of the highest intensity, density, of the gravitational field is the only way to reach supersymmetry, if indeed it is reality. I do not believe a collision between particles, even the smallest fermions, electron-positron, can achieve “spins” at collision to achieve the maximum limit of quantum gravity.
The problem with the SSC was they started building it. As long as everybody in Congress thought it would be in their state, it was popular.
What the physics community should have done was get behind the Super Duper Collider (SDC) instead. It was a series of 3,000-mile linear accelerators that went through every congressional district in the lower 48.
“All sorts of crucial knowledge of the craft of rocket building was lost when the U.S. failed to follow up on its several trips to the Moon.”
That was actually not the case. The Apollo program was cut short by Nixon (in whose lap it had fallen) in order to build the Space Shuttle – and work (and spending!) for the Shuttle started immediately, if I am not mistaken.
The problem with the Apollo->Shuttle transition was a different one IMHO: Instead of taking an evolutionary step (and developing from what was designed for Apollo) almost everything was developed anew, almost nothing was reused. In my maybe not so humble opinion the Shuttle was a giant pork barrel handed out to keep jobs.
A lot of resources were spent after Apollo on human spaceflight, but not in a good way IMHO. There were quite a few expensive design decisions made specifically for the Shuttle (a topic of its own), I think a lot more could have been done with the resources that were spent for the Shuttle.
If anything, the Shuttle shares some similarities with the SSC: a lot of money handed out, a lot of stuff done, a lot of people kept busy – but not in an optimal way.
The SSC was a very intelligently planned machine with a very sharp physics application and goal. The space shuttle was essentially pork and had no real raison d’être. Both had execution problems, but the real tragedy was the loss of all that talent in both the Apollo and SSC programs. The US never really recovered. A similar story happened with our television industry.
It is not proper to say that the Space Shuttle did not have a “real raison d’être”. It actually did, from the very beginning: to be a reusable workhorse for an exclusively U.S. space station, which was the stepping stone for a manned mission to Mars.
There were many intertwined issues that conspired against the space station project, which was cancelled, and then it became proper to say that the shuttle had no specific reason to exist; but the escalation of the Cold War during the Reagan era was very useful in finding many alternative uses for the shuttle (for instance, it was a valid testing prototype for the completely automated X-37B, a secret project of the U.S. Air Force).
Once the international space station became a viable project, the shuttle was already available and ready to fit into its natural place within that project.
>to be a reusable workhorse for an exclusively U.S. space station
No, it was started long before the US Freedom Station. It did however have a reason: make access to space cheap and routine. Sadly, it failed miserably on both counts.
On the plus side, there are a lot of startups working now on better approaches. Many gained experience from projects in the ’80s and ’90s and learned the right lessons.
I have of late read many different ideas for future colliders – proton/proton, proton/antiproton, electron/positron and even muon/antimuon… I have read of site locations at CERN, in Japan and China…
I wish all of them could be built. I know that is not possible. I do however think that after careful study of best type of machine to build at the most suitable location – we should build something to replace the LHC in the 2020’s. (Personally I like the idea of a double-collider; electron-positron and proton-proton to be constructed in a new CERN tunnel…)
As to ‘justification’ I think it was the artist Peter Max who simply said: “Man Must Moon.”
I also like these two classic quotes attributed to Faraday:
Whilst attempting to explain a discovery to either Gladstone (Chancellor) or Peel (Prime Minister) he was asked, ‘But, after all, what use is it?’ Faraday replied, ‘Why sir, there is the probability that you will soon be able to tax it.’
When the Prime Minister asked of a new discovery, ‘What good is it?’, Faraday replied, ‘What good is a new-born baby?’
Are there feasible options for extraterrestrial accelerators, say, orbiting the Earth, Sun or Moon, or perhaps on the surface of the Moon?
I realize that a harder vacuum than ambient space is required, but space is a lot closer to the cold we need for magnets and the vacuum we need for beams. Perhaps clever engineering could produce a pipe that handles less stress than one on Earth.
How hopeless is that?
Hopeless for now; ignoring all the issues you raised, you’d still need huge magnets to steer the particles, and the cost of lofting such enormous magnets into orbit would be insane.
Never before in nature, in any generic context, have we come across a low-mass spin-zero particle (i.e. something like the Higgs particle) without other particles associated with it. In this sense, the Standard Model is an extraordinarily non-generic theory, at least from our current point of view and understanding.
I guess I don’t really understand the rationale for applying our past experience with spin-zero particles to the Higgs particle, at least without major reservations. Before the Higgs, no elementary spin-zero particle has ever been observed, of any mass. Since the spin-zero mesons (for example) are composite, not elementary, it seems reasonable to me that other composites of the same constituents of those mesons should be found. But apparently the Higgs is not composite, nor otherwise derived from more elementary phenomena as in the case of condensed matter systems where spontaneous symmetry breaking occurs. Naively, aren’t the differences between the Higgs and other spin-zero particles basic enough to question the reliability of our past experience, i.e., that we should find other particles associated with the Higgs?
I know I’m revealing my ignorance here, but I want to learn.
This is a subtle point. You’ve flipped the logic.
When you say, “no *elementary* spin-zero particle has ever been observed”, that’s precisely one part of the point. Exactly: no elementary spin-zero particle has ever been observed before. Moreover — and this is the other part of the point — we think we have a profound understanding as to why. The understanding IS technical, but I’ve tried to explain it in my naturalness articles, http://profmattstrassler.com/articles-and-posts/particle-physics-basics/the-hierarchy-problem/naturalness/ If you ask for generic quantum field theory equations, they will never give you an interacting low-mass elementary spin-zero particle without other particles.
“Naively, aren’t the differences between the Higgs and other spin-zero particles basic enough to question the reliability of our past experience, i.e., that we should find other particles associated with the Higgs?”
Of course one can question our past experience. But there’s no example of any type of theoretical equation that anyone has ever written down, after four decades of work, that gives an elementary Higgs with no other particles. So if you question the logic, you are questioning quantum field theory itself. And of course you’re free to do that too. But now you have a problem. The Standard Model is a quantum field theory, and it’s giving one successful prediction at the LHC after another. Thus: we have a puzzle. How can you question the logical arguments learned from quantum field theory, and yet avoid coming into conflict with existing experimental data that quantum field theory so beautifully matches? That is what many of us are thinking about right now.
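[Editorial aside: the technical point in this reply can be summarized schematically. This is shorthand for the naturalness argument discussed in the linked articles, not a precise calculation. In a generic quantum field theory, quantum corrections to an elementary scalar's squared mass grow with the highest energy scale $\Lambda$ at which the theory applies:

```latex
\delta m_h^2 \;\sim\; \frac{g^2}{16\pi^2}\,\Lambda^2
```

For couplings $g$ of order one and $\Lambda$ far above the TeV scale, keeping $m_h$ near its observed value of about 125 GeV requires an enormous cancellation, unless new particles appear at nearby energies to cut the corrections off. That is why generic equations do not yield a lone low-mass elementary scalar.]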
Matt, this is really inspiring to me, thanks! I start school at the end of the month for physics and I’ve been spending a lot of time thinking about what area I could get into based on interest & job security. Neutrino astronomy probably has minimal security (but is awesome), accelerator physics sounds useful and secure, solid-state physics and optics sound like they’re useful/secure, and wall street apparently needs a bunch (but no thank you). I dunno, I just want to contribute to the scientific endeavor while supporting myself so I thank you for posting things like this that can guide my efforts.
There’s a lot of nerve-wracking commentary online about PhD physicists not doing physics because our country has found a way to make it difficult for them to find jobs. That’s a bad sign so I hesitate to just get into what I think is cool, it has to be secure or I’ll waste my time (and should go into engineering where I wouldn’t be remotely as into it, although accelerator physics and electrical engineering overlap). Anyways I guess I should just informally request that perhaps you do an article on the state of physics today and how to break in (where the current work is at), college advisors don’t necessarily have the best info or advice in my experience. Keep up the awesome articles, I have yet to be anywhere near bored by one.
I agree with Matt 100%. Even if today we do not have a proper purpose for the next-generation machine (the “around” 100 TeV collider), we should definitely start working on its planning and general design.
If I were to characterize the main aspects of our species, AKA human beings, I would include (way up at the very top) the fact that we are animals that have a natural tendency and capacity to do engineering to survive, to look for better chances to make it through the next day unscathed, if possible.
When I say that we are animals that do engineering to survive, I mean that we imagine and build tools for that general purpose. Whether it is a spear or a proton-proton collider, they are just tools, and it requires science and engineering to do that, the larger and more difficult the tool, the more science and engineering it will require.
We already have a lot of evidence pointing toward the fact that we need to support the scientific effort in an unstinting and unwavering fashion. This is something we cannot afford the luxury of not doing. Our very chances of survival depend on it.
I really enjoy reading your website. I have spent hours engrossed in some posts. Your naturalness post was the clearest I’ve ever read about this subject. I taught Conceptual physics and a lab for it at a Liberal Arts college for many years and I wish I’d have known of this site or had it available when I taught. It is a wonderful resource for anyone interested in physics.
I understand it’s a high-risk proposition, but if I were “in charge” I’d start a new Manhattan Project of sorts to build a muon collider. Higgs factory, but more energetic than proposed linacs, and a neutrino factory to boot. There’s more to thinking big than building big.
Matt, you say that studying the Higgs particle decays could lead to new physics.
Is this something special for the Higgs field, i.e. does the Higgs have a special place in current theories compared to other particles/fields (e.g. quarks, W/Z bosons), or are the Higgs particle decays important now because all the other known particles have already been studied in much better detail?
The answer is: both.
Yes, the Higgs particle (in this mass range) is especially sensitive to new phenomena: http://profmattstrassler.com/articles-and-posts/the-higgs-particle/the-standard-model-higgs/lightweight-higgs-a-sensitive-creature/
Other particles have been well studied by comparison. The most sensitive after the Higgs that haven’t yet been fully mined for information would be the Z, the W and the top quark. Studies of these particles are ongoing at the LHC and will also occur at a Higgs factory. But right now the Higgs is so poorly known that there is a lot of room for big discoveries.
Besides all this, the theory of cosmic inflation involves a field that powers it. As predicted by that theory, the inflaton field has just about the same properties that the Higgs field has. So, if this is validated by future experiments, we could expect to find more properties and forces associated with the Higgs boson, or more particles related to the Higgs field besides the boson already found, or some combination of the above.
Kind regards, GEN
While particle physicists would like powerful particle colliders at their disposal I am of the opinion that the truth is out there, that is, in precision cosmology where 95% of the material making the cosmos still remains to be discovered. Given the right to vote for funding I would advocate funding for DE and DM probes. Imagine the possibilities if we could harness Dark Energy!
1) You can’t harness dark energy. Forget that idea.
2) No one in the high-energy physics community would argue against trying to understand dark energy and dark matter. But right now, these are still quite unknown. Any effort to understand them will therefore be exploratory, or, if targeted, very doubtful of success. (We’ve been seeing this with the many dark matter searches being undertaken.) So it’s not a simple matter of just throwing money at the issue. We may have to guess what dark matter is like before we’ll actually find it; or we may just stumble upon a clue while doing something else. A high-energy collider is one of several places we might find a clue. As for dark energy, we have almost no persuasive ideas about its nature and how to understand it, and any studies are purely exploratory at this point.
One of the challenges for the LHC was the complicated backgrounds that must be subtracted. Fortunately the BlackHat team came to the rescue just in time and calculated the QCD backgrounds. If we continue with p-p collisions up to 100 TeV, will the background calculations (and subtractions) get more or less difficult? Naively I expect them to get easier: asymptotic freedom means we can get a good enough answer with fewer loops. Is there more to this story than just QCD loops?
I hope Matt does not mind my jumping in. Asymptotic freedom sets in only logarithmically, so the energy at which it would help out with the background in the sense you are suggesting is far too high (much higher than 100 TeV). Anyway, let us see what Matt says.
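[Editor’s note: the logarithmic point above is easy to check numerically. The sketch below uses the standard one-loop running of the strong coupling with an assumed reference value alpha_s(M_Z) = 0.118 and five quark flavors; it ignores higher loops and flavor thresholds, so it is illustrative only.]

```python
import math

def alpha_s_one_loop(q_gev, alpha_mz=0.118, m_z=91.1876, n_f=5):
    """One-loop running of the strong coupling above the Z mass.

    Illustrative sketch: ignores higher-loop terms and flavor thresholds.
    """
    b0 = (33 - 2 * n_f) / (12 * math.pi)  # one-loop beta-function coefficient
    return alpha_mz / (1 + alpha_mz * b0 * math.log(q_gev**2 / m_z**2))

# Coupling at the Z mass, at 1 TeV, and at 100 TeV
for q in (91.1876, 1000.0, 100000.0):
    print(f"alpha_s({q:>9.1f} GeV) = {alpha_s_one_loop(q):.4f}")
```

Raising the energy scale a hundredfold, from 1 TeV to 100 TeV, only shrinks the coupling from roughly 0.09 to roughly 0.06 in this approximation — the coupling does fall, but far too slowly to make the QCD backgrounds qualitatively easier.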
Hi Kashyap, good point; so it seems the background calculations won’t get easier. If they get a lot harder, then the question remains: why measure “messy” p-p data between 10 and 100 TeV if we don’t have the technology to subtract the Standard Model background? Do the funding bodies know enough to even ask this question?
Asymptotic freedom is an ultraviolet effect, but the coupling shrinks only logarithmically with energy, so it won’t make 100 TeV collisions dramatically simpler. Higher energies don’t make any of the existing calculations, e.g., of the tree-level decays of a B meson, obsolete; they just open up the possibility of additional branches in those decays, to final states that are no longer forbidden by energy conservation, which more or less proportionately shrink the importance of the old backgrounds; and they tweak the quark masses and the relevant coupling constants that run with the energy scale. Higher energies tend to be better for perturbative QCD methods as opposed to lattice QCD methods. Moore’s law also helps. And in many cases QCD tells us exactly what terms must appear in an infinite series, and it is just a matter of calculating them with sufficient computer power. We also have innovative new Monte Carlo series-sampling methods, and tools like the amplituhedron, to make the effort/precision trade-off better than it has been in the past.
There is more to the story than QCD loops, however. QCD is in a bit of a crisis at the moment. Experiments in the last decade or so have been detecting lots of excited charmonium and bottomonium resonances with masses and J^PC quantum numbers different from those predicted, while not finding the exotic hadron resonances that QCD has been predicting for decades at B-factory energy scales (e.g. at BESIII, Belle, BaBar, LHCb). We haven’t seen many meson resonances that can’t be made out of q-qbar systems (except a couple of “meson molecules” and a likely proton-antiproton system), but the linear combinations of q-qbar systems necessary to produce the observed resonances are complex and baroque. It is hard to figure out, a priori and predictively, why those linear combinations, rather than the infinite number of other possible linear combinations of quarkonium, ended up producing real resonances. Recent hep-ex preprint reviews of the situation by Olsen and by Choi have been posted within the last week or so. There are competing theories to explain this (e.g. extended linear sigma models) that can make adequate post-dictions, but none of these is a consensus view.
Presumably, extrapolating to even higher energy scales using merely the old numerical QCD approximations that have produced wrong answers so far is a bad idea. But without a consensus on which of the solutions offered to post-dict the experimentally observed spectra are correct or equivalent, it is hard to extrapolate those theories to much higher energy scales with great confidence, and such extrapolations with new theories involve lots of time and money.
The reasoning in this article for why a 100 TeV collider is “definitely” needed does not strike me as very convincing. Some physics enthusiasts may consider building ever-larger accelerators a natural way of doing science, but I think better arguments are needed to explain why such a costly experiment should be undertaken. Among my concerns:
1) No specific argument has been presented for why an important advance in physical insight could be gained in the 100 TeV range. I remember other statements on this blog that interesting phenomena, such as those related to supersymmetry (if it exists), might be far, far out of reach of current or near-future accelerators, possibly requiring energies not 10x but 1000x or 10^5 times more than accessible with a 100 TeV collider.
2) No specific argument has been given for why building larger and larger colliders is an efficient and probably fruitful route to more insight. What is the road map of this strategy: after the 100 TeV accelerator comes the 500 TeV collider and then the PeV collider, ever larger and more costly? The pure technical capability to build such large machines is not sufficient reason to do it. We should at least consider the possibility that totally different kinds of experiments will give a new turn to the advance of science; for example, space-based experiments that observe the highest-energy physics that naturally occurs in our universe.
3) I agree with the idea that it would be an important result if the LHC found no “new physics” other than the Higgs particle. But the real value of an experiment is not just showing an unexplained (and unexpected) fact or measurement, but giving hints toward a possible explanation. Suppose the 100 TeV collider does not find any new physics in its energy range: certainly very interesting, but this alone does not give much of a clue as to what the concept of new physics would be. Such a result would invalidate a set of theories, but there is still an uncountable number of theories that may claim they are not impressed at all by such a result, because their predictions are not testable by that collider. Can we give a measure of “how many” theories, out of the set of all possible meaningful theories, can be invalidated by the experiments of a 100 TeV collider? I know this measure is difficult to define, as there may be an infinite number of theories, or at least a very large, unknown number. However, if we could somehow demonstrate that a “considerable fraction” of prevailing theories could be invalidated by a certain experiment, this would be an argument in favor of doing it.
I am not saying that a 100 TeV collider should not be built, but in my opinion more convincing arguments should be presented in favor of it.
A 100 TeV collider would be great. But, honestly, I have to think that this is far from the most urgent funding priority for science, or even for physics. Are we so much worse off for discovering the Higgs a decade or so later than we might have, because the 40 TeV collider in the U.S. was cancelled?
One ginormous project like a 100 TeV collider means foregoing a dozen or more medium-sized science projects, and many of them, as “targeted research,” are much closer to serious breakthroughs. For example, there are a lot more unknowns in the neutrino sector than there are in electroweak collider physics. Spending a tenth the money to chase a “known unknown” like neutrino oscillation or neutrinoless double beta decay studies, or buying a boatload of computational power to do QCD calculations that improve the backgrounds for our collider studies, making every analysis more powerful (again for a fraction of the price of a 100 TeV collider), just seems like money better spent.
The big cheerleaders of a new 100 TeV collider are physicists who think SUSY is just around the corner. And, frankly, their credibility is pretty much shot. They’ve cried wolf one time too many.
Someday, a 100 TeV collider is something that someone should do. But, as I see it, there is no great rush to do so.
Hear, hear, hear
Well said, ohwilleke. The SUSY folks are not yet convinced that 40 years of mental labor amounted to nada, zilch, nought, zero, hakuna.
While there are certainly many small-scale experiments that can be done, their reach is typically limited, and there is always something to be said for an exhaustive search of the high-energy frontier. For instance, you can spend the next thirty years creating all sorts of direct and indirect dark matter detectors, but you still won’t be able to probe and confirm even a small fraction of the parameter space that a new accelerator could, assuming you have something with a standard thermal history. After a while the economics justify themselves in terms of lost opportunity cost. I do, however, agree that there are certain experiments that really do justify themselves before an accelerator.
At the end of the day, Matt’s conference point holds: we probably do need some input from theory advances before we can really weigh the economic pros and cons, and it is possible that our priors are in need of updating.
The case of collider searches for dark matter is a good example of where waiting would be helpful. Astronomy data is narrowing the parameter space of dark matter candidates at a rapid rate, and ruling out, for the most part, the simple GeV+-scale WIMP cold dark matter of the kind that the LHC and a 100 TeV collider would look for now. With a decent chunk of money spent on deep-space satellites (but still much less than a 100 TeV collider), we could make much more progress on that front. Then, knowing better what we are looking for, we could build something better tailored toward finding it.
Yes, ever onward to higher energy! We (mankind) may have reached the point where perhaps it is only feasible to plan and eventually build one new collider for the planet (an international effort with many nations contributing – it’s my understanding that CERN already has 21 member nations) every so many years.
But we owe it to our own curiosity to build at least that ONE. These machines are our eyes to probe the very small.
And the same holds for the very large; so the James Webb Telescope is being constructed to replace the Hubble. Our eyes to probe the very large.
And I wouldn’t be too worried about experiments with regard to dark energy, dark matter, antimatter and neutrinos; many are already planned, approved, and under construction, and you will see several come online within the next decade. For example, KATRIN (2015) will attempt direct measurements of neutrino mass, just to make sure the whole neutrino oscillation theory is on target. AEGIS (2015) will attempt to measure the gravitational acceleration of antihydrogen, just in case all you worshipers of the great and mysterious AL (Einstein) have gravity wrong. Then there will be experiments (can’t remember the name; perhaps DECam, the Dark Energy Camera) to determine at exactly what time (redshift) the universe went from deceleration to acceleration, as many different theories of dark energy (including AL’s cosmological constant) predict different results. Dark matter: just go to any science site (ScienceDaily, New Scientist, Physics World, etc.) and see…
All the above are hot topics in physics and cosmology – they will be funded.
But we must plan and build to see more clearly both the very small and the very large.
You may be referring to the dark energy search to be done by the Hobby-Eberly Telescope, called HETDEX: http://hetdex.org/.
Analyzing the political part: it is hard to see how such a big project could be realized, because the big problem is geographic: where do we build this giant? In China it is impossible because, while they can get the money, and costs there are low, they do not have the technology. So such a machine could only be built in a neutral country, like Switzerland, but there isn’t enough land area there to do it. And so the problem continues. You did not address this political problem, which is the main requirement for doing it.
Yay, Matt!! I saw Particle Fever yesterday, and for a split second there you were, smiling with some colleagues at CERN.
“nor Russia, which is busy starting a disastrous invasion of its neighbor” – repeating the brainwashing of the US government is not good for a scientist. How can I trust the rest? The number of casualties in the “disastrous Russian invasion” of Crimea is much less than the number of homicides in South Chicago due to gunfire (in one day!). Maybe, indeed, there was a democratic referendum in Crimea where 90% of people expressed their will? OK, this does not fit the US propaganda machine, but are we still scientists? If we fail to form an objective picture of such events by scanning through Internet resources, how can we make up our minds when talking about more serious scientific challenges, such as a super-large collider?