Of Particular Significance

A Big Think Made of Straw: Bad Arguments Against Future Colliders

POSTED BY Matt Strassler

ON 06/09/2022

Here’s a tip.  If you read an argument either for or against a successor to the Large Hadron Collider (LHC) in which the words “string theory” or “string theorists” form a central part of the argument, then you can conclude that the author (a) doesn’t understand the science of particle physics, and (b) has an absurd caricature in mind concerning the community of high energy physicists.  String theory and string theorists have nothing to do with whether such a collider should or should not be built.

Such an article has appeared on Big Think. It’s written by a certain Thomas Hartsfield.  My impression, from his writing and from what I can find online, is that most of what he knows about particle physics comes from reading people like Ethan Siegel and Sabine Hossenfelder. I think Dr. Hartsfield would have done better to leave the argument to them. 

An Army Made of Straw

Dr. Hartsfield’s article sets up one straw person after another. 

  • The “100 billion” cost is just the first.  (No one is going to propose, much less build, a machine that costs 100 billion in today’s dollars.)  
  • It refers to “string theorists” as though they form the core of high-energy theoretical physics; you’d think that everyone who does theoretical particle physics is a slavish, mindless believer in the string theory god and its demigod assistant, supersymmetry.  (Many theoretical particle physicists don’t work on either one, and very few ever do string theory. Among those who do some supersymmetry research, it’s often just one in a wide variety of topics that they study. Supersymmetry zealots do exist, but they aren’t as central to the field as some would like you to believe.)
  • It makes loud but tired claims, such as “A giant particle collider cannot truly test supersymmetry, which can evolve to fit nearly anything.”  (Is this supposed to be shocking? It’s obvious to any expert. The same is true of dark matter, the origin of neutrino masses, and a whole host of other topics. It’s not unusual for an idea to come with a parameter which can be made extremely small. Such an idea can be discovered, or made obsolete by other discoveries, but excluding it may take centuries. In fact this is pretty typical, so deal with it!)
  • “$100 billion could fund (quite literally) 100,000 smaller physics experiments.”  (Aside from the fact that this plays sleight-of-hand, mixing future dollars with present dollars, the argument is crude. When the Superconducting Supercollider was cancelled, did the money that was saved flow into thousands of physics experiments, or other scientific experiments?  No.  Congress sent it all over the place.)  
  • And then it concludes with my favorite, a true laugher: “The only good argument for the [machine] might be employment for smart people. And for string theorists.”  (Honestly, employment for string theorists!?!  What bu… rubbish. It might have been a good idea to do some research into how funding actually works in the field, before saying something so patently silly.)

Meanwhile, the article never once mentions the particle physics experimentalists and accelerator physicists.  Remember them?  The ones who actually build and run these machines, and actually discover things?  The ones without whom the whole enterprise is all just math?

Although they mostly don’t appear in the article, there are strong arguments both for and against building such a machine; see below.  Keep in mind, though, that any decision is still years off, and we may have quite a different perspective by the time we get to that point, depending on whether discoveries are made at the LHC or at other experimental facilities.  No one actually needs to be making this decision at the moment, so I’m not sure why Dr. Hartsfield feels it’s so crucial to take an indefensible position now.

Good Arguments in Favor

But if you had to make the decision today, the best arguments for such a machine are that

The first two arguments are very strong; the third has weaknesses, see below. Naturalness, which would imply that the Higgs boson should be accompanied by other types of particles or forces, is a principle that perhaps does not apply to the weak nuclear force; but it’s hardly a dumb idea, considering that it works perfectly for the strong nuclear force, and in other areas of physics. Meanwhile there’s hardly any understanding of the Standard Model’s structure — its array of forces and their strengths, and its array of particles and their masses, among other things. There’s clearly a lot more to be learned, and a versatile higher-energy collider is arguably the most likely place to go learn some of it.

Please note that these arguments in favor of this big future collider have nothing to do with either string theory or supersymmetry.

Good Arguments Against

What are the best arguments against building such a machine?  (Most do not appear in Hartsfield’s article.)

  • The solutions to the problems of the Standard Model, and to the naturalness puzzle, might not all lie at higher energy; perhaps some of the solutions are to be found elsewhere,
  • even if the solutions lie at higher energy, there is no guarantee currently that they lie within a factor of ten of the LHC’s energy,
  • the power consumption of such a machine would be high,
  • its potential cost is high, even in future dollars.

I consider these very serious arguments.  The last two arguments involve technological problems.  If we will only be able to use today’s technology in this future machine, then yes, these are going to be serious obstacles.  I can’t predict whether that situation will improve or not. Sometimes technological breakthroughs occur, and costs drop. 

The first two arguments, meanwhile, have been central in my thinking, and that of others, for decades. Even 25 years ago, I would have told you that if the LHC were to find only a Higgs boson and nothing else, then the argument for a future high-energy machine would become suspect. Naturalness argues that we ought to find something beyond the Standard Model at the LHC; if that argument is partially or completely evaded in nature, and the Standard Model survives all LHC tests, then it forces us to question whether the solutions to the naturalness puzzle and/or other mysteries of the Standard Model all lie at higher energy. Maybe the key questions can only be addressed by a change in paradigm, and might require a completely different experimental approach. It’s not obvious, without an additional LHC discovery, that a higher-energy machine would bring new insights. And without clear, indisputable evidence in current experiments that the Standard Model is breaking down in some way, there is no way to infer the amount of energy that a new machine would actually need in order to address these questions.

These concerns have intensified as the LHC continues to confirm the Standard Model’s predictions at current collision energies.   If particle physicists want to push hard for an expensive, higher-energy version of the LHC, they do need to find more evidence that this is really the best direction. All the more reason to squeeze every ounce of information out of the experiments that we currently have.

Why This is All Premature

But in many ways, this whole discussion is getting ahead of itself.  First, the LHC era is not over; far from it.   It mystifies me how many people assume that because the LHC’s data hasn’t yet contradicted the Standard Model, there’s clearly nothing for the LHC to find. In fact, there are still many search techniques which have not even been tried, much less optimized. [Witness the long-lived particle workshop held last week, among others that focus on strategies for seeking possible dark matter particles and their friends. And that’s far from the whole story.]

Second, the collider Hartsfield is referring to is not the next machine.  It’s the machine-after-next.  And the next machine is one for which the arguments are clear, strong, and have nothing to do either with supersymmetry or string theory — nor are the costs anywhere near 100 billion. This would be an electron-positron collider that produces lots of Higgs bosons and lots of top quarks, in an environment that makes them easier to study.  We’re not talking about pipe dreams of string theorists or supersymmetry; these two types of particles are known to exist. But they are still imperfectly understood, and could hide important secrets. Indeed, if the naturalness puzzle does have a solution, or at least is a path to new knowledge, these are the two types of particles that are by far the most likely to reveal it.

So let’s focus on the actual path ahead. The collider that Hartsfield refers to is two steps away, not one. Scientific and technological developments in the next decade or so may change the arguments for and against it, and sway the decision as to whether to build it or not. Science is about facing reality, and there’s no place for bogus arguments based on fantasies, such as those where the price tag’s 100 billion (do I hear 150? 200?), experimental and accelerator physicists don’t even warrant a mention, and the words “string theorist”, “supersymmetry devotee” and “theoretical particle physicist” are imagined to be synonymous.

43 Responses

  1. /Searching at higher energy continues a path of exploration that has been scientifically fruitful in the past./

    /It mystifies me how many people assume that because the LHC’s data hasn’t yet contradicted the Standard Model, there’s clearly nothing for the LHC to find./

    High-Energy (frequency), like the Ultraviolet Catastrophe, which inferred the presence of Quantum reality (standard model?), is an Axiom.
    All experiments will work until this axiom is exhausted?
    This “Paradigm” does not mean the parametrization (Modelization) of the large scale observations, which needs different Sets (Paradigm) of Parameters?

  2. Great job in providing a balanced view of how you believe experimental HEPs should move forward, as you did in your talk at “Celebrating the science of Joe Polchinski,” which can be seen on YouTube: Matthew Strassler – On Strings From Things and Things From Strings.

    It looks clear to me that the era of particle colliders representing cutting edge experimental HEP is over as we move into experimental cosmology via satellites orbiting Earth and the continued unimaginable imagination of young experimentalists and theorists working together to build and observe what seems impossible right now. I think Thomas Hartsfield is a voice of the younger generation making it clear to his peers that it’s OK to embrace this future rather than the increasingly old fashioned era of high energy particle accelerators.

    1. I don’t know how you can justify the statement in your last sentence. Hartsfield’s not a particle physicist, and his arguments in the article are junk. If his peers choose to embrace a future without colliders, that’s absolutely their right; but for goodness sake, they should not make the choice based on straw man arguments that are full of errors. That would not reflect well on them.

      Frankly, if Hartsfield is a voice of the younger generation, it’s a voice that others in his generation should scorn… just as I ignored many voices in my own generation that were spouting silliness and garbage.

  3. Quite apart from your point that the money wouldn’t go to other physics experiments in the first place (and that the FCC, or “LHC++” as they call it, would not cost anywhere near $100B), which makes the argument moot… there’s also the point that even if the money did go to other physics experiments, the idea that it would fund 100,000 of them is laughable. That’s $1M an experiment. A team of five people working on a physics experiment for 10 years is a very small experiment, and $1M wouldn’t even come close to paying half of their salary, let alone equipment costs and everything else.

    They even come close to pointing this out in their own argument: “There may not be enough physics labs on Earth to carry out that many experiments!”, which is of course correct: there aren’t enough physics labs on Earth for 100,000 new experiments… and $1M isn’t enough to build a typical physics lab, let alone fund the experiments.
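
    To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch; the per-person cost is an illustrative assumption, not a figure from the article or from this comment:

        # Rough check of the "$100B buys 100,000 experiments" arithmetic.
        total_budget = 100e9            # the article's hypothetical $100 billion
        n_experiments = 100_000         # the article's claimed number of experiments
        per_experiment = total_budget / n_experiments   # -> $1,000,000 each

        team_size = 5                   # a very small experiment
        duration_years = 10
        cost_per_person_year = 150e3    # ASSUMED fully-loaded salary + overhead

        salary_only = team_size * duration_years * cost_per_person_year  # $7,500,000
        print(f"budget per experiment: ${per_experiment:,.0f}")
        print(f"salaries alone, 5-person 10-year team: ${salary_only:,.0f}")
        print(f"fraction of salaries covered: {per_experiment / salary_only:.0%}")

    Even before equipment, operations, or lab space, the $1M figure covers only a small fraction of the people.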

  4. The discussion with Daniel and 4gravitons echoes much of my own sentiment, too. However, I do want to add this temporal issue: the field had better hurry up and build whatever the next collider is. The LHC was conceived in the mid-1980s and used a repurposed tunnel, so new construction was relatively minimal. The LHC only has about 15 years left, and so to prevent a gap of even a few years between colliders, ground needs to be broken now. Perhaps a muon collider at Fermilab could have a similarly quick schedule because the tunnels are already there. But an ILC, FCC, CEPC, etc., would all be enormous civil engineering undertakings, before any dreams of data collection.

    Otherwise there will be a generation of graduate students and post-docs who are not trained on a collider and not trained in experimental particle physics. While I haven’t verified it, I can’t think of a gap between energy-frontier colliders from the late 1960s (at least) until the present.

    1. Andrew, these are very serious concerns too. But one cannot move before one has funding, and one cannot get funding without a clear argument as to why funding should be provided. At higher energy, the only clear arguments, right now, are for a Higgs/top factory. That will be an electron positron machine of some type, unless the ambitious muon collider project can demonstrate enough likelihood of success and enough intermediate physics goals (e.g. neutrino beams) that it can be justified as well. (Meanwhile other colliders at lower energy but very high luminosity might be pursued.)

      I think success is likely to require being clever about repurposing existing facilities. A plan B for the e+e- machine or its tunnel, just in case there is neither political nor scientific will in the future to build a 100 TeV machine, might make it easier for people to accept the up-front costs of building such a large tunnel.

      1. Hi Matt, I completely agree. A Higgs factory was justified on July 4, 2012, and now it’s been 10 more years and there is still no clear next collider with broad global support. Perhaps Snowmass will tip the scales in a favorable direction for the field.

    2. Agreed with Andrew: the most pressing issue is not an energy-frontier collider itself, but ensuring that we don’t break the unbroken line of trained HE accelerator physicists stretching back to the birth of the field. If that means giving up on the energy frontier completely for 50 years and instead focusing on a “modest/cheap” Higgs factory or intensity frontier machine by repurposing the Tevatron or LHC tunnel, then that’s the realistic pill we need to swallow, in order to make sure we keep training people to build such things.

      And it must happen soon. I’m tempted to gloomily say that a lost generation in the field may permanently spell the end of the human race’s quest in fundamental physics. Because building an energy frontier collider from scratch, a century from now, with fresh physicists that have no hands-on training, is guaranteed to fail: it would be a managerial disaster on par with the SSC x10, and be cancelled after 5 years of floundering before ground is even broken.

  5. Is there a relationship between the strong force and the Maxwell Boltzmann statistics?

    Every point in space is literally a black hole and as long as radiation travels at the speed of light, the wave can pass through it. But as the speed lowers it will bend around the “center of mass” whether it’s millions of meters away or at the center of an electron.

    So, my point is, if there are no particles/resonances at any point, including at the center of an electron, then the “temperature”, energy state is at absolute zero, and hence the higher energy states will tend towards that zero point creating a “knot”, a very stable fundamental particle, electron, quarks, etc. So the energy states, the Maxwell Boltzmann distribution will essentially lock that particle in that configuration forever, the electron’s lifetime ~ 6.6 × 10^28 years.

    Here is the best part, the link from quantum to special relativity is from Maxwell Boltzmann’s statistics to the second law of thermodynamics.

  6. Matt,
    the two arguments about naturalness and the standard model details are only good arguments if the new accelerator can solve them. What is the evidence for this? I would bet $1000 that the new accelerator (or the new LHC upgrades) will not find anything beyond the standard model. If you could tell us a bit more about why the accelerator will help in solving the two issues, it would be great. (And of course I agree that the article is low quality.)
    Best
    Klaus

    1. If you read this blog post, you’ll see that I myself say explicitly that there is currently no evidence that a new accelerator (specifically, a 100ish TeV proton-proton collider) can solve them… and that such evidence is going to be needed, if the arguments in favor are to convincingly win the day against the arguments against.

      As Daniel Harlow and others point out above, the one guarantee is that we will learn something about how the Higgs boson interacts with itself. That *could* give us a big surprise. We would also learn a lot more about how the Standard Model works in extreme environments, which would be very interesting, though not necessarily worth such a high price tag.

      Therefore, I conclude, maximum effort should be made toward extracting all possible information from current experimental facilities, with no stones unturned. We still have a lot of unturned stones. Meanwhile, a Higgs/top “factory” has to come first before any 100 TeV collider, and what we learn there might change our perspective.

  7. A good argumentation.

    Even if the price tag were $100 billion, and I surely think that this amount should be spent for big experiments in particle physics in a decade, it is just 1/1,000 of the world’s annual GDP, $100 trillion. Some $10 trillion were wasted just because of the Covid hysteria, another $10 trillion because of the insane war in Ukraine, and so on, but those are things that these people don’t dare to address. They think it’s courageous to attack the hardest discipline of natural scientists and boast that by destroying and banning science, they can save an amount that is 1,000 times smaller than the obvious waste that is taking place.

    On top of that, as you say, they completely misunderstand what is going on and how science works, how experiments and theories interact, and what the relationship is between a hypothesis and an experiment. They think like naive attack dogs who mindlessly and repetitively defend some religious dogma, like two branches of Islam or something, and they pretend (or really believe) that the side of scientists is doing a mirror image of that. But it is not. It’s really cute that these people also use the slogan “Big Think” for themselves. It is one of the smallest ways of thinking that one may find.

    Add to that their permanent horror caused by the fact that theories may evolve, be updated, or contain parameters that may be adjusted. Those are the ultimate heresies. Needless to say, without either of these things, science would have been utterly impossible since the very beginning (Galileo, and even earlier in history), but why would they care? It’s enough for them when they find sufficiently mentally incapable listeners who consider any improvement of our knowledge and theories to be a despicable heresy, and the screaming “it is a heresy” may continue.

    1. Good to see you, Lubos, on this blog. I find Matt’s blog very educational, like yours. I used to learn a lot of mathematics and theoretical physics from your blog, sometimes after annoying you with some questions!! How can I sign up for your blog?

  8. It’s maybe also worth commenting on this “$1M experiment” notion. Pretending it might be true, it would not be a surprise if the LHC experiments publish in total well over 10k papers in the lifetime of the machine. So… success?

    1. Yes, you’re right that this is an important point. Some experiments done in a professor’s lab can only make one measurement; some make ten or a hundred. A big facility like a telescope or an accelerator can make tens of thousands. So it’s not so easy to decide what the cost is per unit knowledge… another way in which Hartsfield’s argument is crude at best.

  9. I have no comments on Hartsfield’s ignorant rant.

    In terms of arguments for new colliders, another argument which is worth mentioning is that we already have strong evidence for particle physics beyond the standard model via dark matter, baryogenesis, and neutrino oscillation. We don’t know that any of that will show up at a next-gen collider of course, but it might and anyways it shows that dragons are out there somewhere and they are well below the Planck scale.

    A tangent: I’m far from an expert, but the muon collider is looking more and more exciting to me as the future of US particle physics. It would even fit on the Fermilab site.

    1. Hi Daniel; I agree that we have evidence for physics beyond the Standard Model, though one should be careful about claiming it is all “particle” physics.

      But the problem remains that building an expensive machine to go fishing is hard to justify if there isn’t a guarantee of something within reach that is commensurate with the cost. The solutions are (a) bring the cost down, or (b) obtain evidence that the stakes are high enough, and the likelihood of success high enough, to justify the cost. Ideally, we’d have both (a) and (b).

      In the meantime, I think we have to consider whether we’re doing all of the clever experiments we should be doing; for instance, measurements of gravity’s force law at millimeter scales didn’t seem interesting to particle physicists, until suddenly, in 1998, they were. What else is like that? Cf. the work of people like Peter Graham et al. And also, are we fully exploiting the LHC? (Spoiler alert: No.)

      1. We also certainly have “non-particle” physics beyond the standard model, i.e. gravity, but I wouldn’t want to try to sell a collider based on that.

        The problem I have with the arguments you presented (naturalness and the mysterious structure of the standard model) is that they are ultimately aesthetic. Moreover, because of the epic failure of naturalness for the cosmological constant, and its continuing failure for the electroweak phase transition, I think basing arguments for future colliders on it is a bad idea both scientifically and politically. To me better arguments are 1) we already know there is new physics below the Planck scale so we might as well see if we can learn more about it and 2) we still need to understand the Higgs sector better, i.e. we still don’t actually know whether the potential is -\phi^2+\phi^4 or -\phi^2+\phi^6. The muon collider brings the added appeal of doing something really cool for the first time, which I think shouldn’t be discounted either politically or in terms of motivating people to work on it.

        1. I don’t think I agree about the “aesthetic” complaint. There’s clearly something conceptually deep regarding the questions of naturalness and the SM’s structure that we do not understand, even if the answer is that “nature did this by chance” — because we don’t yet understand how that would be possible. (Unlike the cosmological constant, the anthropic arguments for Higgs naturalness *do not work out of the box*; this is the “artificial landscape problem” that I’ve been talking about for ten years, notably at Joe P’s 60th conference, which I think you were at.)

          By contrast, questions such as the origin of neutrino masses or dark matter, while very important technically, may or may not turn out to be deep conceptually. It might just be that we add a couple of fields and a couple of terms to the SM Lagrangian, and then we’re done; will we really be any wiser? Neither one needs to show up at 10 TeV; both might be explained at a slightly reduced Planck scale of 10^12 TeV. And non-collider experiments may be more fruitful places to investigate them, so they don’t add much to arguments for a 100 TeV machine.

          Indeed, whether any of my questions or yours have an explanation far below the Planck scale isn’t obvious, and that’s a serious problem.

          As for the Higgs potential, I agree that’s important, but it’s hard to justify an entire collider just for that… unless we have some evidence that something funny is going on with the Higgs, or with the electroweak phase transition, from some other experiment.

          As for muon colliders, I’m totally in favor of doing the R&D. It’s going to take a while but someday they will be a fantastic tool… if the cost of building and operating one is reasonable. Too early to say.

          1. To butt in here:

            “As for the Higgs potential, I agree that’s important, but it’s hard to justify an entire collider just for that… unless we have some evidence that something funny is going on with the Higgs, or with the electroweak phase transition, from some other experiment.”

            I think one can argue that we do have some evidence of this sort, due precisely to the failure of naturalness for the EW scale at the LHC. Naturalness ultimately rests on the idea that it’s hard to get a low-energy parameter somewhere RG doesn’t “want it to go” without an extremely fine-tuned parameter at higher energy, and thus without a theory that is even harder to explain. That same principle would tell us that the Higgs potential cannot be -\phi^2+\phi^6, and must be close to -\phi^2+\phi^4 (with maybe also a small \phi^6 and higher terms), because a \phi^6 coupling would get smaller at lower energies.

            That seems to suggest that a Higgs factory would be an extremely strong test, verging on an experimentum crucis, of the naturalness paradigm in general. Naturalness at the LHC predicts that something beyond the Higgs should show up, but as you say there are still many things that could have been missed, so it doesn’t really rule naturalness out entirely, just makes it dubious. In contrast, the core principle behind naturalness (no fine tuning of high-energy parameters to get an RG-unfavored low-energy parameter) makes a definite prediction for the Higgs potential, namely that it has a large \phi^4 term, and this prediction would definitely be testable at a Higgs factory. If fine-tuning is actually fine, then there is no reason to expect a \phi^4 term at all, so this is really a unique prediction of the no-fine-tuning “model”, one that even predicts a specific numerical value for the coefficient!
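
            (To put a number on that, under the assumption that the higher terms really are negligible: in the usual normalization V = -\mu^2 |\Phi|^2 + \lambda |\Phi|^4, the measured m_h ≈ 125 GeV and v ≈ 246 GeV fix \lambda = m_h^2/(2 v^2) ≈ 0.13, or equivalently an h^3 coupling of 3 m_h^2/v. That is presumably the specific numerical value being referred to.)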

            But ok, I am not a particle phenomenologist, and the only other person I’ve seen make an argument like this (Daniel above) is even less of one. So I kind of suspect we’re both confusing something basic here, since I’ve never heard a particle phenomenologist say anything like this. When I talk to people about it, the one thing that tends to get brought up is positivity bounds, but talking to some of the people that work on those it’s far from clear that they guarantee a \phi^4 term. Maybe they do, but I don’t think anyone’s shown that yet. So if there’s something else Daniel and I are missing I’d love to hear it!

            (By the way, “fine-tuning is fine”, at least in my view, doesn’t mean “anthropic multiverse”. It can also mean “there are inexplicable brute facts about the universe” – this is Sabine Hossenfelder’s take as far as I can tell. And it can mean “there is some form of UV-IR mixing” which would be weird and nonreductionist but maybe there’s some way to make it work? It’s kind of a thing in holography anyway.)

            1. I agree that “fine tuning is fine” does not imply “anthropics”; I just used the latter as an example. Conversely, anthropics does not generally imply that fine tuning is fine, as I pointed out with the artificial landscape problem.

              Regarding naturalness and a Higgs H^6 potential: what we actually measure is the local potential of fluctuations h around the minimum H=v. Once you have three potentially important terms in the Higgs potential (H^2, H^4, H^6), the h^3 interaction is not uniquely determined. You will need to measure both the h^3 and h^4 interactions to determine the potential, and that isn’t easy. The only thing that can easily be measured is h^3, and that won’t be definitive.

              Moreover, if the H^6 interaction is important, it will induce an H^4 term, not from some high scale cancellation, but just from renormalization within the effective theory between 10 and 1 TeV. So to say the H^4 term is zero is not meaningful. At what scale are you claiming that this “zero” applies?
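
              To spell out the counting in a toy model (a single real field H and a simplified normalization, so this is a sketch rather than the full doublet/SMEFT analysis): take

                  V(H) = -(a/2) H^2 + (b/4) H^4 + (c/6) H^6 .

              Minimizing at H = v gives a = b v^2 + c v^4, and the curvature there gives the mass, m^2 = 2 b v^2 + 4 c v^4. Eliminating a and b in favor of the measured m and v, the cubic coupling of the fluctuation h = H - v is

                  V'''(v) = 6 b v + 20 c v^3 = 3 m^2/v + 8 c v^3 ,

              which collapses to the familiar 3 m^2/v only when c = 0. With c unknown, knowing m and v does not pin down the h^3 coupling.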

                1. I wouldn’t say we need to phrase this discussion about the Higgs potential in terms of naturalness; we are just testing the hypothesis that the standard model is valid to scales which are high compared to the electroweak scale. In terms of the Taylor-expanded Higgs potential, I believe the situation is the following (although I haven’t thought about it in a while): given that we have measured the Higgs mass and vev, and also directly measured e.g. the top quark Yukawa via the Higgs production cross section, the renormalizable standard model makes predictions for both the cubic and the quartic self-interactions of the Higgs. By adding not-too-suppressed irrelevant operators to the Higgs potential, I think we can mess up both predictions. One example of this is setting the coefficient of H^4 to zero at the electroweak scale and using H^6 to generate the vev, but I (of course) don’t have any particular attachment to that model. Rather, I’d just like to see whether the above two predictions of the standard model are correct, as otherwise we will have falsified the hypothesis that the standard model is self-consistent to much higher energies. I emphasize that until we confirm these predictions, any claim that we have verified the vanilla standard model rests on assuming that there is no new physics at low-ish energy.
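
                   In the toy normalization sketched a couple of comments above, that example corresponds to setting b = 0 at the weak scale: then m^2 = 4 c v^4, so c = m^2/(4 v^4), and the cubic becomes 20 c v^3 = 5 m^2/v rather than the standard 3 m^2/v, i.e. a trilinear about 5/3 of the usual prediction. (Only a toy estimate, but it illustrates how an H^6-generated vev would show up as an O(1) shift in the h^3 coupling.)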

                1. There are a lot of ways that the vanilla standard model could break down if there is new physics at low-ish energies. You’re focusing on the renormalizable interactions of the Higgs, but you’re using a non-renormalizable interaction to get them to be out of whack, and there are many other non-renormalizable operators to add in the Standard Model effective theory. They’d all be accessible at an LHC successor. But many of them would be accessible sooner, through precision measurements at lower energy. So I think we should focus on them first. If we find problems among those predictions, then we know 100 TeV is a good target energy.

              2. Your last paragraph points out something I was legitimately being dumb about: yes of course you’d get an H^4 generically at some scale. You can make it very small at every relevant scale though (say, every scale close to the EWSB scale at least), if you’re willing to accept arbitrary fine-tuning, right?

                Regarding your second paragraph, yes, if all the terms are large you would need to measure both h^4 and h^3. But if all terms are large, it would be a very surprising coincidence if h^3 had exactly the value you’d predict if the H^6 term was negligible, right? So I think observing h^3 at very close to the predicted value would still be a massive win for the “fine tuning is bad actually” camp.

                1. A top quark loop will also induce an H^4 term, by the way, and not a particularly small one.

                  I think this discussion is too nebulous. Give me a theory that has a principle that predicts the H^4 term is tiny at some scale, while the H^6 term at that scale is large enough to generate the 246 GeV Higgs expectation value, and then let’s talk.

                2. Ok, the latter comment makes me think you’re misunderstanding my argument.

                  There is a principle that underlies naturalness arguments. I’ve paraphrased it as “no fine tuning”, but it can equally well be phrased as “theories should be based on principles”. It’s the idea that, for every feature of the low-energy theory, there is a high-energy theory that doesn’t merely give that result “because the parameters are such and such”, but actually “explains” it, by being set up in a way such that that is the only (or the “natural”) way things end up.

                  That’s what’s at stake here. If someone is ok with fine-tuning (for vaguely justified multiverse reasons, Hossenfelderian “there are brute facts and we just have to accept them” reasons, or some other reasons), then they don’t think facts at low energy need to be determined by principles at high energies, full stop. And if someone has that attitude, I’m arguing that a Higgs factory is a uniquely good machine to prove them wrong and show that the “fine-tuning is bad” side actually has a point.

                  Based on what you’ve been saying to Daniel, there are two potential issues with the argument that I’d like to ask you about:

                  First, are there naturalness-preserving ways to make the h^3 term dramatically different from predicted? I know people are looking for various deviations of this term, but is there any possibility compatible with naturalness of an O(1) discrepancy?

                  Second, you brought up with Daniel that there are other dimension-six operators that can be tested at the LHC and HL-LHC. The impression I have is that all of these are already constrained to be “small” by the fact that they’re not competing with tree-level SM processes. Are there any that could be competitive with already-known renormalizable interactions, and aren’t sufficiently constrained yet?

                  (Do the SMEFT folks publish an updated list of constraints on each SMEFT parameter? I really ought to ask some of the local SMEFT folks here…)

                  Essentially, I think that a big difference between “fine tuning is bad” and “fine tuning is fine” (or “theory is determined by describable principles” vs “theory is determined by empirically findable brute facts”) is that the latter has no reason to expect higher-dimension operators to be smaller than lower ones, at any accessible energy. For most such operators, we happen to know this is not the case, the renormalizable operators dominate. We don’t know this for the Higgs, so it’s kind of the last place that perspective would make really crisply different predictions from the physics mainstream.

  10. I agree with you completely as to what is next, with one addition or slight change.
    I would suggest an intermediate step, different from “just” an HL-LHC.
    That would be an additional detector or detectors for the LHC (plain or HL) which are specifically for looking at things that are NOT tagged by high transverse momentum. This would include exotic bound states as well as looking more closely at the internal details of jets. In other words, more searches for non-perturbative things.

    Presumably this idea would carry over to an electron-positron device too.

    1. The question of whether to equip the LHC with new detectors that can allow it to search for physics that the current detectors would miss is one that deserves our full attention, I agree. That is certainly cheaper than building an entirely new facility, and we should get everything we can, within reasonable cost, from the existing machine. Many such detectors have been proposed and at some point I’ll write about some of them. Do you have a specific one in mind?
