Day 3 of the SEARCH workshop (see here for an introduction and overviews of Day 1 and Day 2) opened with my own talk, entitled “On The Frontier: Where New Physics May Be Hiding”. The issue I was addressing is this:
Even though dozens of different strategies have been used by the experimenters at ATLAS and CMS (the two general purpose experiments at the Large Hadron Collider [LHC]) to look for various types of new particles, there are still many questions that haven’t been asked and many aspects of the data that haven’t been studied. My goal was to point out a few of these unasked or incompletely asked questions, ones that I think are very important for ATLAS and CMS experts to investigate… both in the existing data and also in the data that the LHC will start producing, with a higher energy per proton-proton collision, in 2015.
I covered four topics — I’ll be a bit long-winded here, so just skip over this part if it bores you.
1. Non-Standard-Model (or “exotic”) Higgs Decays: a lightweight Higgs particle, such as the one we’ve recently discovered, is very sensitive to novel effects, and can reveal them by decaying in unexpected ways. One class of possibilities, studied by a very wide range of theorists over the past decade, is that the Higgs might decay to unknown lightweight particles (possibly related in some way to dark matter). I’ve written about these possible Higgs decays a lot (here, here, here, here, here, here and here). This was a big topic of mine at the last SEARCH workshop, and is related to the issue of data parking/delaying. In recent months, a bunch of young theorists (with some limited help and advice from me) have been working to write an overview article, going systematically through the most promising non-Standard-Model decay modes of the Higgs and studying how easy or difficult it will be to measure them. Discoveries using the 2011-2012 data are certainly possible! And at least at CMS, the parked data is going to play an important role.
2. What Variants of “Natural” Supersymmetry (And Related Models) Are Still Allowed By ATLAS and CMS Searches? A natural variant of supersymmetry (see my discussion of “naturalness”=genericity here) is one in which the Higgs particle’s mass and the Higgs field’s value (and therefore the W and Z particles’ masses) wouldn’t change drastically if you were somehow to vary the masses of superpartner particles by small amounts. Such variants tend to have the superpartner particle of the Higgs (called the “Higgsino”) relatively light (a few hundred GeV/c² or below), the superpartner of the top quark (the “top squark”, with which the Higgs interacts very strongly) also relatively light, and the superpartner of the gluon (the “gluino”) up in the 1-2 TeV/c² range. If the gluino is heavier than 1.4 TeV/c² or so, then it is too heavy to have been produced during the 2011-2012 LHC run; for variants with such a heavy gluino, we may have to wait until 2015 and beyond to discover or rule them out. But it turns out that if the gluino is light enough (generally a bit above 1 TeV/c²), it is possible to make very general arguments, without resorting to the three assumptions that go into the most classic searches for supersymmetry, that almost all such natural and currently accessible variants are now ruled out. I say “almost” because there is at least one class of important exceptions where the case is clearly not yet closed, and for which the gluino mass could be well below 1 TeV/c². [Research to completely characterize the situation is still in progress; I’m working on it with Rutgers faculty member David Shih and postdocs Yevgeny Kats and Jared Evans.] What we’ve learned is applicable beyond supersymmetry to certain other classes of speculative ideas.
3. Long-Lived Particles: In most LHC studies, it is assumed that any currently unknown particles that are produced in LHC collisions will decay in microscopic times to particles we know about. But it is also possible that one or more new types of particles will decay only after traveling a measurable distance (about 1 millimeter or greater) from the collision point. Searching for such “long-lived” particles (with lifetimes longer than a trillionth of a second! — see the worked numbers just after this list) is complicated; there are many cases to consider, a non-standard search strategy is almost always required, and sometimes specialized trigger strategies are needed. Until recently, only a few studies had been carried out, many with only 2011 data. A very important advance occurred very recently, however, when CMS produced a study, using the full 2011-2012 data set, looking for a long-lived particle that decays to two jets (or to anything that looks to the detector like two jets, which is a bit more general) after traveling up to a large fraction of a meter. The specialized trigger that was used requires about 300 GeV of energy or more to be produced in the proton-proton collision in the form of jets (or things that look like jets to the triggering system). This is too much for the search to detect a Higgs particle decaying to one or two long-lived particles, because a Higgs particle’s mass-energy [E=mc² energy] is only 125 GeV, and it is therefore rather rare for 300 GeV of energy in jets et al. to be observed when a Higgs is produced. But in many speculative theories with long-lived particles, this amount of energy is easily obtained. As a result, this new CMS search clearly wipes out, at one stroke, many variants of a number of speculative models. It will take theorists a little while to fully understand the impact of this new search, but it will be big. Still, it’s by no means the final word. We need to push harder, improving and broadening the use of these methods, so that decays of the Higgs itself to long-lived particles can be searched for. This has been done already in a handful of cases (for example, if the long-lived particle decays not to jets but to a muon/anti-muon pair or an electron/positron pair, or if the long-lived particle travels several meters before it decays), and in some cases it is already possible to show that at most 1 in 100 to 1000 Higgs particles produce long-lived particles of this type. For some other cases, the triggers developed for the parked data may be crucial.
4. “Soft” Signals: A frontier that has never been explored, but which theorists have been talking about for some years, is one in which a high-energy process associated with a new particle is typically accompanied by an unusually large number of very low-energy particles (typically photons or hadrons with energy below a few GeV). The high-energy process is mimicked by certain common processes that occur in the Standard Model, and consequently the signal is drowned out, like a child’s voice in a crowded room. But the haze of a large number of low-energy particles that accompanies the signal is rare in the mimicking processes, so by keeping only those collisions that show something like this haze, it becomes possible to throw out the mimicking process most of the time, making the signal stand out — as though, in trying to find the child, one could identify a way to get most of the people to leave the room, reducing the noise enough for the child’s voice to be heard. [For experts: The most classic example of this situation arises in certain types of objects called “quirks”, though perhaps there are other examples. For non-experts: I’ll explain what quirks are some other time; it’s a sophisticated story.]
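As promised in point 3, a quick aside on the numbers there, using only standard kinematics: a particle with proper lifetime τ travels an average distance

$$ L = \beta\gamma\, c\tau, \qquad c\tau \approx (3\times 10^{8}\ \mathrm{m/s}) \times (10^{-12}\ \mathrm{s}) \approx 0.3\ \mathrm{mm} \quad \text{for } \tau = 10^{-12}\ \mathrm{s}, $$

so a lifetime of a trillionth of a second corresponds to a flight distance of roughly a millimeter once the boost factor βγ is of order one or larger; that is why “about 1 millimeter” and “a trillionth of a second” are two sides of the same coin.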
I was pleased that there was lively discussion on all of these four points; that’s essential for a good workshop.
After me there were talks by ATLAS expert Erez Etzion and CMS’s Steve Wurm, surveying a large number of searches for new particles and other phenomena by the two experiments. One new result that particularly caught my eye was a set of CMS searches for new very heavy particles that decay to pairs of W and/or Z particles. The W and Z particles go flying outwards with tremendous energy, and form the kind of jet-like objects I mentioned yesterday in the context of Jesse Thaler’s talk on “jet substructure”. This and a couple of other related measurements are reflective of our moving into a new era, in which detection of jet-like W and Z particles and jet-like top quarks has become part of the standard toolbox of a particle physicist.
The workshop concluded with three hour-long panel discussions:
- on the possible interplay between dark matter and LHC research (for instance: how production of “friends” of dark matter [i.e., particles that are somehow related to dark matter particles] may be easier to detect at the LHC than production of dark matter itself)
- on the highest priorities for the 2013-2014 shutdown period before the LHC restarts (for instance, conversations between theorists and experimentalists about the trigger strategies that should be used in the next LHC run)
- on what the opportunities of the 2015-2020 run of the LHC are likely to be, and what their implications may be (for instance, the ability to finally reach the 3 TeV/c² mass range for the types of particles one would expect in the so-called “Randall-Sundrum” class of extra-dimensions models; the opportunities to look for very rare Higgs, top and W decays; and the potential to complete the program I outlined above of ruling out all but a very small class of natural variants of supersymmetry.)
All in all, a useful workshop — but its true value will depend on how much we all follow up on what we discussed.
46 Responses
My mistake … if we apply the i versor to a scalar, the result is a 90 degree rotation applied to the scalar number… If we multiply the i versor with itself, a second 90 degree rotation is added to the original 90 degree rotation that the i versor itself “contains”; thus, i^2 == -1.
In the case of Minkowski’s space, the i versor applies a 90 degree rotation to the space of the 3 real dimensions, which means that time is in a fourth dimension that is normal (perpendicular) to the other 3 real dimensions that we know of.
BTW, Minkowski’s space-time is a four dimensional manifold in a complex space, so, when we mix complex variables with integer powers, many “funny” things happen with the sign.
Complex variables always have at least one imaginary coordinate, that is clearly identified by the i versor (a vector of unitary magnitude or size).
The i versor has some interesting “geometric” properties.
When we multiply the i versor with itself, it behaves in such a way that we can say that it rotates 90 degrees in the positive direction of angles (depending on the convention selected for positive angles).
This rotation takes place within the same plane that contains the i versor.
So, if we have an i versor and multiply it once with an i versor, it will rotate 90 degrees. If we apply the same transformation again, it will rotate again another 90 degrees, so, at the end of the second transformation, the original i versor will have accumulated a 180 degree rotation, which means that it is now pointing in the opposite direction in comparison to the original direction.
This is the same as saying that it has changed its sign, thus, i^2 == -1.
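In standard polar notation, this geometric argument is the one-line identity

$$ i = e^{i\pi/2}, \qquad i^2 = e^{i\pi} = -1, $$

i.e., two successive 90 degree rotations add up to a 180 degree turn, which flips the sign.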
In short, when it comes to particle physics, we just can’t play word games with Energy and Mass and Waves and Particles: we have to follow the rules of Physics.
Kind regards, GEN
Nice argument, but that is not the way to determine the outcomes of particle-antiparticle pairs.
When I say outcomes, that is not a typo: in most cases, any given particle-antiparticle pair may end up having many possible outcomes.
These computations are really complex and difficult to follow through to get the right answers.
This is the result of the stochastic (probabilistic) behaviour of quantum particles.
The computations required to determine the outcomes are based on Path Integral formulas originally devised by Richard P. Feynman, and the way to simplify and account for all possible paths (all possible outcomes down the line) is with the use of Feynman diagrams.
Path Integral calculations and Feynman diagrams take into consideration all relevant aspects of the behaviour of quantum particles, including all conservation laws at play.
Regarding negative energies, positive masses and the “equivalence” of mass and energy, there are a few things (and subtleties) that have to be considered that a bird’s eye view approach may miss.
To start with, the equivalence of mass and energy is the result of a Pythagorean relationship between Energy, Momentum and Mass, and not just a linear and direct relationship between Energy and Mass alone.
This Pythagorean equation considers an instantaneous equivalence, that is, an equivalence that exists at a given “point” in space-time (any such point has 4 values as coordinates), and even though that equivalence stays true at all times, the actual numeric values change with the coordinates in space-time.
When we compare a starting particle-antiparticle pair with the next particle-antiparticle pair within the same decay path, we have to consider all these elements (Energy, Momentum and Mass, and the Pythagorean relationship between them that involves the square or second power of each term, etc.).
If we do the math with these equations, it is very simple to see how energies can have different signs while masses remain positive, for the cases of particle-antiparticle pairs that present this particular behaviour.
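The “Pythagorean relationship” referred to here is the standard energy–momentum relation of special relativity,

$$ E^2 = (pc)^2 + (mc^2)^2, $$

which reduces to the familiar E = mc² only for a particle at rest (p = 0); the values of E and p depend on the observer’s frame, while the invariant mass m does not.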
Kind regards, GEN
BTW, bosons like photons are their own antiparticle, and that is a rather “quirky” case: they both have positive energies!
But they were chasing after the wrong set of forces that allowed a “smooth” form of unification. It was Glashow, Weinberg and Salam that proposed a robust unification model: the electroweak force.
Einstein, and then Kaluza and Klein, just to name a few.
Does the existence of antimatter suggest that there are other dimensions with negative energy?
Our measurements, thus far, indicate that the universe is flat. It could also be so much bigger than we can imagine that we only measure a small sector of a closed universe.
If one combines the two concepts, negative energy and closed universe, is it invalid to say that energy is the fluctuation of space-time caused by a difference in temperature between the two domains?
I know negative energy is used in theories on wormholes, but I am not talking about wormholes, rather a very, very large closed universe with a variable temperature shell. We see this model everywhere, from stars to planets down to the atom and below.
Membranes created by positive and negative energy.
Mr Oak Tree, this is what I am also contemplating.
This is an opinion, as I’m no subject matter expert in this field, but I do not get this “extra dimensions” thing. When Dirac came to ponder the possible consequences of the negative sign of that famous square root, he started to realize that it was not a mere mathematical artifact with no actual physical meaning; in fact, there was ample room for a very distinct physical meaning in considering both roots of that quadratic equation. That quadratic equation was defined within the boundaries of the “normal” dimensions we know of (3 spatial dimensions and 1 temporal dimension), so there is no reason to consider extra dimensions to explain anti-matter: negative energy, indeed, but within the very same dimensions we know and cherish.
There’s no relation between anti-matter and extra dimensions. Anti-particles are required by quantum field theory, period; if you combine special relativity and quantum mechanics, anti-particles are required for consistency.
Extra dimensions are a conceptually separate possibility. Such additional dimensions (or something even more exotic) may be required also, but by some other consistency condition. That’s what happens in string theory, for instance.
Not only are anti-particles required by Quantum Field Theory: they show up all the time during HEP experiments. In the case of electrically charged particles, the experiments show how the particle and its anti-particle spin in opposite directions when exposed to a magnetic field, and that is a clear indication that anti-matter is manifesting itself within the very same dimensions that matter does.
String theory proposed a model with many more dimensions to be able to accommodate all known forces in a unified framework, or so is my understanding of how the “excess” dimensions appear.
Note that extra dimensions were proposed long before string theory; Einstein worked with this possibility for much of his last 30 years of research.
I believe string theory was the first theory to *require* either more dimensions or something even more exotic.
“In the case of electrically charged particles, the experiments show how the particle and its anti-particle spin in opposite directions when exposed to a magnetic field, and that is a clear indication that anti-matter is manifesting itself within the very same dimensions that matter does.”
Not so simple, my friend. When both particles come “in contact”, their energies are nulled. Conservation of momentum, yes, but what is really happening? What is the fundamental definition of energy to explain conservation laws?
What I am asking is: is there a “tug of war” of sorts between two or more spatial frames that are creating the space-time fluctuations we see on “our side”? In other words, can a system, our manifold, create its own energy, or can energy only be created through interactions with adjacent systems? Hence, because we see the conservation of energy, it would logically suggest there must be negative energies to create the boundary conditions that give rise to and explain antimatter.
Why separate spatial frames? … Because the physics we have is telling us that as we go down in scale it leads to “nothingness”, and what is this nothingness which has created such a magnificent universe and a lot of energy (motion)? … spatial fluctuations? … the tug of war for space-time?
Just a thought … 🙂
Any particle and its antiparticle do not just “cancel each other out”, the conservation equations are a little bit more convoluted than that.
Since we are having an amicable conversation as guests of Matt’s blog, it makes a lot of sense to refer back to one of Matt’s articles on particle-anti-particle interactions.
In one of his articles pertaining to this subject, he uses a very popular expression regarding these conservation rules:
particle1 + anti-particle1 = particle2 + anti-particle2
This is a rather interesting expression, since it is elegant and at the same time, very eloquent without the need of using too many letters.
Due to the equivalence of mass and energy, these conservation equations can get rather quirky at times, depending on the characteristics of the initial pair of particles (are they fermions? are they bosons? what kinds of charges do they have? electric charges? color charges? ???).
Even though energies do cancel out, masses do not cancel out (energies could be either positive or negative, but masses are always “positive”).
That means that the sum of the initial masses must turn into something equivalent in the final pair of particles, like say the “invariant” masses of the final particles, or particles with zero “invariant” mass but with “relativistic” mass, like a pair of photons, as long as the sum of the initial masses is equal to an equivalent amount of mass/energy in the final pair of particles.
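A textbook instance of that bookkeeping, added here for illustration: electron–positron annihilation into two photons,

$$ e^- + e^+ \to \gamma + \gamma, $$

where, for a slow electron and positron, each photon carries energy $E = m_e c^2 \approx 0.511\ \mathrm{MeV}$. Each photon individually has zero invariant mass, yet the two-photon system as a whole has an invariant mass equal to that of the initial pair, so the rest-mass energy is fully accounted for.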
The physics might be “eloquent”, as you put it, within the boundaries, but I am inquiring what is happening at the boundary (or boundaries)?
“Due to the equivalence of mass and energy, … Even though energies do cancel out, masses do not cancel out (energies could be either positive or negative, but masses are always “positive”).”
You mention the equivalence of mass and energy and yet you quickly differentiate the two concepts in your very next paragraph … “masses are always positive”. Masses are always positive … now what does that mean if energies can be positive and negative?
I hope I am saying this properly, please correct my physics if I go off in left field, 🙂 . If all masses were to null out, the universe would consist of just “plane waves”, collinear and of one frequency, I presume. An analogy I like to use is ocean waves (as the plane waves), where whitecaps are the mass particles created by some external disturbance, what I referred to as the “tug of war” between two (and maybe more) spatial frames.
The concept that escapes us, I dare say, is that we may be looking for the boundaries in the wrong directions. The boundaries are neither at infinity (time) nor at zero (spatial point), because neither exists. And this is why I say the universe must be closed. Whether it is a spherical shell, a balloon, or some very convoluted closed surface, I don’t know. What I do think is that every point touches the boundary; I cannot visualize it, I’m afraid to say.
I mean, can physics ever prove that many variations of constants, principles, and rules can generate a life-permitting universe?
It is difficult to imagine proving anything definitively. Working out the full consequences of any set of laws of nature is extremely difficult in the best of circumstances and impossible in most circumstances. However, if the universe lasts a billionth of a billionth of a second, or has no objects in it larger than a proton other than black holes, it probably isn’t suitable for life.
I will try to respond without interjecting my own ideas. I am merely trying to raise some possibilities in the minds of others that do not seem to be given consideration here.
Is it possible that “super-symmetry” does indeed exist, but is not a reflection of extra dimensions but of an underlying sub-particle structure that constitutes both fermions and bosons? Is it possible that, in this sense, a W- is the super-symmetric partner of the electron? I know that “preon” and “sub-quark” models have been constructed in the past, but I doubt we have exhausted the possibilities. I am just saying it might be an avenue worth pursuing.
Yes, this kind of thing was proven to be impossible in the 1970s. It is known as the Haag–Lopuszanski–Sohnius theorem.
The (left-handed) electron forms a doublet with the neutrino [i.e. the two are paired] under weak isospin (the “charge” of the weak nuclear force).
The W+, W- and Z0 form a triplet under weak isospin.
Note that these statements are not conjectures; they have been experimentally tested, to a precision and accuracy of 1% or better, at the LEP collider and previous experiments.
The Haag–Lopuszanski–Sohnius theorem (and earlier Coleman-Mandula theorem) then says that (in a world with three spatial dimensions) if there is a symmetry which relates the electron to the W [i.e., an object in a doublet of isospin to an object in a triplet of isospin], then the theory with this symmetry cannot exhibit anything resembling normal particle scattering. That’s contrary to experiment of course, since scattering happens all the time.
And the Haag-et-al. theorem further says that the only symmetry which you can add to Einstein’s special relativity that relates particles with different spins is the standard form of supersymmetry.
So it’s not that people’s minds are closed. They considered these types of options several decades ago — until a rigorous theorem slammed the door shut.
In short: if you want to raise this possibility, go study the Haag-et-al. theorem and find a loophole. Maybe there is one… but the onus (since many have tried and failed) is on you.
Where can I find refutations of fine tuning, as you claimed that there are good such papers? No real refutation is possible… Prove fine tuning.
Fine-tuning is not something you refute… I don’t understand your question.
Hello professor how are you?
Here are a couple of irrelevant questions for you and/or your readers.
1. Is supersymmetry equivalent to a singularity?
2. Is a singularity a state where a space (energy) is confined to one (and only one) uniform temperature (albeit extremely high)?
3. Is the Big Bang the condition where all these points of maximum temperature combine (connect) to one space (one singularity) and release again for cycle of this universe?
4. Do we have it backwards, is the visible universe organized chaos and the singularity a very unstable state of order? i.e. one temperature is order while a variable temperature medium is chaos?
5. Is the attractive force we call gravity the mechanism (the physics) which drives the universe back to order, while the rest of the fields, like the fermions and quarks and bosons, are the by-products of variable energy density fluxes caused by the singularity (as defined above) expanding and hence cooling in a random profile (probably the best way to define infinity, i.e. maximum permutations)?
Oh yes one more, why is this blue moon stuck in one place outside my window? Smallest orbit?
Mr Oak Tree,
/1. Is supersymmetry equivalent to a singularity?/– This is my question also.
Electrons and positrons were popping in and out of the vacuum. We measure only the observable dimension. We cannot go back (predict) into the negative light cone (past events), because of entropy. We cannot predict the future (positive light cone), because of unfolding. So the speed of light is frozen, and constant everywhere – it may be scalar (constancy), but light is not scalar. This constancy creates the “rest mass”. Ref Mr Arkani-Hamed: The term massless particles (photon) is an artifice from which we can imagine figments like gauge invariance.
Spacetime may also be a mathematical artifice to manifest this vacuum priori. But the singularity is prevented by Time dilation. Singularity may occur at high energies like the Planck scale.
Our experiments may be limited to some angles like this?
There is no connection between supersymmetry (a larger symmetry of space and time than Einstein’s special relativity) and a singularity (something infinite, whose meaning depends on what type of singularity you’re talking about, but in the case of space and time typically refers to a place where space-time curvature becomes infinite.)
“… but in the case of space and time typically refers to a place where space-time curvature becomes infinite.”
I forget in which of your articles I posed a question about quantum confinement, and I called it a spinor, probably the wrong word to use. But my inquiry is whether this infinite curvature of a space-time point, probably the smallest unit(s) of both space and time, could be the boundary of the visible stuff, which we are part of. I hate the word ether, but the principle is similar: it is the boundary conditions that are creating the fields which make up the oscillations (ripples).
So I ask, could this type of singularity exist at every “point” in the universe? The definition of a point is beyond my math ability. Whether it has dimensions or is just a very small sphere of nothing with a continuous circular flow of “energy” which is forcing space to a closed curve and hence stopping time (smallest unit of time) as well.
All the consistent measurements at the LHC are not necessarily the physical reality of nature – this is what Mr Nima Arkani-Hamed has said, “naturalness is wrong” – and there is room for the unnatural (extra dimensions?)?
But supersymmetry and string theories existed to tinker with the Standard Model?
Max Planck tried to grasp the meaning of energy quanta, but to no avail. “My unavailing attempts to somehow reintegrate the action quantum into classical theory extended over several years and caused me much trouble.” Even several years later, other physicists like Rayleigh, Jeans, and Lorentz set Planck’s constant to zero in order to align with classical physics, but Planck knew well that this constant had a precise nonzero value. “I am unable to understand Jeans’ stubbornness — he is an example of a theoretician as should never be existing, the same as Hegel was for philosophy.”
Thanks for the fascinating article. Please keep being “long-winded”!
Prof. Strassler,
I commend you for doing a great job in keeping the general audience honestly apprised of the research work and its challenges at the LHC.
Many theorists nowadays choose to display a pessimistic attitude on the future of particle physics. It is important to remind everyone that we’re still early in exploring the Higgs sector and the possible signatures of BSM phenomena. “Throwing in the towel” is likely premature, considering the steady stream of data that will continue to come out of the LHC, searches at neutrino detectors and the on-going astrophysical observations.
Sorry. I did not word it properly. I am curious about at what point researchers like you would conclude that there is no SUSY? Some unnamed people are hoping that it may be at extremely high energy, unreachable by any realistic accelerator! Thanks.
You’ve got to distinguish “there is no SUSY” (which it will be impossible to conclude in the next century or two) from “there is no SUSY at scales that are relevant for the hierarchy problem” (on which the LHC has already had, and will continue to have, enormous impact.)
Deciding when to conclude something for which there isn’t 100% proof is very difficult. It is a judgment call. I don’t think there will be a day when I say to myself: “today I’m absolutely sure there is no SUSY just out of reach of current experiments”. On the other hand, I never said, “today I’m sure there *is* SUSY just out of reach of current experiments”. This is something which at one point was somewhat plausible — and was once a bit more plausible than most other options, all of which were pretty bad — that will gradually become less and less plausible.
Generally, there’s no value to saying you know something for certain (i.e., drawing a “conclusion”) when in fact you don’t know. It’s my style to continue to keep an open mind. For what it is worth, I no longer consider supersymmetry at LHC energies more plausible than the appearance of something completely unexpected and unknown to humans.
@kashyap vasavada; “I am curious about -at what point researchers like you would conclude that there is no SUSY?”
This is indeed a very important issue of today. Professor Matt Strassler has given a very professional answer. Yet, for an old grandmother, this issue can be viewed from a different angle.
Her granddaughter (the sleeping beauty) will wake up if she can take the elixir which is locked in a vault (protected by 10 steel doors). The legend says that only the Witch of the West has a master-key (called SUSY) for all those 10 doors. Before sending her puppy searching for it, she makes a search analysis.
1. Her baby is in the bed called “universe” which has three parts.
a. visible part (VP) —- it has an “event-gate”. That is, every “event” in VP must be inside or pass through this gate.
b. invisible part (IP) —- outside of the visible part. Yet, any encounter (or contact) with this VP from IP must go through this VP gate.
c. Contact horizon (CH) —- anything beyond CH has no “meaning” for getting the elixir.
So, the search issue is very simple. She must find out “one” answer: where is the VP gate? By knowing that there is no “contact” at the VP gate, the chance for her to get the elixir is nil. Of course, it will be nice if she knows about the CH. If she knows that SUSY is beyond the CH, then SUSY has no “meaning” to her any more.
Thus, the issue is not about whether there is SUSY or not. The issue is about the VP gate (perhaps a tad about the CH). If the Witch of the West comes to visit this universe less than once in the lifetime of this universe, there will be no SUSY for the grandmother.
2. One day, one of her friends comes and opens all the 10 doors. She will instantly forget that SUSY legend.
SUSY (with s-particles) has been with us for over 40 years. Among the voices, only Professor Matt Strassler stands for physicists’ integrity, while many others have changed SUSY into either a religion or a name-calling game. One blog article made the following points.
a. The chance for the LHC to discover SUSY is not more than 50%, as … a 100 TeV machine is needed for reaching the SUSY.
Note: This is a “catch-me-not-game”. If you stop trying to catch me, you lose. If you keep trying, I can always go one step beyond your reach.
b. Someone will defend SUSY even after his death.
Note: This becomes a great new religion, making the afterlife very meaningful.
c. With a super-SUSY equation:
anti-SUSY kibitzers = cranks + stupidity = anti-SUSY Mujahideens
That is, if he cannot win with reasoning, he can always win by name callings.
In addition to Matt’s points, Occam’s razor can settle this issue by reaching the “Bottom secret” of Nature by unlocking 10 locked doors with a master-key which consists of no SUSY (with s-particles).
1. The 48 Standard Model matter particles are now established as fact, but no theoretical base is known for how they arise. This mystery is behind the first locked door.
2. There are many coupling constants (such as Alpha, α), [Neff = 3] and some free parameters (such as Cabibbo and Weinberg angles). Again, there is no theoretical base for their calculations. These mysteries are behind the second locked door.
3. The (dark/visible) mass ratio = 5.3526 is now a certainty. What is the theoretical base for its calculation? It is a mystery behind the third locked door.
4. Quantum principle and Relativity are not compatible. How can they be made compatible? This issue includes the gravity unification. This mystery is behind the fourth locked door.
5. What is the theoretical base for giving rise to “charges” (e-charge, m-charge, etc.)? Why is e-charge unique while m-charge diverges? This mystery is behind the fifth locked door.
6. The universe is accelerating its expansion. What drives this? This is in fact the same issue as how the structures arise from a bowl of uniform soup. The mechanism which gives rise to structures will accelerate the universe’s expansion. This is the mystery behind the sixth locked door.
7. Neutrinos oscillate. What powers these oscillations? This is the mystery behind the seventh locked door.
8. Nature has three parts.
a. Material universe (Earth, Sun, galaxies, etc.), not including life.
b. Life
c. Numbers
Are those three parts governed by three disjoint sets of laws? Or are they governed by a unified set of laws?
Note: life has at least two distinct traits.
i. life process (reproduction and metabolism) which needs a computing device.
ii. individuality.
Are these two traits arising from the laws of physics? Or life gets them from somewhere else?
These mysteries are behind the eighth locked door.
9. What are time and space? How do they arise? This is the same issue as how ħ (Planck constant) arose. These are behind the ninth locked door.
10. The Master-key test. Is the key for the nine doors above a Master-key or nine different keys? If it is not a Master-key, this tenth locked door cannot be opened.
With such a master key which consists of no SUSY, the SUSY (with s-particles) will simply be cut out by Occam’s razor. Yet, can such a master key be found? My view is definitely “Yes”.
Even though there is the pro-SUSY camp and the anti-SUSY camp, SUSY has not been (completely) ruled out by CMS and ATLAS yet, so we have to wait some more years to see what pops out of the LHC regarding ruling out some forms of SUSY within certain mass/energy ranges.
Besides the fact that it could happen in the future that SUSY might turn out to be “not a right answer” for physics, it could still be very useful as a tool to solve problems and/or to think about certain things in HEP.
It’s very much like Newtonian physics: even though we know that it is “not a proper answer”, it is so simple and accurate that we still use it for many engineering calculations (within the energy/velocity range where it is valid enough).
We have to be very careful with the use of Mach’s philosophy of “in Physics, unobservables do not exist”.
Kind regards, GEN
Great summary of your talk. Question: if after the 2015 (2020 if you want to stretch) run SUSY particles do not show up, would you give up hope or would you keep hoping for them to show up at the ILC? I realize I am asking impossible long-term questions!
This is really not a question that makes sense to me. I don’t spend my time right now “hoping”. I spend my time making sure that until we find something, we’re searching efficiently. What’s the point of asking me how I’ll feel in 2020? I have no idea… it depends on many other things.
Hi Matt, I saw your response on Day 2’s post regarding data collection.
The amount of data the LHC has collected is astronomical. Is there so much of it that it is a Big Deal to come up with new ways to analyze and new things to do with pre-existing collected data? i.e. are there groups with competing proposals for limited computing time, or is it mainly just difficult coming up with workable, novel methods of data analysis?
Secondly, just something I found interesting. The number of recorded events vs. number of collisions is a pretty darn small ratio, but I’m still surprised at how wide the range of data people want collected is, wide to the point where we have to make choices far short of triggering on every interesting and research-worthy event. There must be a WHOLE lot of different things people are looking for and want more data on!
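For a rough sense of that ratio, here is a back-of-envelope sketch; the rates below are illustrative assumptions, not official figures (the real numbers varied over the 2011-2012 run):

```python
# Back-of-envelope: what fraction of LHC proton-proton collisions gets recorded?
# All numbers are rough, assumed values for illustration only.

bunch_crossing_rate = 20e6     # bunch crossings per second (assumed, 50 ns spacing)
collisions_per_crossing = 20   # simultaneous collisions per crossing (assumed pileup)
recorded_rate = 400.0          # events written to permanent storage per second (assumed)

collision_rate = bunch_crossing_rate * collisions_per_crossing
fraction_kept = recorded_rate / collision_rate

print(f"collisions per second: {collision_rate:.1e}")  # ~4e8
print(f"fraction kept:         {fraction_kept:.1e}")   # ~1e-6, about 1 in a million
```

With numbers in this ballpark, only about one collision in a million survives the trigger, which is why the choices of what to keep (and schemes like data parking) matter so much.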
I’m finding a lot of things I was wondering about data triggering and storage and processing are answered in this article that Matt linked in his post up above: http://profmattstrassler.com/articles-and-posts/lhcposts/triggering-advances-in-2012/data-parking-at-cms/
I want to add that anyone can watch all the recorded seminars from the 2013 SEARCH Workshop right here: http://scgp.stonybrook.edu/search
I’m finding that the seminars are not overly technical, and are generally very accessible for people who have the layman’s knowledge to read and understand this site. What I’ve seen so far is comparable in difficulty to the Higgs boson live announcement presentation a while back. In other words, very enjoyable and not too heavy on inaccessible equations or jargon!
Matt, in the second point within the comment in parenthesis “(see my discussion of “naturalness”=genericity here)” it seems that a link is missing in the word “here”.
Kind regards, GEN
thanks!