As many of you will have already read, the Large Hadron Collider [LHC], located at the CERN laboratory in Geneva, Switzerland, has “restarted”. Well, a restart of such a machine, after two years of upgrades, is not a simple matter, and perhaps we should say that the LHC has “begun to restart”. The process of bringing the machine up to speed begins with one weak beam of protons at a time — with no collisions, and with energy per proton at less than 15% of where the beams were back in 2012. That’s all that has happened so far.
If that all checks out, then the LHC operators will start trying to accelerate a beam to higher energy — eventually to record energy, 40% more than in 2012, when the LHC last was operating. This is the real test of the upgrade; the thousands of magnets all have to work perfectly. If that all checks out, then two beams will be put in at the same time, one going clockwise and the other counterclockwise. Only then, if that all works, will the beams be made to collide — and the first few collisions of protons will result. After that, the number of collisions per second will increase, gradually. If everything continues to work, we could see the number of collisions become large enough — approaching 1 billion per second — to be scientifically interesting within a couple of months. I would not expect important scientific results before late summer, at the earliest.
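As a rough sanity check on the "approaching 1 billion per second" figure, here is a two-line calculation. The numbers are assumptions, not from the text: the LHC's design luminosity of about 10^34 cm^-2 s^-1, and an inelastic proton-proton cross-section of roughly 80 millibarns at these energies.

```python
# Collision rate = luminosity x cross-section.
# Assumed inputs (not from the article): design luminosity ~1e34 cm^-2 s^-1,
# inelastic pp cross-section ~80 millibarns (1 mb = 1e-27 cm^2).

luminosity = 1e34             # cm^-2 s^-1
sigma_mb = 80.0               # millibarns
sigma_cm2 = sigma_mb * 1e-27  # convert mb to cm^2

rate = luminosity * sigma_cm2  # collisions per second
print(f"{rate:.1e} collisions per second")  # ~8e8, i.e. approaching a billion
```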
This isn’t to say that the current milestone isn’t important. There could easily have been (and there almost were) magnet problems that could have delayed this event by a couple of months. But delays could also occur over the coming weeks… so let’s not expect too much in 2015. Still, the good news is that once the machine gets rolling, be it in May, June, July or beyond, we have three to four years of data ahead of us, which will offer us many new opportunities for discoveries, anticipated and otherwise.
One thing I find interesting and odd is that many of the news articles reported that finding dark matter is the main goal of the newly upgraded LHC. If this is truly the case, then I, and most theoretical physicists I know, didn’t get the memo. After all,
- dark matter could easily be of a form that the LHC cannot produce, (for example, axions, or particles that interact only gravitationally, or non-particle-like objects)
- and even if the LHC finds signs of something that behaves like dark matter (i.e. something that, like neutrinos, cannot be directly detected by LHC’s experiments), it will be impossible for the LHC to prove that it actually is dark matter. Proof will require input from other experiments, and could take decades to obtain.
What’s my own understanding of LHC’s current purpose? Well, based on 25 years of particle physics research and ten years working almost full time on LHC physics, I would say (and I do say, in my public talks) that the coming several-year run of the LHC is for the purpose of
- studying the newly discovered Higgs particle in great detail, checking its properties very carefully against the predictions of the “Standard Model” (the equations that describe the known apparently-elementary particles and forces) to see whether our current understanding of the Higgs field is complete and correct, and
- trying to find particles or other phenomena that might resolve the naturalness puzzle of the Standard Model, a puzzle which makes many particle physicists suspicious that we are missing an important part of the story, and
- seeking either dark matter particles or particles that may be shown someday to be “associated” with dark matter.
Finding dark matter itself is a worthy goal, but the LHC may simply not be the right machine for the job, and certainly can’t do the job alone.
Why the discrepancy between these two views of LHC’s purpose? One possibility is that since everybody has heard of dark matter, the goal of finding it is easier for scientists to explain to journalists, even though it’s not central. And in turn, it is easier for journalists to explain this goal to readers who don’t care to know the real situation. By the time the story goes to press, all the modifiers and nuances uttered by the scientists are gone, and all that remains is “LHC looking for dark matter”. Well, stay tuned to this blog, and you’ll get a much more accurate story.
Fortunately a much more balanced story did appear in the BBC, due to Pallab Ghosh…, though as usual in Europe, with rather too much supersymmetry and not enough of other approaches to the naturalness problem. Ghosh also does mention what I described in the italicized part of point 3 above — the possibility of what he calls the “wonderfully evocatively named ‘dark sector’”. [Mr. Ghosh: back in 2006, well before these ideas were popular, Kathryn Zurek and I named this a “hidden valley”, potentially relevant either for dark matter or the naturalness problem. We like to think this is a much more evocative name.] A dark sector/hidden valley would involve several types of particles that interact with one another, but interact hardly at all with anything that we and our surroundings are made from. Typically, one of these types of particles could make up dark matter, but the others would be unsuitable for making dark matter. So why are these others important? Because if they are produced at the LHC, they may decay in a fashion that is easy to observe — easier than dark matter itself, which simply exits the LHC experiments without a trace, and can only be inferred from something recoiling against it. In other words, if such a dark sector [or more generally, a hidden valley of any type] exists, the best targets for LHC’s experiments (and other experiments, such as APEX or SHiP) are often not the stable particles that could form dark matter but their unstable friends and associates.
But this will all be irrelevant if the collider doesn’t work, so… first things first. Let’s all wish the accelerator physicists success as they gradually bring the newly powerful LHC back into full operation, at a record energy per collision and eventually a record collision rate.
137 Responses
I wonder if anyone else but me has thoughts of the vacuum being somehow matter-like – a vacuum energy substance, a substitute for matter.
Thus the energy of vacuum flow for the energy of matter. Dark energy could be dipole energy of vacuum energy used by matter. And dark mass, not matter, could be perturbances of vacuum energy field because of flowing energy units under causality balanced in geometry of gravitational dynamical changes.
Something like that – any familiar?
Eusa from Finland? We could have very fruitful conversations on that topic.
Matt, what could dark matter be if not particles?
Oh there are so many options. The most popular aside from particles is simply gravity not working how we think it does. (See MOND and TeVeS theories.) They’re not favorites but you never know.
Only a small part of the total quantity of mass in the universe is “visible”. I do not think “dark matter” is something mysterious. Not Wodan’s shit or Odin’s piss. We just need science to improve our vision.
‘Empty’ space has mass. You can label it dark matter, ether, aether, quintessence, quantum foam, quantum vacuum, plenum … it doesn’t matter.
‘It’ has mass which physically occupies three dimensional space and is physically displaced by the particles of matter which exist in it and move through it; including ‘particles’ as large as galaxies and galaxy clusters.
What ripples when galaxy clusters collide is what waves in a double slit experiment; the mass that fills ’empty’ space.
Einstein’s gravitational wave is de Broglie’s wave of wave-particle duality; both are waves in the mass that fills ’empty’ space.
The mass that fills ’empty’ space displaced by matter relates general relativity and quantum mechanics.
Why don’t we simply admit it, the Aether (dark matter and dark energy and vacuum energy) is back? John Ellis calls it a Higgs snow: https://www.youtube.com/watch?v=QG8g5JW64BA
Aether is back (if it ever went away) but it lives in the shadows (currently). Making waves (pun intended) about it is useless though; better to invent something new and “concrete” based on the idea.
“The word ‘ether’ has extremely negative connotations in theoretical physics because of its past association with opposition to relativity. This is unfortunate because, stripped of these connotations, it rather nicely captures the way most physicists actually think about the vacuum. . . . Relativity actually says nothing about the existence or nonexistence of matter pervading the universe, only that any such matter must have relativistic symmetry. [..] It turns out that such matter exists. About the time relativity was becoming accepted, studies of radioactivity began showing that the empty vacuum of space had spectroscopic structure similar to that of ordinary quantum solids and fluids. Subsequent studies with large particle accelerators have now led us to understand that space is more like a piece of window glass than ideal Newtonian emptiness. It is filled with ‘stuff’ that is normally transparent but can be made visible by hitting it sufficiently hard to knock out a part. The modern concept of the vacuum of space, confirmed every day by experiment, is a relativistic ether. But we do not call it this because it is taboo.” – Robert B. Laughlin, Nobel Laureate in Physics, endowed chair in physics, Stanford University
Matter, solids, fluids, a piece of window glass and ‘stuff’ have mass and so does the aether.
Label it whatever you want, ’empty’ space has mass which physically occupies three dimensional space and is physically displaced by the particles of a matter which exist in it and move through it.
In a double slit experiment it is the mass which fills ’empty’ space that waves.
Yes, taboo is the right word :-/ Mark my words… If one combines aether and physically spinning particles the rest will follow at ease.
How about combining Einstein’s gravitational wave with de Broglie’s wave of wave-particle duality?
‘Empty’ space has mass which physically occupies three dimensional space and is physically displaced by the particles of matter which exist in it and move through it; including ‘particles’ as large as galaxies and galaxy clusters.
Aether has mass, physically occupies three dimensional space and is physically displaced by the particles of matter which exist in it and move through it; including ‘particles’ as large as galaxies and galaxy clusters.
The Milky Way’s halo is not a clump of dark matter traveling along with the Milky Way. The Milky Way is moving through and displacing the aether.
The Milky Way’s halo is the state of displacement of the aether.
What is referred to geometrically as curved spacetime physically exists in nature as the state of displacement of the dark matter.
The Milky Way’s halo *is* curved spacetime.
A moving particle has an associated aether displacement wave. In a double slit experiment the particle travels through a single slit and the associated wave in the aether passes through both.
Q. Why is the particle always detected traveling through a single slit in a double slit experiment?
A. The particle always travels through a single slit. It is the associated wave in the aether which passes through both.
What ripples when galaxy clusters collide is what waves in a double slit experiment; the aether.
Einstein’s gravitational wave is de Broglie’s wave of wave-particle duality; both are waves in the aether.
Aether displaced by matter relates general relativity and quantum mechanics.
Above should have read:
What is referred to geometrically as curved spacetime physically exists in nature as the state of displacement of [aether].
The Milky Way’s halo *is* curved spacetime.
There is evidence of dark matter every time a double slit experiment is performed; it’s what waves.
In terms of dark matter, there are two notions which are incorrect. One is that dark matter is a clump of stuff traveling with the matter. The other is that dark matter does not interact with matter.
Dark matter fills ’empty’ space. Dark matter is displaced by matter.
The Milky Way moves through and displaces the dark matter.
The Milky Way’s halo is the state of displacement of the dark matter.
The state of displacement of the dark matter is otherwise known as the deformation of spacetime.
The Milky Way’s halo is the deformation of spacetime.
So, we don’t need the LHC to find evidence of it, it already exists. We just need physicists to correctly understand what occurs physically in nature.
Hello liquidspacetime/Mike sock puppet! 😀
Hey Matt,
what’s the production rate of the Higgs at 13 TeV compared to 8 TeV in 2012?
Yeah, and how many particles are inside the proton? How many repeating patterns are there within a fractal?
If you think about it, the production rate is unlikely to be very dependent on the energy level, but more on other factors. Do some googling, like http://www.lhc-closer.es/1/3/9/0
Dark matter is probably made out of neutrinos. Despised by LCDM, but that’s just a theory with its own problems – where are its WIMPs? The case of neutrino dark matter is only, though severely, ruled out sociologically. So all the fuss about “discovering dark matter” could better be prevented.
I don’t see how you can claim this, unless you mean non-Standard-Model sterile neutrinos.
While I think theon’s language is a bit strong (‘despised’ – I don’t think so, theon), he does have a point – dark matter may (I said may) be neutrinos, and it need not involve sterile neutrinos. On the purely experimental side, the direct experimental limit on the electron neutrino mass is < 2.3 eV. While tiny compared to the electron itself (511,000 eV), it is still ~10 times larger than the latest summation estimate implied by Planck data in conjunction with present theory, < .23 eV. Experimentally, the matter will be settled very soon, as the KATRIN experiment (see katrin.Kit.edu) is expected to be taking data in 2016.
However, theon, if KATRIN finds an electron neutrino mass of say ~1 eV it may very well solve the dark matter question, but it would raise many, many others, as despite your claim the current estimate of < .23 eV is NOT sociological.
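A hedged back-of-envelope illustration of why a ~1 eV neutrino mass would raise as many questions as it answers: in standard cosmology the neutrino contribution to the energy budget is Omega_nu * h^2 ≈ (sum of neutrino masses) / 93.14 eV, and for masses near 1 eV this falls well short of the observed dark-matter density. The numbers below are assumptions on my part, not taken from the comment.

```python
# Back-of-envelope: could ~1 eV neutrinos be all the dark matter?
# Standard-cosmology relation (assumed): Omega_nu * h^2 = sum_m_nu / 93.14 eV.

sum_m_nu = 3.0   # eV: three flavors, each near the ~1 eV scale KATRIN could probe
h = 0.67         # assumed Hubble parameter in units of 100 km/s/Mpc

omega_nu = (sum_m_nu / 93.14) / h**2  # neutrino fraction of critical density
omega_dm = 0.26                       # approximate total dark-matter fraction (Planck)

print(f"Omega_nu ~ {omega_nu:.3f} vs Omega_dm ~ {omega_dm}")
```

Even taking all three flavors at the ~1 eV scale, the neutrino contribution comes out well below the total dark-matter density (and such "hot" neutrinos would also clump poorly on small scales).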
As to the LHC I celebrate the fact that the great machine is coming alive again and look forward to any discoveries it may make at its eventual 13 TeV energy. However, if it finds nothing else it has already made a great discovery and should need no further justification for its existence.
“Dark matter is a hypothetical kind of matter that cannot be seen with telescopes but accounts for most of the matter in the universe. The existence and properties of dark matter are inferred from its gravitational effects on visible matter, radiation, and the large-scale structure of the universe.”
What if the “inferred gravitational effects” are not caused by dark matter, but a dual-toroidal vortex in the fabric of space time powered by a black hole dynamic?
Information is conserved, so whatever falls into a black hole the same total information comes out.
There is evidence of this in MANY of the Hubble telescope galaxy photos. And there is a very good paper on it here: http://hiup.org/wp-content/uploads/2013/05/torque_paper.pdf
Including torque and Coriolis effect in Einstein’s Field Equations fully accounts for the dark matter and dark energy.
fully accounts after one properly handles the vacuum energy density:
D=M/V
M= Planck mass
V = Planck Spherical volume of diameter Planck Length
D=6M/(piL^3)
D=Very large number ~=10^97 kg/m^3
and if the aether that is now the vacuum of spacetime is a superfluid-like medium, a quantum foam on the Planck scale, then all is accounted for.
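For what it's worth, the figure quoted above can be checked with a few lines of arithmetic, using the standard CODATA values for the Planck mass and Planck length (the numeric values are assumptions supplied here, not given in the comment):

```python
import math

# Density of one Planck mass in a sphere of diameter one Planck length:
# D = M / V, with V = (pi/6) * L^3, so D = 6M / (pi * L^3).
M = 2.176e-8   # Planck mass in kg (CODATA value, assumed)
L = 1.616e-35  # Planck length in m (CODATA value, assumed)

D = 6 * M / (math.pi * L**3)
print(f"D ~ {D:.1e} kg/m^3")  # ~1e97, matching the order of magnitude quoted
```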
So, we live in the present in sort of an event horizon…
Any questions?
Yeah, vacuum energy density appears to be constant with regard to your formulae. How does this explain phenomena that show non-homogeneous gravitational effects that do not even correlate with visible matter?
The presence of black holes powers a dual-toroidal dynamic – this is a gravitational effect that does not correlate with visible matter, except for when you include black holes giving out as much as they take in, so they are “visible”, yet power a dynamic that is somewhat similar to water twisting and spiraling down the drain. Stars in a galaxy follow this dual-toroidal pattern, and the math behind it results in a Fibonacci-pattern spiral of galaxy formation, as nature does here on Earth as well.
The black holes are in this background vacuum energy density. Matter vibrates into and out of this vacuum energy density. We are truly exchanging our information with the event horizon of the vacuum of space that we are resonating within.
Aren’t you the least bit concerned that the same theory that CANNOT handle ONE proton (see proton radius problem/puzzle) is being used to analyze the collision of TWO protons?
I’m not sure how to make sense of your statement. Scientists approximated the size of the proton using electrons and got one answer (so often and so precisely that it was considered accurate), then approximated the size of the proton using a different method involving muons and got an answer that was 4% off the accepted electron-based value. There are many possibilities as to why this happened – including an error in the experiment or in the assumptions made by the calculations in the experiment. It’s possible new physics is involved, and people have ideas about what that may be, but it’s hardly a crisis – nor does any theory beyond the standard model predict or explain the phenomenon any better. I don’t see how that rises to the level of “this theory can’t HANDLE a proton!”
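For scale, the size of the mismatch can be computed directly from the published radius values (the muonic-hydrogen result and the earlier electron-based CODATA value, both supplied here as assumptions, not quoted in the thread):

```python
# Proton charge radius, femtometers:
r_muonic = 0.84087  # muonic-hydrogen measurement (assumed published value)
r_codata = 0.8775   # electron-based CODATA value at the time (assumed)

discrepancy = (r_codata - r_muonic) / r_codata
print(f"{discrepancy:.1%}")  # about 4%, as stated above
```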
As for analyzing proton collisions, that’s done with sensors/detectors and computer models. The LHC was designed to look for things the standard model predicts, but also to search for things beyond the standard model – or to find things it does not predict (or even that the model says should not exist!) I think it’s flawed to presume the standard model is in any way negatively influencing the analysis of what’s being detected by the particle accelerator. If you have a better way to detect or analyze such collisions, you should put forth a proposal for the LHC and/or its successors.
As for Nassim Haramein, he’s an ancient astronaut “theorist” and amateur physicist (IE no credentials whatsoever) whose unified field theory presumes protons are miniature black holes. It’s been debunked clearly as nonsense that does not match up to reality. http://azureworld.blogspot.com/2010/02/schwarzchild-proton.html
I watched a few of Haramein’s Youtube talks, and his pseudo-science mumbo-jumbo is humorously nonsensical and shows he has a lack of understanding of some very basic principles of physics.
I’m surprised anyone listens to Haramein. There are more prominent new-age charlatans such as Deepak Chopra that delve into quantum mechanics if you’re into that sort of thing.
N-body problem is what I’m talking about. The theory does not handle 1 proton accurately (thus it is named proton puzzle), so you think the theory handles 2 body, 2 proton, collisions?
My apologies, phxmarker, I assumed you were referring to the aforementioned proton radius issue. Still, I can’t make any sense from what you’re saying.
The term “N-body problem” refers to the highly complicated interactions between “bodies” — like gravitational interactions between stars, moons, planets, etc. Each exerts a tug on all of the others, which disturbs the orbit of each body. As you add a new body to the mix, you complicate the interactions. N=1 makes no sense unless something interacts with itself. N=2 is really simple. Higher N, higher complexity. The same is true for other N-body problems besides gravitational ones.
I have no idea how this could be applied to a single proton by itself as the problem by definition requires more than one “body” (ie more than one proton) to make it a problem, much less how it wouldn’t be “handled accurately,” but I’m all ears.
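To make the counting concrete, here is a minimal sketch of how pairwise interactions grow with N (the function name is mine, not from the thread):

```python
# Each of N bodies tugs on every other body, giving N*(N-1)/2 distinct pairs.
def pair_count(n: int) -> int:
    """Number of distinct interacting pairs among n bodies."""
    return n * (n - 1) // 2

for n in (2, 3, 10, 102):
    print(n, pair_count(n))
# N=2 gives a single pair; by N=102 there are already 5151 pairs.
```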
As for the LHC, it accelerates large groups of protons travelling in different directions. These protons collide a bit randomly and sensors pick up what happened. What do you propose they are doing wrong? Do you not believe the protons are accelerating and colliding or is there some specific part of the analysis of those collisions that you disagree with? I see no issue with smashing things together and detecting what comes out, and I don’t really know how any “theory” would change the results. Either something happened and a detector went off or it did not. The only thing the LHC does is conduct experiments and collect data. So far, the data supports the current standard model theory. There’s no reason the theory should affect the results, though perhaps the interpretation of the results could be biased.
Now, keep in mind that Haramein whom you seem to hold in high regard also tells his Youtube audiences that people can simply live off of the “vacuum energy” without needing to eat or drink and claims he has done so himself for a time. I recommend against taking anything he says as factual information for the sake of one’s health if nothing else.
When the protons collide, 2 protons, 2 bodies:
“A 7 TeV proton–proton collision in CMS yielding more than 100 charged particles”… (from: http://www.lhc-closer.es/1/3/9/0 )
2 protons collided yield more than 100 charged particles…. > 102 bodies
So, mathematically: if the theory cannot analyze a SINGLE proton (the proton puzzle – something is amiss) with a single muon (actually a two-body problem), and the math and analysis is so fuzzy that we have to wait until 2017-2018 for MUSE results, how can we, in the meantime, proceed with a much more complex Higgs characterization and CLAIM the measurements match the theory?
Wayne, also, what is wrong with what they are doing? What would be wrong with smashing a tornado apart looking to see what the tornado is made of? The tornado is simply a disturbance in the air.
A proton is simply a very stable disturbance in the super-fluid-like vacuum energy density dynamic – vacuum energy proven to exist by the Casimir effect and the dynamic Casimir effect.
So, analogies fail quickly, however, I must point out the proton is nothing but a dual-toroidal vortex in the dense super-fluid crystalline like fabric of spacetime, an aether, and these high energy proton collisions are simply like breaking apart a tornado trying to find what the tornado is made of.
Wayne, one more thing, it helps to keep in mind, the dynamic Casimir effect has been proven – photons can be extracted out of the vacuum. Simply vibrating the two parallel non-touching closely spaced plates generates photons, and with photons, matter can be created via gamma-gamma, so this is theoretically possible.
So, yes, the vacuum energy does exist, and the Casimir effect is the proof.
http://en.wikipedia.org/wiki/Casimir_effect
Very pleased at the return of one of my favorite blogs. Looking forward to hear more, and having this unique insight into the next run of the LHC. Many thanks and all the best to you.
Hello Matt,
I have a question that is somewhat off-topic, but concerns dark matter and the idea that the majority could be wrong.
I am an undergraduate student of physics, and some time ago I heard a talk about dark matter, more precisely about why the standard model of cosmology is wrong (especially the dark matter part). The speaker gave various examples, but his main case was that with dark matter one would expect two kinds of dwarf galaxies (some formed by the attraction of a dark matter halo and some formed mainly by “normal” matter), but we observe only one kind of dwarf galaxy.
As a possible alternative he presented the idea of modified Newtonian dynamics. It seemed pretty convincing (and exciting) to me, as a person with no deep insight into the topic. I know MOND has some problems, for example finding a generalization in the relativistic regime.
According to my astrophysics lectures, the CDM model is well established and widely used, although it seems it has troubles too. But most ignore this and keep using CDM as if it were proven.
Now my question: can I trust the scientific process in that the majority is right? Is it possible that a weak idea becomes popular only because of the lack of a better explanation?
(I know I am asking for a personal opinion here, and I probably should ask an astrophysicist instead of a particle physicist, but I’ve come to respect your thinking while reading your blog.)
First of all, while MOND is not mainstream, several prominent astrophysicists support it. Second, many who don’t believe its basic premise believe that there is still something to it. I think it is fair to say that the establishment has not responded enough to this. MOND people come up with detailed observations and some mainstream pundits claim that they follow from CDM, but all they present is an order-of-magnitude calculation, nowhere near the details of what the MOND folks talk about.
Also, my experience is that the typical MOND astrophysicist has both a broader and a deeper command of conventional astrophysics than the conventional astrophysicist does, and certainly a much, much stronger command of MOND than the conventional astrophysicist has.
There are several good reviews on MOND; look for Sanders and McGaugh.
On the other hand, MOND is phenomenological and explains some known things, but is weak on falsifiable predictions (as opposed to postdictions) which have been confirmed. It is also based on the assumption that dark matter is somehow a bad idea. To me, the alternative, namely that everything in the universe should be immediately obvious to humans, is more absurd.
Yes, there are problems with some CDM scenarios. But don’t confuse an argument over details with an argument over the principles. (That’s like creationists saying that since Gould doesn’t agree with Dawkins, evolution has been disproven.) The basic idea that we don’t yet know what all components of the universe are is, I would say, pretty secure—and no stranger than biologists in Europe 500 years ago not knowing about gorillas.
MOND folks often complain, rightly, that CDM in some cases works better because it has benefited from more research. This is true, but the converse is as well: CDM, in the messy details, in some cases runs into problems (not really fundamental ones) which MOND has not yet reached.
Who gave the talk?
Thanks for the hints for further reading.
The talk was given by Pavel Kroupa. I think it followed this paper: http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=8794840&fileId=S1323358000001417
Let’s be clear: all 100 theories of gravity are junk. MOND is not too bad, nor MOG and others. Why? The less the 2nd body moves off the line of sight with the 1st, the greater the gravity. 1. Between two galaxies’ centers, a factor of 10. 2. Between the center and a star, 2. 3. In the bar of a galaxy, as high as 3. 4. In the solar system, 1. But even here we have the Pioneer Anomaly. And every galaxy is warped from the center of another. So what is common? The line of sight. The less the bodies move off the line of sight, the greater the gravity, just as MOND states. Now instead of Dark Matter, just increase the strength if the bodies do not move off the line; the less movement, the stronger. Why? Because the gravitons move faster if through the SAME space. Well, the theory, LSG (Line of Sight Gravity), goes on and on from here, but now someone needs to be interested.
Ah ha ha ha ha! It’s been so long since I’ve heard the Pioneer Anomaly name-dropped that I’d almost erased it from my Crackpot Bingo card. But I knew not all of you would have gotten the message.
Hey, does your theory predict a slight difference in the gravitational force on the Pioneer Probe to explain the anomaly? Well guess what? Since the initial naive estimate, full finite-element analysis of radiation emission/reflection off the probe’s asymmetrical structure has shown that the observed acceleration matches the expectation. Meaning your theory predicts a deviation *that is not seen*. Congratulations, your theory is wrong, and you can now spend your time on something else.
Does your comment mean you are interested? Or is it like almost all the rest, just a comment? About the Pioneer Anomaly: Dr T made the statement 7 years earlier to Dr Moffat (MOG) that the anomaly was due to the spacecraft’s energy source. Like most, he believes in GR and that other gravity theories can not be true. His estimate of the force needed is still (today) 17% off. And the anomaly DID NOT start until the spacecraft passed the 7th planet; the energy source started on day 1. As with most comments, the details are again overlooked. Now explain the factors of 10, 3 and 2 instead of the one at 1.0005.
The source is thermal radiation, and why would you assume that it was constant? And why would you assume that the anomaly wasn’t present at all, rather than becoming noticeable above the statistical noise level only once other forces were reduced? Right, because it lets you assert that it couldn’t have been thermal radiation, and must rather be new physics. But neither of those assumptions is true!
And you’re interested in the details? Try arxiv.org 1204.2507 and 1311.4978. They actually address your concerns/assumptions: the variance in the thermal profile of Pioneer, and the effect on the trajectory at a wide range of distances from the sun. They find no statistically significant remaining anomaly.
If you’d like to point me at Dr. T’s calculations that take into account those analyses and finds fault with them, then yes, I’m interested. If you’re just dropping detail-less-details to hint that it’s still totally a real anomaly and so your theory is right and you don’t need to address work that says otherwise, then no, no I’m not interested.
“On the other hand, MOND is phenomenological and explains some known things, but is weak on falsifiable predictions (as opposed to postdictions) which have been confirmed.”
Pre- vs post- is pretty much a red herring, though. If you have a rigorous and specified theory, then it doesn’t matter in what order you observe a scenario and calculate the theory’s expectation for that scenario.
What MOND is weak in is matching observations of the largest-scale (and thus least susceptible to local variation and easiest to compare to theory) phenomena — the CMBR and the large-scale structure of the universe. DM-less MOND is extremely far off.
Of course very clever people are working on it, so who knows what they may come up with. I wish them the best. But as of today the best explanation for observations is LCDM.
I heard a good talk about MOND about a year ago, see http://profmattstrassler.com/2014/05/27/dark-matter-debates/ . I would think of MOND as unlikely but not quite disproven, and of dark matter as very likely but not proven. There is a difference between having an open mind and an empty head: study the evidence yourself and you will see that the evidence for dark matter comes from many different sources and is very strong, while MOND theories are a mess and can’t explain quite a bit of the data. But it’s good to have an alternative option to the mainstream guess, and until that guess becomes definitive, the alternative should be kept in play.
I agree in general, but I just want to make it clear that MOND is not just “change gravity instead of postulate dark matter”, but rather does explain some things quite naturally which are not obvious consequences of a CDM scenario. Yes, there is no valid relativistic theory, but on the other hand it is more than just another explanation for a flat rotation curve.
Although not directly relevant: a hundred years or so ago there was a debate not about the dynamics of galaxies but about the solar system. One explanation was dark matter, namely the planet Vulcan. This proved not to be the case (though dark matter was the right answer for the motion of Uranus, which led to the discovery of Neptune; the case of Neptune’s own anomalies, which led to Pluto, is more ambiguous). Rather, modified gravity, namely GR, won out.
Agreed. By the way, Vulcan wasn’t the only “dark matter” option. A cloud of dust surrounding the sun was another possibility.
First, I am still not convinced they have found the Higgs boson, but assuming there is something stable at 125 GeV, is there any correlation between this resonance and the W and Z bosons residing at 80 GeV and 90 GeV, respectively?
Is there any creditable theory that postulates dark matter could be gravitons?
“Is there any creditable theory that postulates dark matter could be gravitons?”
No.
Are you responding to the dark matter question or the part about the Higgs? 🙂
If you know what you are searching for, more often than not you will find it.
Gravitons are massless (otherwise gravity would not be long range) but dark matter must be made from massive objects (otherwise it would not clump.)
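The mass–range link can be made quantitative: a force carried by a boson of mass m has a Yukawa range of roughly ħc/(mc²), so a massless mediator means an infinite-range force. A minimal sketch (constants rounded; the function name is my own):

```python
# Yukawa range of a force whose mediator has mass-energy m*c^2 (in MeV):
# range ~ hbar*c / (m*c^2).  A massless mediator gives an infinite range.
HBARC_MEV_FM = 197.327  # hbar*c in MeV*fm (CODATA, rounded)

def yukawa_range_fm(mass_mev):
    """Approximate force range in femtometres for a mediator of the given mass-energy."""
    if mass_mev == 0.0:
        return float("inf")  # photon or graviton: long-range force
    return HBARC_MEV_FM / mass_mev

print(yukawa_range_fm(80_379.0))  # W boson (~80.4 GeV): ~0.0025 fm, hence the weak force is short-range
print(yukawa_range_fm(0.0))       # massless graviton: infinite range
```

This is exactly the tension in “gravitons could be dark matter”: give the graviton enough mass to clump and gravity stops being long-range.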
Hi Matt, it’s very nice to find you back at blogging! Regarding dark matter, could it be possible that there is some other underlying mechanism for the observations than massive objects? Perhaps an overlooked ordinary phenomenon or such which enhances the effect of gravitational interaction?
There is a range of alternate theories for dark matter, involving either ordinary matter, exotic forces, or a change in gravity itself. A big problem is that dark matter behaves like… well, matter. It forms halos and clumps and clusters around normal matter. The idea that it is just ‘mysterious extra gravity’ is grossly oversimplified, and when you look at the finer details a lot of alternatives are ruled out. (For example, if it were just cold dust, the dust would block much more light. If it were bigger clumps like asteroids or orphan planets, we wouldn’t detect the halos we do. MOND-style gravity tweaks tend to be too ‘smooth’ to work out…)
As I understand it, the larger the distance from us the more dark matter. Is my understanding correct?
Not really. That would imply that we are in some special position with regard to the dark-matter distribution, that it must be distributed spherically around us and that we are in a low-density region.
What you probably mean is that the larger the scale surveyed, the larger the fraction of unknown matter (dark matter, really a misnomer for “non-baryonic matter we don’t yet know enough about for which transparent is a better description than dark”). This is true, up to a point. At the scale of superclusters of galaxies, much smaller than the scale of the observable universe, this increase levels off.
“It forms halos and clumps and clusters around normal matter.”
To me, that statement implies that there is a sensible possibility for some phenomenon among ordinary matter which enhances the effect of gravitational interaction.
It’s still not completely ruled out, but the problem is that it doesn’t behave like normal matter. There’s no anomalous gravity from the sun or in (most) globular clusters, so it can’t be a phenomenon of all matter; it forms halos that don’t condense like the disk of the galaxy, so again it’s not behaving like ordinary matter, which really likes to stick together. And different galaxies have different amounts; some dwarf galaxies are almost all dark matter and no visible matter.
The best ‘normal matter phenomenon’ candidates would be the various MOND (modified gravity) theories. The big problem these face is that having gravity behave differently should have a measurable effect at small scales. (The ‘Pioneer anomaly’ was thought to be one of these but turned out to be something different.) Dark matter is too diffuse, too ‘puffy’; it seems to form clouds and not lumps, like steam that refuses to condense into water. This is hard to explain.
Significantly predating the standard model, the German physicist B. Heim produced a theory of elementary particles that predicted masses and lifetimes with unerring accuracy, but he also produced some unusual leptons, including 5 neutrinos and a “neutral electron”, slightly more massive than an ordinary electron. Now the interesting thing about a neutral electron is that, being a fermion, it would not clump; in fact with such low mass its gravitational attraction probably would not lead to clumping anyway. If it were its own antiparticle, and if it did not thereby self-annihilate, and if in addition we assume the presence of matter is due to the asymmetry of annihilation with antimatter, then an excess of dark matter even from such a low-mass particle is at least plausible. In one sense Heim predicted dark matter. That does not mean he was right, but it is interesting.
Yes, I know when I ask an obscure question I tend to get disparaging replies, but by asking a question I do not mean to advocate the assertion; rather I am asking, do we know this cannot be correct? In this case, do we know the neutral electron cannot be correct?
Yes we know. And your original statement about Heim isn’t true.
Matt, “Yes, we know,” was a bit cryptic, and I am not sure what it refers to. If it refers to the neutral electron, and since you are doing more posts on dark matter, I would really appreciate it if you could expand.
I am sorry if the part about Heim is not true; all I know is what is on the web and I tried to state that truthfully. If there is an error, again I would appreciate knowing what the truth is.
Regarding Heim and what is on the web — check the credibility of your sources!!! There’s an immense amount of wrong information about particle physics on the web.
Ok, but could there still be hope for the gravitons?
Here’s the reasoning: when the fermions (the matter particles, quarks and leptons) interact via the electromagnetic force, the strong force, or the weak force, they do so by emitting or absorbing a boson force-carrier particle (respectively a photon, gluon, or [ Z | W ] boson).
So, would not the mass particle(s) that make up dark matter behave similarly, by emitting and absorbing gravitons? And if so, here’s the $64,000 question: if fields are fundamental and in turn create particles, would not the graviton, indeed, be the most fundamental of all particles? And everything else would have been derived through refraction and/or scattering?
This is why I asked the question: the “Higgs boson” resides at 125 GeV, so close to the W and Z bosons. Could there be a relationship between the three resonances that would in turn create and maintain stable fermions? Thank you, Professor.
Of course massive particles interact via gravitons. That’s pretty much the definition. But it has nothing to do with “dark matter could be gravitons”.
Well, dark matter means missing mass, so I suppose that dark matter also means missing gravitons (by definition).
I don’t know how firmly your tongue is in your cheek (or anywhere else, for that matter), here. 🙂
Actually, it is exactly the opposite. The mass is “missing” because we don’t directly detect it, but we do detect its gravitational force, which is how we know about it in the first place. So, if you like, missing mass is detectable via gravitons.
But “gravitons could be dark matter” makes it sound like gravitons are massive. I don’t think this is completely ruled out, as it is also not ruled out for the photon, but the upper limits on the mass are so low that, no, they could not be the dark matter.
“Of course massive particles interact via gravitons. That’s pretty much the definition. But it has nothing to do with “dark matter could be gravitons”.”
By definition every field has a force carrier associated with it, which is the point I am making. And yes, it could have everything to do with dark matter: since dark matter makes up the majority of all matter in the universe and is associated with the gravitational field, clearly it would imply that gravitons (the gravitational field’s force carriers) are the “glue” that keeps dark matter in “clumps”. It would also explain why gravity is “long range”, or should I say, would explain the dark matter “islands” in the universe, because the gravitational field is long range. The “dark” feature may not be as dark as we think; it could mean that the field is so small we don’t have the “sensors” to detect it yet.
So, going back to my initial question: do these resonances, W, Z and possibly the Higgs boson(s), interact, since they are so close in GeV, and cause quantum confinement and stable fermions? And similarly, what resonances in combination with the theoretical graviton could be causing dark (low-intensity) matter?
Here’s one more very speculative thought: the “dark” feature could be that the visible universe (i.e. our sensors, including our eyes) is so saturated with high-intensity EM waves that we cannot see those minute gravitational waves being emitted from dark matter. And indeed, these gravitational waves are linking the “islands” of dark matter throughout the universe, giving it the structure it has.
Sorry, I have to rest my brain, now. 🙂
You have to be careful to separate virtual and real particles here. Two particles interacting can be said to exchange bosons but that is not the same as a real boson being present the entire time. When you look at a hydrogen atom you do not, in general, say that part of its mass is due to photons and positrons even though you can easily conjure situations where the proton and electron interact via them. Technically real particles aren’t being emitted and absorbed, only the messy virtual field scribbles. Thus it’s often best to see the whole ‘soup’ as part of the electromagnetic field.
Likewise massive DM particles would not have part of their mass as gravitons ; the gravitons would be a *result* of their mass and best treated as part of their gravitational field.
As for the graviton being the ‘most fundamental’ of particles, no. All fundamental particles are equally fundamental. You can, with some difficulty, rewrite everything in terms of gravitons, in the same way you can make Earth the center of the universe with everything moving around it. But you can do the same with any other fundamental particle.
The unique thing about gravity (we think) is that it affects everything with energy. That is, as far as we know, it can interact directly (but very weakly indeed) with any other particle.
The masses of the W/Z and Higgs are not unrelated; these masses ‘tether’ the plausible values of the Higgs mass. Before the LHC discovery it was known from the W and Z masses that the Higgs ‘should’ have a mass of less than about 145 GeV. Had it not been so close, we would have had some interesting questions to answer.
“You have to be careful to separate virtual and real particles here.
… Technically real particles aren’t being emitted and absorbed, only the messy virtual field scribbles.” 🙂
Sorry, but I find it amusing when I read someone using the term “virtual” to describe a real event. Everything is real, everything is happening “now”. Virtual is the product of our incomplete visualization of reality, i.e. renormalization to enable the quantization of the classical fields to quantum operators. Yes, we would still be drifting in the ocean without these engines but there are drawbacks, limitations, when converting, approximating, an analogue process to a digital one.
Virtual particles are indeed real but get dissipated before the event even gets completed; still, they do their job in driving the process forward to a point where it cannot return to the initial state. If the two states are different, then the disturbances that acted alongside were real too, with a very short lifespan. Where did this energy go? … Dark energy, maybe?
I once asked the Professor on this site, are we, the visible universe, the whitecaps of oceans waves, created and driven by the huge dark energy in the universe?
‘Virtual particles’ is indeed a bit of a stupid name. (Like ‘imaginary numbers’.) But the difference between them and ‘real’ particles is important. A ‘real’ particle is well defined, notably it has a well defined energy. When you ask ‘where did the energy go?’ in relation to virtual particles I can say ‘There was no energy.’; if you try and measure the energy of a virtual particle you can easily find it giving you an answer of zero.
And virtual particles are everywhere. Is it accurate to say there exists a perfectly zero field everywhere with a virtually infinite number of virtual particles popping up and vanishing randomly all through it, or is it more accurate to say the field itself oscillates randomly? In my opinion the second is neater, more simple.
Also I would note that reversible processes also involve virtual particles. ALL processes do, on some level.
Gravitons are certainly the ‘glue’ holding dark matter together; without gravity it is likely dark matter wouldn’t aggregate at all. But this is no more a connection than that between ordinary matter and gravity. Notably gravity CANNOT be an infinitely long-range force AND have massive particles. To be considered part of gravity at all, dark matter particles would have to be some sort of alternate gravitational boson, like how the weak force has three force carriers. But we know gravity has only one, so that is not possible.
“Notably gravity CANNOT be an infinitely long-range force AND have massive particles.”
I think it can be and you agree with me when you wrote:
“And virtual particles are everywhere. Is it accurate to say there exists a perfectly zero field everywhere with a virtually infinite number of virtual particles popping up and vanishing randomly all through it, or is it more accurate to say the field itself oscillates randomly? In my opinion the second is neater, more simple.”
Another way of saying what you wrote here is, gravity permeates in all of space.
Try to visualize it: there are massive “islands” of galaxies that fill the universe (btw, “cosmic web” on YouTube is an excellent graphical depiction of our universe), and we know it is expanding, relatively speaking, but that’s another topic. This model is so similar to waves in the oceans: again, the whitecaps are the galaxy clusters, the carrier waves are “dark” matter, and the ocean is “dark” energy.
So, when you write “the field itself oscillates randomly” … this field is the gravitational field, and no, it does not oscillate randomly; i.e. chaos has a cause too. The oscillations of this field are caused by the vast and complex system of the universe, like a 3D maze of spring/mass mechanical systems all linked together by the gravitational field. And I believe gravity and space are synonymous: one field, the gravitational field. As space rapidly expands (one large big bang? many little bangs?), ripples start forming through refraction and start to superimpose, hence creating “rotations” (closed trapped waves), and the particles we are made of start to take shape, are trapped, stable, for as long as this maze stays together. Much like a knot (particle) on a single string (field): if you keep the string taut with just enough force the knot will hold, but if you pull hard enough the knot will release; and oppositely, if you relax the string the knot will loosen and collapse (open).
BTW, be careful how you define “zero” above. The false assumption is that “there is a perfectly zero field everywhere”. NO: if you add up all the energy in these ripples, including fermions, bosons and “virtual” particles (I like to think of them as transients), that’s a lot of energy! And no, it does not vanish or cancel out; it is ALWAYS there till “time infinity” (I have no definition of this, but it’s a long time :-)). I say this because if there were any way of it cancelling out, i.e. an unstable vacuum, the universe would not exist.
No, it cannot and I do not agree. Virtual particles are everywhere. This means ALL fields are everywhere (That we know of.) Weak, strong, electron, gravity… gravity is not something special in that sense.
For a force to be infinite in range its virtual particles need to be indefinitely stable, that is, the theoretical lowest energy virtual particle needs to have no energy. Remember, a big point about such particles is the more energy they have the shorter their lifetime.
There is a big problem with the ‘energy’ of virtual particles as a whole; when you add up what it ‘should’ be you get a massive, MASSIVE result, enough to crush the universe into nothing in a second. The actual vacuum energy is far smaller. What’s the deal? Nobody knows yet but it seems that you can’t just ‘add up’ virtual particles and you definitely can’t say how much energy they contribute, not until we start getting some sensible results.
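The size of that mismatch is worth seeing in numbers. A rough sketch (my own back-of-envelope: the naive estimate packs one Planck energy into each Planck volume, and the observed figure is the measured dark-energy density, both rounded):

```python
import math

HBAR = 1.054571817e-34  # hbar, J*s
C = 299792458.0         # speed of light, m/s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2

# Naive vacuum energy density: one Planck energy per Planck volume.
e_planck = math.sqrt(HBAR * C**5 / G)   # Planck energy, ~2e9 J
l_planck = math.sqrt(HBAR * G / C**3)   # Planck length, ~1.6e-35 m
rho_naive = e_planck / l_planck**3      # J/m^3

rho_observed = 5.4e-10  # measured vacuum (dark) energy density, J/m^3, approx

# The famous ~120-orders-of-magnitude cosmological-constant problem.
print(f"naive/observed ~ 10^{math.log10(rho_naive / rho_observed):.0f}")
```

Adding up the virtual-particle contributions with a Planck-scale cutoff overshoots the measured value by roughly 120 orders of magnitude, which is exactly the “nobody knows yet” in the comment above.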
I was under the impression that the LHC’s beams travel at nearly the speed of light, but the liveblog of the restart noted the beams reaching various points around the ring, several minutes apart. Have I misunderstood something, or is this an artifact of the low power at which the initial restart was done?
Yes, the protons travel at almost the speed of light. What took minutes was the alignment process of the accelerator. When they start putting the protons in, the magnets aren’t in exactly the right place and the protons end up spiraling out of the beam pipe. By adjusting the magnets they eventually get the protons to go all the way round the circle. That takes some time.
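For scale, a quick check (assuming the ring’s published ~26.7 km circumference) shows that the travel time itself is tiny; the minutes in the liveblog were all magnet tuning:

```python
# Time for a near-light-speed proton to make one lap of the LHC ring.
C = 299_792_458.0           # speed of light, m/s
CIRCUMFERENCE_M = 26_659.0  # LHC ring circumference, ~26.66 km

lap_time_s = CIRCUMFERENCE_M / C  # protons at v ~ c
laps_per_second = 1.0 / lap_time_s

print(f"one lap: {lap_time_s * 1e6:.1f} microseconds")  # ~88.9 us
print(f"laps per second: {laps_per_second:,.0f}")       # ~11,245
```

So a circulating proton laps the ring about 11,000 times per second; minutes of elapsed time correspond to millions of laps spent steering the beam, not to the beam crawling around the circle.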
Everybody hopes that the LHC returns good results about supersymmetry, and that you gain from it too (through your many magnificent articles). Let’s wait for it.
If dark matter particles were elementary, i.e. do not degrade to anything, or at least to anything not dark matter, and did not undergo the weak interaction, I assume the only way to locate them would be through recoil. Is that correct? If so, would the collision probability be high enough to get a significant number of collisions? Also, I believe the LHC discards a very great percentage of the data, simply because there is too much for the computational/storage power. If that is correct, would that not make it even harder? Finally, on dark matter, leaving aside how we do not know what it is, is there any reason, other than there is a very localised very high energy field, to expect it could be formed? Is there any reason to believe that matter that is subject to the weak interaction could generate something that is not?
One last question. The standard model has three families, but suppose that is not the limit. If not, a new series of mesons might be possible. In this case, how do we know what has been labelled the Higgs boson is not a spin zero meson with a new quark?
Caution #1: “If dark matter particles were elementary, i.e. do not degrade to anything” — there is no connection between whether an object is elementary and whether it decays to other objects. The muon is as elementary as an electron, but decays to an electron, a neutrino and an antineutrino.
By assumption, dark matter is extremely stable (since it has survived 10 billion years) and so the only way to infer its presence at LHC is through recoil. Your initial “if” statements aren’t necessary.
It is possible that the collision probability could be large enough for discovery, but by no means certain.
It is true that the fact that the LHC experiments have to discard much of their data could make the discovery more difficult. But of course people are thinking about this in detail. And whether there is a challenge depends on how the dark matter particles are produced, which could be in a variety of ways.
Whether dark matter can be produced is not dependent on whether it carries the weak nuclear force. It might carry some other force — such as the Higgs force — or some other force we haven’t yet encountered. The Higgs may potentially decay to dark matter. But there are many possibilities.
We know already that the Standard Model does not have a fourth family like the first three; a fourth would have had a huge effect on the Higgs particle. Meanwhile, if you did add a fourth family and try to make a neutral Higgs-like particle out of a new-quark/new-antiquark pair, then there would be a spin-one particle with almost the same mass, as in the eta_c vs. J/Psi or the eta_b vs. Upsilon. [Or if you wanted to make it out of a new quark and a known antiquark, there would be another electrically charged Higgs particle with almost the same mass… just as there are charged and neutral D mesons, and charged and neutral B mesons.] Lots of other things would go wrong too, but in any case, these problems are immediate death to the idea.
Yes, if there were a fourth family there would be a number of other particles required, and the standard model would require serious revision at best. However, having said that, if it were a new quark and a standard one, are the extra ones necessarily that close by? If I understand it correctly, we only just found what we call the Higgs. If there were further quarks, there should be a good chance of finding suitable further particles, but their “signal strength” or probability of formation is presumably very low. However, my real question was, how do we know what we found is the Higgs, as opposed to some other boson which might arise if the standard model were faulty? I guess it does not really matter, because if the argument is that there is something wrong with the standard model, we should wait and see what comes out of the enhanced LHC results. At the risk of being perverse, perhaps the best outcome for the standard model is that we don’t find anything new.
I’ve written about this extensively. See for instance http://profmattstrassler.com/2013/03/15/from-higgs-like-particle-to-standard-model-like-higgs/ and the links therein. You’re not focused on the right issues yet.
It seems we are driven by this notion that we need to go bigger and bigger to achieve more, like bigger and bigger colliders. However, the ICF (inertial confinement fusion) experiment at NIF is creating fusion in a very small space, with easier-to-control conditions and hence more accurate data.
I wonder: at the LHC, is the rate of collisions (curve fitting) more important than the conditions of impact, and is this the reason for the large scale of this machine? Can’t we use the high-speed particles coming at us from the sun to do these experiments in earth’s orbit in very small packages?
Hey Matt, if you need help to get back on the right track, not only could I help you after seeing your video: http://www.grassrootstv.org/view?showID=11527, but Matt Foley, motivational speaker can help you make something with what you have left after the Standard Model fails…
http://www.hulu.com/watch/4183
😉 best wishes and godspeed at verifying Higgs flavors…
Matt Phooley aside, we are all hoping that the Standard Model fails — since we already know (quantum gravity, dark matter, dark energy, neutrino masses) that it is wrong. We just don’t know if we’ll see signs of failure at the LHC itself.
If you are truly hoping it fails, then why are you so slow to test and learn the new unified physics that has been around for a few decades?
Many of us have known the solution to these problems for quite some time and have been waiting for the day to get your attention and to provide a little guidance.
“The Eagle has landed”.
Please provide a falsifiable prediction which differs from that of conventional physics.
Matt, where is the conceptual error?
I bow down before your exceptional genius.
It’s about doing the work. No bowing down needed. When you are ready…
Oh yes. I am sure it is all about doing the work.
Yes, Matt, it is about doing the work. If you won’t do it, we will.
I apologize in the sense that this is a reply to phxmarker. If there is an alternative route to calculate the properties of the proton, to be of any use it has to do something for the other elementary particles. Perhaps phxmarker could enlighten us. I recall once seeing equations that predicted very close to exactly the masses of the proton and neutron, but when I tried them out on a meson, the result was not even in the right direction compared with the proton. One can always get accidental coincidences.
Matt, I’ve thoroughly enjoyed your site for quite some time now. Thank you for taking the time to describe such complex physics in terms laymen can understand. It’s both interesting and insightful. I look forward to surprising discoveries from the LHC and your take on what we can infer about the standard model in their wake.
Also, thank you for your patience with the fringe science/crackpot theorists. You tolerate others with differing hypotheses with grace. While they can be a bit spam-y and at times arrogant, they can be a great source of entertainment. I often find myself researching their ideas to find out exactly why they aren’t taken seriously by mainstream science. Sometimes it’s fun to explore not only what is known about science, but why things cannot be as others imagine them to be.
Phxmarker,
Deriving a value for the proton radius is interesting but does not solve the puzzle. The puzzle is the small (~4%) discrepancy in the value of the proton radius as measured by electrons (~.877 fm) compared to (~.841 fm) when measured by muons. Under the standard model there is no reason that there should be any difference. More experimental testing is already scheduled.
As to the value derived from the simple and elegant equation you give, it could be significant or it could just be a coincidence. I think the occurrence of coincidence is often underestimated. There are quantities that can be equal to many decimal places and yet not be equal. They are given the rather unflattering but accurate name of ‘High-Precision Frauds’.
Lest any of you smugly sit back and smile: you may (I said may) find that some of your most cherished beliefs, like the exact equivalence between inertial and gravitational mass, the very basis of General Relativity, turn out to be ‘High-Precision Frauds’.
Such numerology (there are even more “astounding” examples) is interesting only if the agreement is really good, say, to 10 significant figures. Barrow and Tipler, in their famous book, actually calculate how likely such coincidences are depending on how complicated a formula is (how many terms, which powers, functions, etc allowed, and so on). Even then, one needs a theory to back it up.
In your case, the agreement is not particularly good (how many significant figures?), it has no theory behind it, and it doesn’t work for all particles. Unless you have a good reason why it should work for the proton but not for other particles, why should anyone believe you?
Why the radius appears to be different in different experiments is another question.
It’s worse than this: the radius is almost certainly NOT the problem… so all these people, mainly crackpots, who are recalculating the radius in all sorts of fancy ways are missing the point completely. The radius is not what one measures; it is what one extracts after making a set of other assumptions. It is likely that one of those assumptions is wrong. http://profmattstrassler.com/2013/01/31/the-puzzle-of-the-proton-and-the-muon/
A famous example: the proton-to-electron mass ratio is very close to 6*pi**5. There was a refereed-journal paper on this in the 1950s, IIRC, which consisted of just this one sentence (or one similar to it). Unfortunately, I can’t find the reference now.
Suppose the multiplicative constant is a free parameter. And there is not just pi but also e. And the exponent could be anything. How many possible numbers could I thus produce (limiting the constants to, say, less than 10)? How many ratios are there? What is the probability that I match one of those to a given accuracy?
Again, for anything like this to be considered seriously at all, the match has to be really good and/or one needs a theory.
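The cheapness of such coincidences is easy to demonstrate. A sketch (the formula family c·π^a·e^b with small integer parameters is my own illustrative choice): enumerate these “simple” expressions and count how many land within 0.1% of the proton-to-electron mass ratio. The famous 6·π^5 is among them.

```python
import math
from itertools import product

TARGET = 1836.15267  # proton-to-electron mass ratio (CODATA, rounded)

# Enumerate "simple" formulas c * pi**a * e**b with small integer parameters
# and record the near-misses.  With enough free knobs, matches are cheap.
hits = []
for c, a, b in product(range(1, 10), range(-6, 7), range(-6, 7)):
    value = c * math.pi**a * math.e**b
    if abs(value - TARGET) / TARGET < 1e-3:
        hits.append((c, a, b, value))

for c, a, b, v in hits:
    print(f"{c} * pi^{a} * e^{b} = {v:.3f}")
print(f"{len(hits)} matches within 0.1%")
```

Even this tiny search space produces the 6·π^5 “prediction” (off only in the fifth significant figure) without any physics at all, which is the point: absent a theory, a match of a few significant figures carries essentially no information.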
There is a deep theory behind MpRp=4LM, it is the work of Elizabeth Rauscher and Nassim Haramein – I thought that was obvious. A deep rich theory.
Yes, a deep, rich, and fundamentally misguided theory. That’s three strikes; three major conceptual errors. You’re out.
Matt, where is the conceptual error?
I am simply trying to help you get out of your confusion:
Matt Strassler | April 6, 2015 at 11:16 AM | Reply
I don’t know. That’s part of why this is so exciting; we’re very confused right now.
Thanks for the update, Professor. I am very glad to hear your voice again. And my special thanks for your return to public service; not that many great professors nowadays do it for the common public. Eager to follow your blog again. Your fan, bob-2
Considering that the measured decay time of the Higgs is less than the theoretically predicted decay time, how will this be resolved? Via a new particle that acts as a catalyst or new physics or what?
Phxmarker – what are you talking about? The decay width (or time) of the Higgs has never been measured. The width is measured to be less than about 17 MeV, and the prediction is 4 MeV. Those aren’t inconsistent.
I meant the decay time of the Higgs to the quarks or gluons or whatever, to the gamma rays that the detector detects. That measured total delay is less than what the theory predicts. The Standard Model is in Big Trouble.
Wrong again. I wish you were right.
So if it is not less, than it is more time?
Your first statement is wrong. The Higgs decay time has not been directly measured.
Yes, you are correct; however, it is well known there are 4 detectors, and one of them is used to detect the gamma rays after the Higgs decays… this is one of the main things to be looking for: does the theory match the data for decay rates or reaction times…
Production rates or yields are one thing, decay times are another, and the theory is only as good as its predictive capability and how well it measures up against the data.
Professor Leonard Susskind makes it very clear here, especially at 1:07:00.
https://youtu.be/JqNg819PiZY?t=1h7m
I like, at times, to be surprised, this is one of them.
Hi Matt,
Patience is a virtue, as they say. But very exciting times ahead. I do hope a paradigm shifting discovery is just around the corner. Which either way physics needs, and possibly humanity.
Looking forward to your blogs on the matter as always 🙂
Nice, and thanks! The LHC is the veritable Particlebuster; can’t wait until it takes out the big guns and crosses the beams. Having an eyewitness helps in appreciating the immense effort.
“though as usual in Europe, with rather too much supersymmetry and not enough of other approaches to the naturalness problem.”
If we put the LHC aside for a while (sorry!), I am curious whether theoreticians accept the new Fermi LAT observations of a smooth space at or above Planck scales any better than the earlier supernova photon timing & polarization data. Those were marginal, based on small-number statistics; there were theoretical outs (such as that the polarization constraints were based on supersymmetry); and it seems they were never popular.
“This marks the first time a study has successfully constrained the possible stochastic effects to below the Planck scale. … And further observations of other gamma ray bursts—both by the Fermi LAT and also the higher-energy and sensitivity Cherenkov Telescope Array—will be able to use the same technique to constrain the Planck scale effects down even further.”
[ http://arstechnica.com/science/2015/04/searching-for-a-quantum-foam-bubbling-through-the-universe/ ]
I am asking because Nima Arkani-Hamed posted a web seminar last year, where he described supersymmetry as a remaining possibility and a way to regulate quantum fluctuations of space.
As I remember it:
– Supersymmetry has a free “slot” because, for some dimensional reason, elementary particles can have spin 0, 1/2, 1, 3/2, 2 and no more. Standard particles take slots 0, 1/2, and 1, and gravity slot 2. Supersymmetry particles would take slot 3/2, or nature would waste an option.
– Small dimensions that supersymmetry would inhabit would allow small quantum fluctuations. Those would entangle (I think it was; as supersymmetry is putting particle dimensions on par with space dimensions?) with quantum fluctuations in space and constrain those to be small-ish too.
I have not yet studied these Fermi-LAT-based results, so I can’t comment intelligently at this point… except to say that complicated “space-time foam” need not leave any mark on gamma-ray burst data.
How can dark matter have anything to do with the LHC when dark matter is a placeholder for unknowns in the cosmological Einstein Field Equations?
I agree, this LHC machine is not the machine for this work.
phxmarker.blogspot.com
I wouldn’t use the term “placeholder”; it applies to mathematical equations, but this is physics.
Matt has described above why dark matter is an observation. (At least as long as you agree with hypothesis testing, a basis for measurement theory.) It isn’t very informative (e.g., particles, but how massive?) or simple (e.g., tested as a constraint rather than in an exclusive experiment), but it isn’t an ‘unknown’ entity.
I think you’re confusing this with dark energy.
Thanks for your updates — I appreciate them. Good luck
“Well, stay tuned to this blog, and you’ll get a much more accurate story.”
That’s good to hear. To echo a comment above—glad you’re back!
This is about the LHC restarting and adding new results. Can you answer these questions, please? 1) Is it likely that a Higgs 2 particle will be found in this energy range, and only the Higgs 2? 2) There are NO experiments that show that there is Dark Matter; is that true? If not, what experiments or theories lead one to believe there is Dark Matter? Aren’t there some 50 theories of gravity, all newer than General Relativity (like MOND, MOG, etc.), that no longer require Dark Matter, and thus imply that Dark Matter does not exist? 3) And there are some 50 theories that are newer and do require Dark Matter, for a total of about 100 theories; does this imply that General Relativity has problems, but is still the major theory requiring Dark Matter? 4) So looking for Dark Matter with the LHC is somewhat of an afterthought?
1) It is impossible to place odds on such a thing.
2,3) There are a huge number of observations of the cosmos, of a very wide variety, that strongly suggest that dark matter exists. The new theories of gravity you refer to have many profound problems of their own; it’s not enough to do away with dark matter, you have to actually avoid conflict with other data and theoretical inconsistencies, which general relativity + dark matter succeeds in doing. And don’t make the mistake of counting theories: 1 good theory is worth a billion bad ones.
3) You don’t need the details of general relativity to see that there’s a need for dark matter or a modification of Newton’s laws. You just need to look at how stars move in galaxies, how galaxies move in clusters, and how gravitational lensing works. For most of this, Newtonian physics suffices, along with the simplest form of the Equivalence Principle. Meanwhile, general relativity has no other known problems; it succeeded in predicting the rate at which a pair of orbiting neutron stars would lose energy, see http://en.wikipedia.org/wiki/PSR_B1913%2B16 . So no, it’s not about general relativity; and also, let’s not forget that alternatives to general relativity have to explain general relativity’s successes.
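To illustrate the point about how stars move in galaxies, here is a toy Newtonian calculation (the numbers are rough, roughly-Milky-Way-sized, and purely illustrative, not a fit to any real galaxy): if all the mass were the visible mass, orbital speeds should fall off at large radii, whereas observed rotation curves stay flat, which in Newtonian terms requires extra unseen mass.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def circular_velocity(enclosed_mass_kg, radius_m):
    """Newtonian circular orbital speed: v = sqrt(G * M(<r) / r)."""
    return math.sqrt(G * enclosed_mass_kg / radius_m)

# Toy, illustrative numbers only:
M_visible = 1.0e41   # ~5e10 solar masses of visible matter, in kg
kpc = 3.086e19       # one kiloparsec in meters

# If essentially all the mass sits well inside radius r, then v falls
# like 1/sqrt(r): quadrupling the radius halves the orbital speed.
v_8kpc = circular_velocity(M_visible, 8 * kpc)
v_32kpc = circular_velocity(M_visible, 32 * kpc)
print(v_8kpc / 1e3, v_32kpc / 1e3)  # km/s; the outer speed is half the inner

# Observed rotation curves instead stay roughly flat (~220 km/s for the
# Milky Way). Keeping v constant forces M(<r) to grow linearly with r,
# so the enclosed mass at large radius must exceed the visible mass:
v_obs = 220e3  # m/s
M_needed_32kpc = v_obs**2 * (32 * kpc) / G
print(M_needed_32kpc / M_visible)  # several times the visible mass
```

The only physics here is Newton's law of gravity and circular orbits, which is exactly the point: no details of general relativity are needed to see the mismatch.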
4) Not an afterthought. Just one of the things that’s worth doing, but not guaranteed to pay off. And in doing these measurements, you might discover something else along the way.
CERN has to take into account the Public interest, and this lies in Dark Matter.
So that is what dictates the selling strategy.
Did you think for a minute that we are going to concentrate in this direction only? This search for DM will be one of dozens of searches we will do. It simply sells well.
Eilam
Hi Eilam! Thanks for your message. No, I did not think for a minute that this is what you would concentrate on, nor did I say so; I have extremely high respect for my experimental colleagues, as you know.
But I am quite interested to read your explanation… that, in your opinion, the “dark matter” story stems from a conscious decision, by leading scientists, to sell the LHC in a somewhat inaccurate way, simply because the public is interested in dark matter.
In the era of blogs, selling a scientific machine inaccurately is guaranteed to backfire, because the truth comes out quickly and violently. So I hope your opinion isn’t true.
I see an analogous situation in astrobiology, which is interesting to me (and hence cosmology, and hence …). NASA’s Hubbard sold it as “follow the water”, and that has been immensely fruitful for focusing science, politics, and public interest.
Yes, some get annoyed that Curiosity isn’t equipped to look for extant or extinct life. And some will do the same fruitless pattern-searching on images that one can see particle cranks do on energy levels and whatnot.
Without statistics it is hard to tell whether public interest is smaller than it ought to be. But it has gained momentum from successful missions, exoplanet finds, and the commercialization of space. So being up front isn’t too damaging here.
I hope the scientists at CERN do not follow the failed path NASA has taken in “selling” their programs to the public. I worked on the manned space program for close to 20 years before I opened my eyes to the BS NASA was selling the public so it could keep its annual budget growing. The increasing thrust expelled from the ever-upgrading, ever-bigger rockets is very much what NASA has become: “hot air”. All because they feel the need to sell it to the public takes top priority. And look where they are now: in total disarray, scrambling to prove to Congress, the American public, and more importantly themselves, that they still have it.
Science and technology do not need Madison Avenue to butt in, just good scientists and engineers. Good day.
I hope they find a particle which is truly staggering, one which can only be representative of the vacuum energy of space and which is constantly coming into existence. My follower in the USA made a prediction on my behalf last year.
So glad you’re back, Matt! Nobody explains things like you, despite the fact that there are other worthy physics bloggers. My main interest is the naturalness problem, which has bugged me ever since I first read about it. My gut reaction is that there has to be a better explanation than those which have been suggested so far. And I’m crossing fingers and toes that the LHC starts producing interesting results again ASAP.
Thanks for the kind words!
“I second that emotion ….”
So if I were a betting man, where should I put my money on which particle will be discovered next?
A new type of Higgs? (Or maybe I should just keep my money in the bank.)
I don’t know. That’s part of why this is so exciting; we’re very confused right now.
if you would like a little help out of that confusion, there are those of us willing to help…
The ego has landed.
Hi Matt.
Why does it look as if you doubt that the machine will work?
Do you have reasons?
What the LHC operators are doing is unprecedented; they haven’t driven the magnets so hard before. There can certainly be some initial problems. All it takes is one little problem that requires a fix involving warming up the machine, and we’re automatically looking at a two month delay. Also it was nine days after the initial start in 2008 that the little incident happened that set them back more than a year (though they made up a lot of the time later.) So let’s not forget that the magnets and the beams are extremely challenging to operate and control. When they are successful, the LHC operators deserve an enormous amount of respect and credit; and even if there are problems, one should remember the difficulty of the task.
a) For a “machine” that big, with so many components spread over such a large area, how long does it take to bring it up or down to operating temperature?
b) What are the margins on the operating temperature?
c) And finally, do they really run it at stable temperatures, or is that a disturbing limitation of the collider? In other words, are all collisions produced under the same conditions?
This depends on what part of the machine you’re talking about. (What temperature does an espresso machine work at, on average?)
The actual colliding part of the machine doesn’t really notice temperature, being a very hard vacuum. Most of the rest of the machine is around room temperature. The magnets themselves, however, need to be very cold to work, and can take months to warm up or cool down, not least because they use liquid helium, a persnickety substance if ever there was one. A brief summary and video can be found here: http://home.web.cern.ch/about/updates/2014/12/lhc-filling-liquid-helium-4-kelvin
Matt: A question. Can you give some idea of what is meant by the magnets being driven hard? Are they increasing the current and the number of coils relative to the previous run?
Increasing current. Coils are unchanged.
The focus is on dark matter because it’s ‘sexy’ to the press. The politically aware people at CERN are aware of this and are happy to go along. The man on the street isn’t interested in the properties of the Higgs, that’s old news. Now if CERN can find hidden dimensions the money for the next collider is assured!