Of Particular Significance

AMS Presents Some First Results on Cosmic Rays and Dark Matter

POSTED BY Matt Strassler

ON 04/03/2013

The Alpha Magnetic Spectrometer [AMS] finally reported its first scientific results today. AMS, a rather large particle physics detector attached to the International Space Station, is designed to study the very high-energy particles found flying around in outer space. These “cosmic rays” (as they are called, for historical reasons) have been under continuous study since their discovery a century ago, but they are still rather mysterious, and we continue to learn new things about them. They are known to be of various different types — commonly found objects such as photons, electrons, neutrinos, protons, and atomic nuclei, and less common ones like positrons (antiparticles of electrons) and anti-protons.  They are known to be produced by a variety of different processes. It is quite possible that some of these high-energy particles come from physical or astronomical processes, perhaps very exciting ones, that we have yet to discover. And AMS is one of a number of experiments designed to help us seek signs of these new phenomena.

The plan to build AMS was hatched in 1995, and the detector was finally launched, after various delays, in 2011, on a specially-ordered Space Shuttle mission. Today, Sam Ting, winner of the Nobel Prize for a co-discovery of the charm quark back in 1974, presented AMS’s first results — a first opportunity to justify all the time, effort and money that went into this project. And? The results look very nice, indicating the AMS experiment is working very well.  Yet the conclusions from the results so far are not very dramatic, and, in my opinion, have been significantly over-sold in the press. Despite what you may read, we are no closer to finding dark matter than we were last week. Any claims to the contrary are due to scientists spinning their results (and to reporters who are being spun).

The logic behind AMS is this. If dark matter is indeed made from particles, those particles, as they drift through space, will occasionally encounter one another. And when two dark matter particles collide, if they are of an appropriate type (for instance, if dark matter particles are their own anti-particles, as is also true of photons, gluons, Z particles and Higgs particles), they may annihilate into other, more familiar particles, including (but not limited to) electrons and positrons.  Assuming the dark matter particles are rather heavy, their large mass-energy [i.e. E = mc² energy] will get transformed in the annihilation process into large motion-energy of the lighter-weight, familiar particles. In other words, the familiar particles produced in the annihilation of dark matter particles will be high-energy cosmic rays, of the sort that AMS is designed to measure.

Thus annihilation — or, alternatively, if they can occasionally occur, decays — of dark matter particles can serve as a new source of high-energy particles, including perhaps positrons, and also perhaps anti-protons or even entire anti-nuclei.
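
As a worked example (with a purely hypothetical mass, chosen only for illustration): suppose dark matter particles have a mass M of 400 GeV/c², and that two of them, drifting slowly through the galaxy, annihilate into an electron-positron pair. Since the annihilating particles are nearly at rest, energy conservation gives

$$E_{e^+} = E_{e^-} \approx M c^2 = 400\ \mathrm{GeV},$$

vastly more than the positron’s own rest energy of about 0.0005 GeV: a high-energy cosmic ray of just the sort AMS is built to measure.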

To see a hint that this may be happening, the simplest trick is to count, in the cosmic rays, the number of electrons and positrons of a certain energy, and see what fraction of them are positrons. Naively, without a source like dark matter annihilation, one would expect this “positron fraction” to become gradually smaller at high energy. That is because electrons are abundant in the universe and are rather easily accelerated to high energy, whereas most positrons are expected to be produced only as a by-product — when the high-energy electrons hit something out in space — which means the positrons would generally have lower energy than the electrons that were needed to produce them. But dark matter annihilation or decay would typically produce as many positrons as electrons, and with the same energies; and the typical energy would be comparable to or a bit lower than the mass-energy of the dark matter particles. So if dark matter annihilation or decay is occurring, then, at energies near to (but below) the mass-energy of dark matter particles, the positron fraction might begin to grow larger, instead of smaller, with increasing energy.
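
To make these two behaviors concrete, here is a small numerical sketch in Python. Every number in it (the spectral slopes, the normalizations, and the 600 GeV dark matter mass) is invented purely for illustration, not taken from any experiment; the point is only the shape of the curves: a secondary-only positron fraction that falls with energy, versus one that turns upward below the hypothetical dark matter mass-energy.

```python
import numpy as np

# Energies at which to evaluate the toy positron fraction (GeV).
E = np.logspace(0, 3, 200)   # 1 GeV to 1000 GeV

# Toy "conventional" fluxes: primary electrons falling roughly as E^-3.2;
# secondary positrons (made when those electrons hit interstellar material)
# falling faster, here as E^-3.6.  Normalizations are arbitrary.
electrons_primary   = 1.00 * E**-3.2
positrons_secondary = 0.08 * E**-3.6

# Hypothetical dark-matter component: equal e+ and e- fluxes with a hard
# spectrum, cut off near an invented dark matter mass-energy of 600 GeV.
M_dm = 600.0  # GeV -- purely illustrative
dm_each = 2e-4 * E**-2.0 * np.exp(-E / M_dm)

def positron_fraction(with_dm):
    """Positron fraction e+/(e+ + e-) for the toy model."""
    e_plus  = positrons_secondary + (dm_each if with_dm else 0.0)
    e_minus = electrons_primary   + (dm_each if with_dm else 0.0)
    return e_plus / (e_plus + e_minus)

for energy in [1, 10, 100, 500]:
    i = int(np.argmin(np.abs(E - energy)))
    print(f"{energy:>4} GeV: fraction without DM = {positron_fraction(False)[i]:.3f}, "
          f"with DM = {positron_fraction(True)[i]:.3f}")
```

Without the dark-matter term the fraction falls steadily; with it, the fraction turns upward well below the assumed mass-energy, which is the qualitative signature described above.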

Thus, an increasing positron fraction is a signal of dark matter particles annihilating or decaying into known particles. But it isn’t a smoking gun.  Why not?  Many reasons. There’s the potential for false negatives. Dark matter may be made of particles that don’t annihilate or decay. Dark matter may be made of a small number of very heavy particles, in which case annihilation may occur but may be too rare to contribute many cosmic rays. Or annihilation may not produce that many electrons and positrons compared to other particles that we would already have observed.  (In fact, the latter two issues typically are the case for many popular models of dark matter, such as those from simple variants of supersymmetry.)  Meanwhile, false positives are possible too: there might be astronomical processes in nature that we are unaware of that can also create high-energy positrons; pulsars have been suggested as a source. So when interpreting the results from AMS and similar experiments, appropriate caution and clarity of thought [not universally observed in today’s press articles] are always necessary.

The positron fraction was already examined by the PAMELA satellite, which, though smaller and less powerful, beat AMS into space by several years.  Like AMS, PAMELA can measure the energies of particles that enter the satellite, and can distinguish electrons from positrons.  Famously, PAMELA did indeed find a positron fraction that is growing, over energies of 10 – 100 GeV or so.  (Their data are the blue squares in Figure 1.) The FERMI satellite has unexpectedly and cleverly managed to measure this too, up to 200 GeV.  (Their data are the green triangles in Figure 1; note the large uncertainties given by the green band.)  So we already knew AMS would find such an increase!  We already were confident that there is an unknown source of positrons above 10 GeV.  What we wanted to know from AMS was whether the effect continues at even higher energy, well above 100-200 GeV, and whether their more detailed observations would give us insight into whether this increase is due to a new astronomical effect or a new particle physics phenomenon.

Fig. 1: AMS results (red dots) for the positron fraction as a function of energy, compared to previous results, of which the most important are PAMELA (blue squares) and FERMI (green triangles). Within the stated errors, PAMELA, FERMI and AMS are essentially consistent; all show an increasing positron fraction above 10 GeV and as far up as 200 GeV or so.  Whether the increase continues above this point isn’t yet clear.

Well? What did AMS say?  Their data are the red dots in Figure 1.  What do they mean?

First, AMS confirms what PAMELA and FERMI observe, that the positron fraction is increasing for some reason (though amusingly AMS phrases this “confirmation” as a “discovery”).  The detail with which they make this measurement is very impressive!  Look at all the red dots and how small the uncertainties are compared to previous measurements! Unfortunately, this detail does not reveal anything striking; there are no interesting features in the data, which instead are rather smooth.  Perhaps the most interesting aspect is that the rate of increase of the positron fraction appears to be slowing down gradually as the energy increases. But this doesn’t have any obvious meaning, at least not yet.

Second, AMS is able to go a little higher in energy than PAMELA and FERMI. But not that much — only to 250-350 GeV — and not with enough data, at the moment, to really give us any insight as to what is going to happen to the positron fraction at higher energies. So we don’t really know much more about the high-energy behavior than we did before.

Third, AMS is working well and will be able, eventually, to give us more information. But it will take time. They need both to understand their experiment better and to collect more data. My guess? It will not be months, but years. Maybe 3. Maybe 5. Maybe 10. I don’t know.  Let’s hope the space station doesn’t have any glitches or serious difficulties over that time.

One problem for the higher-energy measurements that are yet to come is that the systematic uncertainties on the measurements are becoming larger.  This is due mainly to the difficulty of measuring the charge of the particles — which you absolutely have to measure if you are going to distinguish electrons from positrons. For instance, AMS reports that for positrons and electrons in the range of 206 – 260 GeV, the positron fraction is 15.3% with a statistical uncertainty of ±1.6% and a systematic uncertainty (dominated by the charge measurement) of ±1.0% — which is about a 6% relative systematic uncertainty on the fraction itself. In the highest available range of 260-350 GeV shown today, the fraction is 15.5% with a statistical uncertainty of ±2.0% and a systematic uncertainty of ±1.5% — a 10% relative systematic uncertainty.  At still higher energies the relative uncertainty will get worse: a particle’s electric charge is determined by measuring which direction the particle’s trajectory bends in a magnetic field, but the higher the energy, the straighter the trajectory becomes, so the challenge of determining the bending direction becomes greater. Can AMS meet this challenge? And can they convince us that they have met this challenge? That will probably be the question for AMS in the coming decade.
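
A short numerical sketch of both points. The fractions and uncertainties are the AMS numbers quoted above; the magnetic field and tracker lever arm, by contrast, are round illustrative values (not official AMS specifications), used only to show how quickly a track’s bending sagitta shrinks with energy.

```python
# Relative systematic uncertainty on the positron fraction, using the
# numbers AMS quoted above (fraction, statistical and systematic
# uncertainties, all in percentage points).
for label, frac, stat, syst in [("206-260 GeV", 15.3, 1.6, 1.0),
                                ("260-350 GeV", 15.5, 2.0, 1.5)]:
    print(f"{label}: relative systematic uncertainty = {syst / frac:.1%}")

# Why it gets harder at higher energy: over a lever arm L in a field B,
# a track of momentum p bows out by the sagitta s = 0.3*B*L^2 / (8*p)
# (s in meters, B in tesla, L in meters, p in GeV/c, unit charge).
# B and L below are round illustrative numbers, not the AMS geometry.
B, L = 0.15, 1.0  # tesla, meters (assumed for illustration)
for p in [10, 100, 300, 1000]:  # GeV/c
    sagitta_um = 0.3 * B * L**2 / (8 * p) * 1e6  # micrometers
    print(f"p = {p:>4} GeV/c: sagitta ~ {sagitta_um:.0f} micrometers")
```

With a silicon tracker’s point resolution of order ten microns, a sagitta of only a few tens of microns at several hundred GeV makes plain why the charge-sign measurement, and with it the systematic uncertainty, becomes the limiting issue.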

A final point that I don’t yet understand in detail: AMS finds that the positrons seem to come equally from all directions. That’s somewhat important, in that astronomical causes of the growing positron fraction might not be distributed equally across the sky, in contrast to dark matter. But I think AMS’s result is still too crude to tell us much at the moment.

To conclude, a word of caution: no matter what AMS finds, unless it is hugely spectacular, it will not be easy to settle the controversy over the source of the positrons. Current consensus among experts is that it is very unlikely we will see a smoking-gun of dark matter from any experiment like AMS.  There’s nothing in today’s data, nor in the projection of that data into the future, that suggests we’re on the verge of a definitive discovery.  We can hope for a surprise, though.

(You can read a similar point of view at Resonaances.)


144 Responses

  1. Hey there! I’m at work browsing your blog from my new iphone!
    Just wanted to say I love reading through your blog and look forward to all your posts!

    Keep up the excellent work!

  2. IF DARK MATTER IS EVERY WHERE, THEN THE FRICTION OF MATTER ENERGY CANNOT CREATE DARK MATTER ENERGY, SEEING THAT DARK MATTER IS COUNTLESS TIMES MORE THAN FRICTION ENERGY. DARK ENERGY GIVES LIFE TO FRICTION ENERGY! NONE DECAYS BUT ARE ABSORBED INTO THE OTHER BUT STILL HAS A RESIDUE OF ITS OWN. THE ANSWER TO THE UNIVERSE MYSTERY IS TO UNDERSTAND THE CONNECTION BETWEEN THE MIND AND THE BODY

  3. The original idea of an ether is that at the time, it was believed that waves needed a medium to travel through, just like sound waves need atoms to travel through (air, solids, etc.). We now know that sound can be described through the particle-like properties of collective vibrations of atoms or molecules, called phonons, in gases, crystals, metals, etc.

    Michelson in 1881, and then Michelson and Morley in 1887, tried to detect the movement of the earth in relation to the ether (due to galilean relativity between the systems of references, earth and ether, with the ether being the medium through which light would travel), but there was no evidence of the existence of such a reference framework. Similar (more accurate) experiments have been conducted over the years since then, and the results are the same: there is no such medium.

    When Einstein completed his GR theory, one clear consequence of it is that space-time and the warping of its geometry by the content of momentum-energy in the universe define physical properties of the universe, so he thought that it was ironic that, in a sense, space-time was a way for the ether to come back through another set of his ideas; but this is a different type of ether, as electromagnetic waves do not need a medium to travel through.

    So, even though Einstein did mention something he called ether in relationship to GR and space-time, he was thinking of something different from the original idea of ether.

    Kind regards, GEN

    1. “Michelson in 1881, and then Michelson and Morley in 1887, tried to detect the movement of the earth in relation to the ether (due to galilean relativity between the systems of references, earth and ether, with the ether being the medium through which light would travel), but there was no evidence of the existence of such a reference framework.”

      Not true: they measured approx. 4 km/s consistently. People argued about the statistical accuracy, but Dayton Miller’s follow-up work over the next 30-40 years clearly measured ~10 km/s, and he was never proven incorrect (regardless of the back-stabber Shankland).
      http://www.orgonelab.org/miller.htm

      “Similar (more accurate) experiments have been conducted over the years since then, and the results are the same: there is no such medium.”

      Similar, but not the same. They were all performed in high vacuum in thick sealed steel vessels. Galaev argues that this is not the correct mode to detect the ether. All experiments performed in the atmosphere did register responses.

      “…[Einstein] … thought that it was ironic that in a sense, space-time was a way for the ether to come back through another set of his ideas, but this is a different type of ether, as electromagnetic waves do not need a medium to travel through…”

      As you can see, this is really not as established as you think. Jettisoning ether temporarily solved the Michelson-Morley results, but ironically, now Planck reinvigorates them (indirectly).

      1. The dark matter would be linked to the conservation of CP in strong interactions; hidden in it would be the violation of PT, which would permit the discreteness and continuity of spacetime. This implies the appearance of the invariance of Lorentz, which gives us the symmetry of spacetime, but asymmetry under rotational invariance for spinors in transformations of 360 degrees; there the vector fields u^2(x) are the preferred scheme. Then comes the appearance of a second dimension of time, which is the axions; that is the supersymmetry of the violation of CP. Then space and time at relativistic speeds end up annulling time dilation and the contraction of spacetime, by the existence of reversion of PT, not CPT.

      2. I believe that the dark matter has intrinsic relations with the aether and the axions that are compatible with the conservation of CP in strong interactions, which leads us to think that time is split into two opposite orientations (time with 2 dimensions), curving space and generating the spacetime continuum. The conservation is given mathematically by the violation of PT; then appears the antisymmetric metric tensor, which implies the future and past as two curvatures that connect in the “present”. Both are non-local hidden variables.

  4. The process of gathering and enhancing knowledge in the empirical sciences is usually an incremental one, as it grows over time.

    It was with Lavoisier’s meticulous measurements of masses in his chemical experiments that modern chemistry was born, and chemists were among the first modern scientists to accept the idea that atoms might exist, as they are a simple and “round” explanation for chemical reactions and for how the masses of elements involved in any given reaction always combine in ratios of small integers.

    Then, in the second half of the 19th century, it was Maxwell and Boltzmann who spearheaded the idea of statistical mechanics to explain macroscopic thermodynamic properties of gases. This new concept of statistical mechanics used for thermodynamic properties also worked perfectly with the concept of atoms and molecules.

    By the 1870s, even though there was no way to prove that atoms and molecules really existed, the idea could be used by many different scientific theories that made very good predictions in very different contexts.

    Some scientists were strong advocates against the idea of atoms, deriding it as ridiculous, since atoms were so small that there would not be a chance to get evidence of their existence. One very notable advocate of this view was Ernst Mach.

    In fact, he proposed a very interesting principle: that a theory should never be based on a concept whose existence we can’t detect or measure.

    This principle exerted a strong influence on Einstein by 1905, when he proposed in Special Relativity that there was no ether, since we could not detect it.

    Also in 1905, Einstein published another paper, the paper that explained Brownian motion, where he presented a very compelling demonstration that atoms and molecules did exist, and he even correctly predicted the order of magnitude of the size of molecules.

    At that point, the argument for the concept of atoms and molecules received spectacular support, so it soon became mainstream.

    Again, it is the incremental process of gathering traction and experimental evidence, and the retrofit of that evidence into improved versions of a theory, that turns a theory into the mainstream.

    It is not about the “politics” of science and “incumbent” theories winning over “runner-up” theories; it is not about the ideas proposed by “established authorities” winning over theories proposed by a pip-squeak challenger scientist: it is about whichever theory best follows and applies Ockham’s Razor principle.

    Kind regards, GEN

      “This principle exerted a strong influence on Einstein by 1905, when he proposed in Special Relativity that there was no ether, since we could not detect it.”

      There is a fallacy here. Ether was detected, in fact. Michelson-Morley detected a velocity through the ether of ~4 km/s. Now, due to preconceived notions, they expected 30 km/s, so, using statistical arguments, they rejected the results. Dayton Miller for 40 more years consistently measured velocities in the ~10 km/s range using much more sophisticated equipment and techniques. Many other experiments could be interpreted as measuring ether: Sagnac and Michelson-Gale, to name a couple.

      So “science” rejected a repeatable, verifiable measurement, not because they found cause to, but rather because the measurement did not fit in the little box they built for their ideas. This is EXACTLY what scientists like Mach were warning against (not that he was always correct).

    2. And as a follow up, after Einstein developed the general theory of relativity, he said ether did exist; it just was not “ponderable”.

  5. Dark Matter has unusual and interesting properties:

    1. It keeps the Big Bang theory viable (well, up until the Planck results verified that the Copernican Principle is invalid).
    2. It keeps $billions flowing to big bang and related research.
    3. It is much like Einstein’s GRT ether: “non-ponderable”
    4. It is a collective figment of imagination.
    5. It fits in Mendeleev’s periodic table in period -1, right between cosmic cotton candy and mystical magical gumdrops.

    1. In point 1 you are excited about a misinterpretation of Planck’s results, and in point 2 you complain about all the money that was spent on research like Planck. You might try consistency; it might help your thinking.

      1. So there is no issue with the large scale anomalies (read: correlation of the largest structures in the visible universe) to our ecliptic and equinoxes (i.e., the universe on its largest scale is pointing directly back at little old, insignificant us)? No issues with a preferred direction in space (independently verified outside of the CMB data, i.e., galaxy spins, polarization, etc.)?

        I am not against research. But only one idea, now practically discredited (especially given that Planck was to referee WMAP, and indeed verified its findings), gets funding.

        1. In both cases, it is not observable (aether or dark matter), and a theory (classical cosmology or big bang cosmology) depends on its hypothesized existence to survive.

        2. In a sense; though the neutrino has been observed, and germanium also has been observed, and can be probed, broken into smaller units, transmuted, etc.

          1. Indeed, but for quite some time they were not. And it may well be that we are in that ‘unobserved but predicted’ stage for dark matter, that in decades to come we will be able to create and manipulate dark matter particles much as we do today with neutrinos.

            Ether was wrong, neutrinos right; we can be pretty sure of that now, but in both cases there was a period of time when it could have been either way. It would have been quite within reason to scoff at the neutrino as an excuse to explain the missing energy in beta decay. A particle that props up the theory, that has no charge and no mass, and which nobody had yet detected? Very convenient.

          2. Germanium is not a particle; it is an element of the periodic table, chemically similar to carbon (same number of valence or outer-layer electrons, s²p²).

            Its existence was predicted by Mendeleev more than 15 years before it was discovered, and his predictions of various physical and chemical properties of Germanium, including its correct place in the periodic table, were very accurate.

            With his deduction of the periodic laws and the proposal of the periodic table, Mendeleev is clearly a major precursor of Quantum Mechanics, and his ideas and predictions gave a lot of insight to the first quantum models.

            Kind regards, GEN

          3. “Indeed, but for quite some time they were not. And it may well be that we are in that ‘unobserved but predicted’ stage for dark matter, that in decades to come we will be able to create and manipulate dark matter particles much as we do today with neutrinos.”

            If dark matter exists. Germanium was predicted because there was a logical, predictable, and reproducible series of elements already discovered with an obvious hole where Germanium fit. Similarly so for the neutrino.

            Germanium fit within the developing experimental and theoretical field of chemistry, and the neutrino in particle physics. When people start predicting “dark matter and energy” based on extensions of thought experiments into the observable universe, this is totally another thing. The comparison is not valid. There is only one universe (some would say one multiverse), and it cannot be tested, probed, categorized (other than theoretically), etc.

            “Ether was wrong, neutrinos right; we can be pretty sure of that now, but in both cases there was a period of time when it could have been either way.”

            I have to disagree that ether was “wrong”. This has not been established. First, quantum mechanics tells us that space is anything but an empty vacuum. The vacuum has structure, and that structure is rightfully called “ether”, whether you want to call it quantum foam, a plenum, whatever. A vacuum has a density of 0, while the Planck particle has a theoretical density of 10^93 g/cc. Similar to the 120 orders of magnitude discrepancy in “dark energy”. Second, tell me that dark energy and matter are not the New Ether (TM). Finally, it is true that science jettisoned the ether at the beginning of the 20th century, but not for good reason. Only because of the failure of Michelson-Morley, and confusion from a series of previous experiments in which the results did not match expectations. Note that I did not say that the results did not make sense, nor that the results were wrong. In Michelson-Morley, a velocity through the presumed ether of ~4 km/s was measured, while 30 km/s was expected. Dayton Miller expanded on MM work by an order of magnitude or more in precision and understanding and consistently registered ~10 km/s. Dayton Miller at Case University has been relegated to a guy who x-rayed his arm (go to the physics building lobby at Case. Next to a mock-up of the MM is a picture of DM x-raying his arm). So, science jettisoned ether, just as quickly as they accepted Copernicus in the 16th century, without really having laid a scientific foundation to justify the change. Since then substantial effort has been expended in scientific research with Einstein’s relativity as its basis, but no such effort went into making the decision to jettison ether; only, EXPECTATIONS were not met, so scientific theory was reconfigured.

            “It would have been quite within reason to scoff at the neutrino as an excuse to explain the missing energy in beta decay. A particle that props up the theory, that has no charge and no mass, and which nobody had yet detected? Very convenient.”

            Not really, as described above. You had a valid, empirical, reproducible experimental framework, with a verifiable, non-tautological (i.e., not like the Michelson-Morley arms shrinking just enough to mask the measurement) theoretical basis to predict the neutrino. Dark matter and energy are just two of the fudge factors added to the “evolution” of the big bang theory.

            1. You state, indirectly, that dark matter and energy are extensions of thought experiments. I fail to see how the fact that large structures in our universe do not contain enough matter to hold themselves together is simply a thought experiment; nor the fact that the universe’s expansion appears to be accelerating. There are ‘holes’ in our understanding that we have proposed solutions for, and the basic logic is quite sound. As we gather more data the ‘hole’ becomes ever more narrowly defined; one day we will perhaps be in a situation where we know exactly what must fit it, with no other options possible.

              For this reason I would not call them the ‘new ether’ since we have seen effects. If ether had been proposed because light was seen to move at a relative speed, THEN the situations would be equal (And if ether were wrong, SOMETHING would have to explain the light results.)

              The key problem with calling vacuum ‘ether’ is that a key property of the ether was it was a material through which light traveled with a relative speed. The discovery that light travels at the same speed invalidated that theory. Calling the vacuum ‘ether’ would be like calling oxygen phlogiston, in a certain light they seem similar, but they are not the same thing.

              What we see with dark matter and energy are not ‘fudge factors’, what we have recently seen is that there IS a reproducible result, whatever large scale structure we look at in the universe we see the SAME signature of ‘missing matter’, the increasing expansion of the universe is certainly observable and various experiments agree. That’s how we know it is not simply a mistake or an error.

              Possibly you do not consider this reproducible because we cannot test another universe, we cannot run the universal experiment over again. But in that case, what is left of science? Archeology, history, anything with evolution, most of astronomy, is out. Anything where you cannot put your subject matter into a test tube. I have only ever seen that criteria used by people who wish to discredit a result as ‘only a theory’ and I do not accept it.

          4. I said, “There is only one universe (some would say one multiverse), and it cannot be tested, probed, categorized (other than theoretically), etc.”

            I meant more like: there is only one universe. It cannot be compared to others, destroyed and recreated, etc., other than theoretically. Obviously we can probe the one universe.

          5. “You state, indirectly, that dark matter and energy are extensions of thought experiments…”

            I was referring to the development of GRT through the use of thought experiments.

            “I fail to see how the fact that large structures in our universe do not contain enough matter to hold themselves together is simply a thought experiment; nor the fact that the universe’s expansion appears to be accelerating. ”

            The expansion of the universe, accelerating or not, is premised on red shift being expansion. If red shift is expansion, this is the correct interpretation, but we only believe red shift is expansion. There are other explanations for red shift. We choose this interpretation mainly due to philosophical reasons, not scientific ones.

            “There are ‘holes’ in our understanding that we have proposed solutions for and the basic logic is quite sound. As we gather more data the ‘hole’ becomes ever more narrowly defined, one day we will perhaps be in a situation where we know exactly what must fit it with no other options possible.”

            We only fill holes within one chosen philosophical and theoretical framework, and usually this leads to more holes that require filling, so the framework is expanded. My contention is that the evidence for the Copernican and cosmological principles being invalidated is very strong, and the time has come to change the philosophical and theoretical framework. This will change the manner in which we “fill holes”.

            “For this reason I would not call them the ‘new ether’ since we have seen effects. If ether had been proposed because light was seen to move at a relative speed, THEN the situations would be equal (And if ether were wrong, SOMETHING would have to explain the light results.)”

            First, I call it the “new aether” as an analogy to the original aether, in that dark matter/energy are unseen and assumed to exist, and are required for the current framework to work. I also contend that jettisoning the aether in the early 20th century was premature, especially in light of what QM teaches us (space is not empty), plus the fact that MM, and later Dayton Miller, and others have measured signals in the aether, just not the ones the current philosophical and theoretical framework expects.

            “The key problem with calling vacuum ‘ether’ is that a key property of the ether was it was a material through which light traveled with a relative speed. The discovery that light travels at the same speed invalidated that theory.”

            That is not a discovery, it is one of the assumptions of special relativity, and not even a requirement of general relativity.

            ” Calling the vacuum ‘ether’ would be like calling oxygen phlogiston, in a certain light they seem similar, but they are not the same thing.”

            I agree to some extent, but as explained above, not completely.

            “What we see with dark matter and energy are not ‘fudge factors’, what we have recently seen is that there IS a reproducible result, whatever large scale structure we look at in the universe we see the SAME signature of ‘missing matter’, the increasing expansion of the universe is certainly observable and various experiments agree. That’s how we know it is not simply a mistake or an error.”

            No. We see the holes in the current philosophical and theoretical framework. We know what we want to see, but don’t, so invent fudge factors to make it so. Just like MM: they wanted to see 30 km/s, but saw 4 km/s, so the aether was jettisoned.

            “Possibly you do not consider this reproducible because we cannot test another universe, we cannot run the universal experiment over again. But in that case, what is left of science? Archeology, history, anything with evolution, most of astronomy, is out. Anything where you cannot put your subject matter into a test tube. I have only ever seen that criteria used by people who wish to discredit a result as ‘only a theory’ and I do not accept it.”

            Agreed. Cosmology, evolutionary theory, etc. are not in the same class of empirical science as chemistry, particle physics, etc., because we cannot make new universes (except mathematically), etc. It does not mean that the science is out, but it is more limited.

  6. Why is the elementary particle rest mass spectrum discrete, rather than continuous? Specifically, why aren’t there electron-like particles with the same electric charge as the electron, but with slightly more or slightly less rest mass than the electron, or half the rest mass of the electron, or three times the rest mass of the electron? Ed Molishever

  7. Carlmott, Tienzen,

    I agree with the fact that it is all about symmetries and symmetry breakdowns.

    Just to name a very simple case of symmetry breakdown that has macroscopic consequences (so, we can experience it on a daily basis) but is completely quantum based: Entropy and time reversal.

    From a purely mathematical standpoint, we can verify that many equations that describe nature present a time symmetry (the variable time as a generalized dimension supports continuous translations) and the corresponding conservation law is the conservation of energy.

    But we know that our universe behaves in such a way that time is clearly directional (it only points towards the future, and does not allow “moving backwards”), which means that the symmetry is actually broken, and this is directly related to Entropy, non conservative fields and forces operating at the heart of real physical processes, that happen to be irreversible processes because of the net effect of non conservative fields.

    Energy is neither created (no new energy is created) nor destroyed, but the universe evolves from a more ordered state to a less ordered state, and in the less ordered state, energy is not as available or as useful as it was in a more ordered state.

    The evolution towards a less ordered state can be directly traced to the expansion of our universe: expansion == monotonically increasing volume == monotonically increasing number of statistical mechanics’ cells == monotonically increasing number of possible microscopic configurations that present the same macroscopic outcome.

    Kind regards, GEN

    1. Gastón E. Nusimovich: “Energy is neither created (no new energy is created) nor destroyed, … .”

      Your statement is so-called “established” knowledge which is, indeed, a gadget truth, supported by all known gadget data. But it need not be the final truth. A gadget truth not only improves our knowledge on the one hand but is also a limitation on our chance to know the whole truth, which often resides beyond our gadget capability.

      The directionality of time is again a gadget truth, including our “sense” gadget (the brain). But theoretically, time must have the “highest” symmetry, which goes way beyond the simple symmetry of (past and future). However, discussing this time-symmetry will get into a new particle theory, and thus I will not do it here. But I can talk about the entropy directionality, which is much simpler than the time’s. Entropy was originally defined in Thermodynamics, mainly with a mathematical construct, and is a function of temperature. It is then associated with order/disorder. In my view, entropy is the accounting of “action”. When we move A to point 1, it is action one. When we move A back to the original spot, it is action two. While the physical state is not changed, the action count has gone from 0 to 2. As soon as you “act”, the count increases; that is, the entropy is forever increasing.

      The major challenge that physics faces today is not physics per se but is about physics epistemology. What have we chosen to do? I can tell a story here about this AMS data.

      The sugar daddy (Standard Model) has twins (matter / anti-matter), having equal status and equal rights. Yet, baryogenesis (a mysterious edict from the Almighty) expelled the anti-one from the Garden of Eden to somewhere (not knowing where). Then, we discovered a symmetry partner of the sugar daddy, SUSY, which resides in total darkness (commonly known as the hell). With their deep hatred of themselves, the ghosts in hell always self-annihilate. Then, AMS in space above the Earth caught some tails of those anti-ones.

      The above is a precise description of the current status of mainstream physics. Yet, the Pope will say, “I told you so. It was written, and it is so.” Modern physics simply plagiarized the old Bible story.

      But, I have a better story. The sugar daddy (Standard Model) has twins (matter / anti-matter), having equal status and equal rights before birth. They are given the dominion of the Garden of Eden. One brother (matter) takes the throne; the other (anti-matter) is the servant. The two form a chip-ball symmetry.

      Now, the battle line has been drawn; the plagiarized Bible story vs the King/servant one.

      After the CERN news release fiasco, we can relax a bit with these two stories.

    2. But don’t closed systems of a fixed volume also experience an increase in entropy? So how is entropy tied to the expansion of our universe?

      1. Defining entropy as the count of actions is my own definition. As far as I know, it is not shared by the establishment.

        With this definition, only a dead system (without any action at all) will have zero entropy. Otherwise, entropy is always on the increase, regardless of whether a system is open or closed.

        Yet, an ordered system will have fewer actions than a chaotic one. Thus, when a system goes in the direction of order, its new actions will be fewer than before. That is, the rate of new action is decreasing. So, we often say that a system getting more orderly has a decreasing entropy, but the fact is that only the rate of increase is decreasing. The total entropy is always increasing.

        Most importantly, the directionality of entropy is conceptually different from that of time.

      2. Entropy is defined for a closed system. The universe as a whole is a closed system. Entropy increases in a closed system when any irreversible process happens within that system.

        Tienzen, regarding the count of action as a measure of entropy, your example of increased entropy after a round trip from A to B and then from B to A needs a non conservative field for it to make sense, as the total work of the non conservative force (related to the non conservative field) in that round trip accounts for the decrease in total energy of the system.

        If the same system is in a conservative field with only conservative forces acting upon, the net work of the forces on the round trip will be zero.

        If we apply Gauss’s Theorem to each one of these two systems (one with non conservative fields, the other with conservative fields) it will become pretty clear and transparent what’s the net result of non conservative fields acting upon a system of bodies, and quoting from Wikipedia “Intuitively, it states that the sum of all sources minus the sum of all sinks gives the net flow out of a region”.

        Non conservative vector fields operate as sinks, and non conservative vector fields are a consequence of entropy.

        1. It is interesting how this analysis, which pertains mostly to classical non-relativistic mechanics, is in a way related to how Emmy Noether ended up working on her theorems.

          In 1915, Einstein went to Gottingen to give a few lectures there on his (not yet completed) work on General Relativity.

          The honchos at that time at Gottingen were Hilbert and Klein, but Emmy was there too, already somebody to be reckoned with.

          Hilbert realized the math problems that Einstein had with his differential geometry equations, and after Einstein left back to Berlin, Hilbert asked Emmy to figure out those issues and solve them, which she did, under Hilbert’s tutelage (at the time, she was working there without either a post or pay, because there was no precedent for a female faculty member in Germany).

          One of the intriguing conclusions that emerged from that analysis by Emmy was the disconcerting fact that in the GR equations, total mechanical energy is not conserved as it is in classical Newtonian mechanics (and its derivatives, Lagrange’s generalization, Hamilton’s generalization, etc.).

          Emmy understood that this intriguing conclusion needed to be further analyzed, and so she did. One consequence of this further analysis is her celebrated theorems, which grew out of that work.

          Kind regards, GEN

            1. I think that the concept of CP violation is intrinsically linked to the relativity of spacetime, as measured by time dilation and the contraction of space. I think that the speed of light is derived from CP or PT symmetry breaking, which leads us to the invariance of the Lorentz transformation; this makes space and time variable, so that c can appear as constant, and therefore the speed of light is the limit of the universe.

  8. This article made me go back and take another look at the item on the possible gamma ray excess from near the galactic core that Matt linked to in the comments. The 130 GeV figure is remarkably close to the mass of the probably-Higgs boson. I’m no physicist; might this be more than a coincidence?

  9. Hi Matt. I looked at the original AMS paper because I just didn’t understand how the experimental data indicates annihilation or decay of dark matter. Now the most important parameter in their model is Es which appears as exp(-E/Es) and represents part of the source term. Now look at the estimated value for 1/Es: 1/Es=0.0013 +- 0.0007. It is not even outside 2 sigma from zero! For a completely unknown phenomenon! This is not a two sigma result as for the Higgs decaying to some particles where you already expect a specific result. How can they publish a less than 2 sigma result for an unknown and unlikely mechanism and speculate about it? Am I missing something here? As always thanks in advance!

    1. In fact, they did not publish or claim any N-sigma result. Rather, they published DATA on positron fraction vs energy. A measurement!

      1. From the CERN site: “Samuel Ting. “Over the coming months, AMS will be able to tell us conclusively whether these positrons are a signal for dark matter, or whether they have some other origin.”

      2. and from the conclusion of the paper: “The agreement between the data and the model shows that the positron fraction spectrum is consistent with e+- fluxes, each of which is the sum of its diffuse spectrum and a single common power law source.”

        1. Yes: the “standard” positron fraction (from the ratio of “diffuse” spectra) is expected to decrease with energy… while the presence of an extra (= additional) source of e+- can explain the observed trend. That’s obvious.

          1. Well, what doesn’t seem so obvious in my opinion (given the less than two sigma result) is the “single common power law source”

          2. To check, I made a plot of the positron fraction with and without the exp(-E/Es), and indeed it made little difference, in correspondence with the large error on Es. I assume that this exponential is essential for the dark matter hypothesis. Maybe I’m wrong here?

          3. An exp-like cut-off is not necessarily DM: it is just the simplest way to parametrize the maximum energy provided by an unknown source of e+e-. For astrophysical sources, an exponential seems to be the case. For DM, the signal may drop even much more sharply than exp(-E/Es).

          4. Hello Tom. “For DM, the signal may drop even much more sharply than exp(-E/Es)”: that’s what I thought too. Thanks for your clarification!
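
            For anyone who wants to repeat this check: below is a minimal sketch, assuming the model structure described in this thread (each flux a “diffuse” power law plus a single common power-law source with cut-off exp(-E/Es)). Only 1/Es = 0.0013 ± 0.0007 /GeV is taken from the discussion above; all the other indices and normalizations are loose, invented values.

```python
import numpy as np

# Minimal AMS-style parametrization, as described in this thread.
# Only inv_Es (and its uncertainty) comes from the discussion above;
# everything else is an invented illustration.
C_em, g_em = 1.0,   3.28   # "diffuse" electrons (normalization arbitrary)
C_ep, g_ep = 0.02,  3.9    # "diffuse" positrons (softer, falls faster)
C_s,  g_s  = 0.005, 2.6    # single common source, shared by e+ and e-

def fraction(E, inv_Es):
    """Positron fraction e+/(e+ + e-) for a given cut-off 1/Es (1/GeV)."""
    source  = C_s * E**-g_s * np.exp(-E * inv_Es)
    e_plus  = C_ep * E**-g_ep + source
    e_minus = C_em * E**-g_em + source
    return e_plus / (e_plus + e_minus)

E = np.array([10.0, 100.0, 350.0])  # GeV
for inv_Es, label in [(0.0,    "no cut-off"),
                      (0.0013, "central 1/Es"),
                      (0.0020, "1/Es + 1 sigma")]:
    vals = ", ".join(f"{f:.3f}" for f in fraction(E, inv_Es))
    print(f"{label:>14}: fraction at 10 / 100 / 350 GeV = {vals}")
```

            The cut-off only bites near the top of the measured range, which is one way of seeing why the fit returns a value of 1/Es consistent with zero at under two sigma.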

  10. Is there any chance at all that we can prove the existence of Dark Matter?
    I am thinking: is it like Extra dimensions?
    Pakri

    1. This is part of what the AMS is doing. The problem is that it and other searches are working by narrowing the field, reducing the number of options. What many wish for is a single, unambiguous discovery that immediately identifies one alternative.

      For DM something like observing a DM particle would do the trick, which would likely be an earth-based effort using a particle accelerator. Otherwise, we just have to be patient.

  11. Could it be argued that so far, the most likely sources for positron excess are DM self-annihilation and pulsar emissions, but that we still can’t rule out some other not so well understood source?

    1. That is the way of most things in science; there’s always the possibility of a totally unexpected phenomenon being responsible for the results. Sometimes this possibility has even proved to be the case. But usually the focus is on the most likely culprits until something rules them out. As it has been said, “When you rule out the impossible, whatever remains, however improbable, is the truth.”

  12. Thus far, most physics blogs have centered their discussion of the AMS data on the issue of the “source” of the positron excess. Yet, only two sources are mentioned, the dark matter self-annihilation or the pulsar emissions; that is, it must be one of the two, and no other choice. Furthermore, if the future data confirms that it comes from the dark matter, then the dark matter is “discovered”, with its name already known.

    By all means, dark matter is already “discovered” in many gadget data; the galaxy rotation curve plots and the Planck CMB data, etc. The AMS data plot does not have any higher value than those other plots on discovering the dark matter. In fact, this first AMS data has absolutely nothing to do with the dark matter, as the high energy data was missing this time around.

    However, this AMS data is very important, as it shows that the “anti-matter (not dark matter)” is not just a theoretical necessity for particle theory but is, in fact, a constituent part of this real universe, which looks as if it is dominated by matter only. The asymmetry of anti-matter is not a “historical” event (as only a trigger) but is a constituent part of now. That is, the matter/anti-matter symmetry breaking is not just a “process” but is a permanent “framework”. The difference between the two (process vs framework) is very significant. Breaking a glass by an impact (similar to the SSB) is a process. That glass is always broken is a framework.

    In a symmetry framework, the symmetry partners need not be symmetrical (equal in size or in number). A ball has 3-dimensional symmetry. When a small chunk is chipped off, that very small chunk is the symmetry partner of the ball, as the two form the ball-symmetry. This chip-ball symmetry model is what the AMS data is all about. Thus, a correct particle theory must innately encompass this chip-ball symmetry model (innately chip-asymmetry). For dark matter, the Planck data should take precedence.

    1. I would note that we know anti matter of a sort makes up a good portion of our universe; protons and neutrons contain a soup of quarks and antiquarks (and gluons), constantly interconverting and making up most of a nucleon’s mass.

      1. The anti-matter in a nucleon is countered (or balanced) with matter, and there is no “net” existence of the anti-matter in a nucleon; that is, the matter/anti-matter is in a balanced symmetry. On the other hand, the AMS data shows that “free” anti-matter is a constituent part of the universe, which means the matter/anti-matter asymmetry is an innate part of nature. Thus, a particle theory which views the matter/anti-matter only in a balanced symmetry, with baryogenesis only a historical event, cannot be correct according to this AMS data.

        1. I thought that the positrons that were being detected by the AMS were created in pairs with electrons, either via high energy particle collisions or via the decay of dark matter particles. As far as I am aware it does not indicate that this ‘free’ antimatter composes any sort of percentage of the mass in the universe or is created in any way that is not balanced by normal matter.

          1. There are many paths that create anti-matter, and even though the creation of a particle-antiparticle pair is the reason in many situations, there are other paths where the anti-particle is created because of a conservation law (like, say, the conservation of electric charge, just to give an example).

          2. I meant to say: anti-matter is always created because of a conservation law, and there are many conservation laws to consider, so the creation of a particle-antiparticle pair is not the only kind of conservation-law process that may apply when anti-matter is created.

            My original comment was somewhat wrong in the sense that it could imply that the creation of a particle-antiparticle pair is not related to a conservation law.

            Following Emmy Noether’s theorems, for any dimension in a generalized space (space, time, rotation) that is symmetric (that supports continuous translations), there is a specific conservation law related to that symmetry, and vice versa.

            These theorems are deduced from the equations that generalize the laws of motion (originally, Lagrangians) from Newtonian mechanics.

            Even though these theorems apply certain technical math tools to equations, what they really imply and ultimately prove is that it is nature that behaves this way.

            Something similar could be said about spontaneous symmetry breaking: even though there is a lot of math, it is nature that behaves that way.

            1. There is strong evidence of the violation of CP symmetry in several types of experiments with mesons and quarks. There is strong evidence of the violation of symmetry under time reversal (T). Then the universe obeys a law in which there exist two torsions, one left-handed and the other right-handed, given by the connection of space and time in the spacetime continuum. The antiparticles are generated by the violation of PT, which would result in the asymmetry of matter, with antiparticles as products of particles; then antimatter could exist in the universe in relativistic time.

            2. Very true; however, these are quite rare. The most common route to positron production, for example, is proton decay (either during fusion or radioactive decay), in which case there already exists an electron ‘paired’ with that proton (there is no discernible excess of protons or electrons in the universe). As far as I am aware there is no natural process that would allow the creation of a significant amount of antimatter ‘free’ from normal matter.

              The conservation laws to which you refer have resulted (at least I assume) in our universe being rather ‘balanced’ on the whole, without an excess of electric charge, or color and so on.

          3. Kudzu: “As far as I am aware it does not indicate that this ‘free’ antimatter composes any sort of percentage of the mass in the universe or is created in any way that is not balanced by normal matter.”

            Gastón E. Nusimovich: “I meant to say: anti-matter is always created because of a conservation law, …”

            carlmott5520: “… the antiparticles are generated by the violation of PT, which would result in the asymmetry of matter, with antiparticles as products of particles …”

            You all are correct.
            Conservation law is about symmetry. The entire issue is about symmetry. With a symmetry, there must be a symmetry breaking. The issue is what the top-most symmetry is for nature and how that symmetry is broken.

            In the current paradigm, it consists of the following.
            a. Standard model — a ball-like symmetry, the symmetry partners are equal.
            b. Baryogenesis — result of a mysterious symmetry breaking.
            c. SUSY (with s-particle) — a symmetry partner of the SM ball, with the mission to account for the dark matter. The self-annihilation of that dark matter accounts for the AMS data.

            This is a story of a one-ball-like symmetry (with the highest degree of symmetry) becoming a dumbbell (with a finite number of degrees of symmetry). Then, one end of this dumbbell chips off to break this dumbbell symmetry further. This is a good bedtime story; complicated, boring and dragging. SUSY is not a higher symmetry but is a symmetry killer.

            In fact, a cliff at the high end (on the right) in the AMS data does not give any support for SUSY, as many other theories are much better candidates for the dark matter. Any dark matter theory must account for the Planck data.

            Thus, if a particle theory encompasses a chip-ball-like symmetry, it will be a much better theory for the AMS data.

  13. ” At still higher energies the relative uncertainty will get worse: a particle’s electric charge is determined by measuring which direction the particle’s trajectory bends in a magnetic field, but the higher the energy, the straighter the trajectory becomes, so the challenge of determining the bending direction becomes greater. Can AMS meet this challenge? ”

    This may be incorrect. Lubos wrote:
    ( http://motls.blogspot.com/2013/04/ams-steep-drop-is-there.html#comment-853217650 )
    “Also, could you be more specific about the reasons why you say that AMS can’t distinguish positrons from electrons at high energies? Didn’t you confuse it with Fermi (which was designed to detect neutral particles) that has to use magnetic fields? AMS has ECAL that seems perfectly built to distinguish them, see http://www.ams02.org/what-is-ams/tecnology/ecal/ ”

    1. I (the particle physicist) am correct, Lubos (the string theorist) is confused. Are you surprised?

      An ECAL is designed to cause an electron or positron or photon to initiate an electromagnetic shower, whereby all its energy gets converted into a shower of many low-energy electrons, positrons and photons, which then can cause a charge pulse proportional to the energy of the incoming particle. The shower does not have any measurable dependence on the electron or positron’s charge; indeed even a photon’s shower has almost the same shape. So the ECAL cannot distinguish electrons from positrons; they have exactly the same behavior.

      To measure the charge, you use the tracker and a magnetic field, because positively and negatively charged objects bend in opposite directions.

      What the ECAL allows you to do is separate electrons and positrons from anti-protons and protons, because protons and anti-protons shower in an ECAL much later and less efficiently. If you read the AMS description of the ECAL carefully, that’s what they’re saying. They’re assuming, in that description, that you’ve already used the magnetic field and the tracker to measure the particle’s charge, so electrons and anti-protons have already been distinguished from positively charged positrons and protons before the particle arrives at the ECAL.

      This is what AMS writes (less clearly than they might have, but clearly consistent with what I’ve just told you):

      “Since a high-energy positron could have the same rigidity of a low-energy proton, they cannot be separated by a magnetic field. Positrons are very rare components of cosmic-rays. They have an abundance of 1 every 100,000 protons. We need an efficient way of separating positrons from protons. A similar argument is valid for the negative sign particles, as electrons and antiprotons. Indeed the antiprotons natural abundance is 1 every 100 electrons. ECAL is a specialized detector able to distinguish among positrons/protons and electrons/antiprotons with an identification power of one positron over 100,000 protons. ”

      Notice that the identification power is in terms of positrons per protons, not positrons per electrons… because the ECAL is not (and cannot be) used to distinguish electrons from positrons.

      1. Ok, thanks for clearing that up. It sounded like an easy mistake to make for someone not directly involved in the experiment, so I didn’t realize your vs. Lubos’ qualifications were relevant 😉
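
        A toy decision sketch of the division of labor Matt describes above. Only the logic is taken from his explanation (the tracker’s bend sign gives the charge sign; the ECAL shower shape separates electrons/positrons from protons/antiprotons); the function and its inputs are invented for illustration.

```python
def classify(bend_sign, showers_early):
    """Toy particle ID combining the two measurements described above.

    bend_sign:     +1 or -1, the curvature direction measured by the
                   tracker in the magnetic field (the charge sign).
    showers_early: True for the early, compact electromagnetic showers
                   of e+/e-; False for the late, inefficient showers
                   typical of protons and antiprotons.
    """
    if showers_early:
        return "positron" if bend_sign > 0 else "electron"
    return "proton" if bend_sign > 0 else "antiproton"

# The ECAL alone cannot tell a positron from an electron (their showers
# are essentially identical); only the bend sign distinguishes them.
print(classify(+1, True), classify(-1, True))    # positron electron
print(classify(+1, False), classify(-1, False))  # proton antiproton
```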

  14. Now what about that contradiction between the SPT and Planck results?
    When are SMoC people going to entertain the possibility that a paradigm shift is needed?
    How did ESA people dare to decide that the cosmos IS xBARYONIC, yDM, zDE, as if it were final fact, while Planck does NOT prove beyond a shred of doubt that this is the case? This is mere theory-laden interpretation… not final fact.
    How can we trust professor X over professor Z when we need to know the facts beforehand?

  15. Dear Matt Strassler,

    Please excuse me, for I am by my own admission a science novice (but secret observer & enthusiast). Could you or some other kind person take pity on my lack of knowledge and explain to me one thing!

    When you mentioned earlier that: ‘Perhaps the most interesting aspect is that the rate of increase of the positron fraction appears to be slowing down gradually as the energy increases. But this doesn’t have any obvious meaning, at least not yet.’

    Could this relate to the fact that:

    ‘Supersymmetry / Supersymmetric theories predict a cut-off at higher energies above the mass range of dark matter particles, and this has not yet been observed’ (http://www.astronomynow.com/news/n1304/03darkmatter/#.UV3jHcVwZdg)

    Please excuse me if I am completely wrong 🙂

    Kindly,

    Kit

    1. If matter weren’t asymmetric, and antiparticles weren’t a substratum of matter, the total mass would equal zero; only energy would exist, due to the simple annihilation of matter and antimatter. Therefore there exists a greater quantity of particles than antiparticles, which are bundled strongly and locally as energy. These are “accidents” of spacetime, through the violation of CP with time, with “two dimensions” curving the space in 4-dimensional manifolds.
      The 4-dimensional manifolds discovered by S. Donaldson imply a richness of metrics and topological structures not found in manifolds of dimension larger than 4.
      There are exotic structures, not totally smooth, in the 4-dimensional manifolds, due to the existence of the time dimension deforming the spacetime structures.

        1. This is a good question: it is bound fermions that have antisymmetric wave functions.

          What about free fermions, that is, fermions not bound by a potential? Do they have an antisymmetric wave function?

          Besides, as I remember from college, antisymmetric (fermions) or symmetric (bosons) wave functions appear with pairs of particles of the same kind in problems of complete exchange.

          ? ! ?

          Kind regards, GEN

          1. Any two fermions of identical type have to be antisymmetric when you exchange them. It doesn’t matter whether they are bound to each other or inside the same bound state.

          2. Prof Strassler,

            Thanks a lot for answering my question: if I’m not mistaken, the antisymmetric “tendency” of fermions is mainly dictated by the half-integer value of the spin that they have and the “exchange” condition, which means that they have to be close enough to each other and have the same values for the other quantum numbers, that is, n, m and l.
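            Since this sub-thread is about the exchange rule itself, here is a self-contained toy sketch (my own illustration, in Python, with made-up one-dimensional orbitals) of a two-fermion Slater-determinant wavefunction: swapping the two arguments flips the sign, and the wavefunction vanishes when the two fermions sit at the same point, which is the Pauli exclusion principle in miniature.

```python
import math

# Toy sketch of exchange antisymmetry for two identical fermions.
# phi_a and phi_b are made-up one-dimensional orbitals, for illustration.

def phi_a(x):
    return math.exp(-x ** 2)        # toy "ground state" orbital

def phi_b(x):
    return x * math.exp(-x ** 2)    # toy "excited state" orbital

def psi(x1, x2):
    """Two-fermion wavefunction built as a 2x2 Slater determinant."""
    return (phi_a(x1) * phi_b(x2) - phi_b(x1) * phi_a(x2)) / math.sqrt(2)

x1, x2 = 0.3, 1.1
print(f"psi(x1, x2) = {psi(x1, x2):+.6f}")
print(f"psi(x2, x1) = {psi(x2, x1):+.6f}  (same magnitude, opposite sign)")
print(f"psi(x1, x1) = {psi(x1, x1):+.6f}  (vanishes: Pauli exclusion)")
```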

  16. Re.: the “spin” of the first AMS results announcement: Please allow me a completely non-scientific, uninformed, personal opinionated “spin” statement of my own:
    You have an experiment that has cost over $2B, that was years late, and that even so had to have its main, crucial super-conducting magnet replaced at almost the last minute by a non-superconducting one; that literally required an act of Congress to get it launched, to a low-earth orbit (when something much farther from Earth would surely be preferable), to be attached to a space station that you don’t control. Now you have to have this experiment operate for many years longer than originally planned (up to 10!), to be able to collect sufficient data with sufficient precision to produce meaningful results. In this day of sequestration and budget cutting, you better make the powers that control the purse strings think they are getting, and will continue to get, something for their money. You better make your early results sound as interesting and enticing as you possibly can, and worthy of follow-on studies, while not going so far overboard as to alienate your science colleagues so much that they deride your conclusions (which could also endanger further funding.) Talking about “hints” of supersymmetry and dark matter that nobody else has found does the trick nicely, even if the data so far does not fully justify it. Now we just have to wait and see if anything actually comes out of all this. In this case, “Patience” requires continued funding, of both AMS and the ISS.

    1. 1. I don’t understand, on scientific grounds, why something farther than low-earth orbit should be preferable. 2. The AMS mission duration depends only on the ISS lifetime: in principle, even >20 years. 3. The mission is already funded; you don’t have to present results to get more money (if anything, you may want to justify the money already spent). 4. Making results sound interesting/exciting is a common practice in science: who doesn’t??
      In my opinion, the relevant point from this data is that the AMS experiment is working well, as expected, and exceeds in sensitivity any other previous experiment. This was planned but not obvious, given the complexity of the project.

      1. Tom,
        1. You’ve got radiation belts, Earth’s magnetic field, and scatter from Earth atmosphere/surface/near-surface all adding & skewing background and complicating distinguishing the actual signals. Would seemingly be better to be away from all that, if possible.
        2. The original plan was to return the device to Earth, off the space station, after 5 years. (I think it was 5.) Now it’s tied to the space station, and dependent on ISS operation (or its termination).
        3. Given the major change/downgrade in magnets, it’s not clear whether there will be adequate data to meet original mission objectives within the current project’s lifespan. Some estimates were that it now might take as long as 17 years. It’s not currently funded for that. And research funds can be/are withdrawn/redirected even from “funded” projects when dollars/euros get tight.
        4. I was speculating/giving a rationale for why “making results seem…interesting/exciting” was especially important for this mission, given the above — and why they may have gone a bit beyond even the normal hype — in light of several bloggers (including Matt) questioning why the AMS team had made claims they couldn’t (yet) fully support. Not arguing that the data is bad, but that “being patient” to get decent results will likely take a lot longer than originally planned, and will need continuous funding to make it happen. So the (politically savvy) science team has to do everything it can to be sure nothing gets in the way — including stretching the importance of early results as far as they possibly can.

        1. Hi. The ISS is at 400 km altitude, so there is no atmospheric background affecting the data on galactic cosmic rays. Additionally, the study of non-galactic cosmic rays (albedo particles, geomagnetically trapped particles…) is one of the experiment’s main physics goals (especially for space technology studies), which is possible thanks to the low-earth-orbit environment…

  17. Good stuff Matt.

    You said “if dark matter particles are their own anti-particles, as is also true of photons, gluons, Z particles and Higgs particles), they may annihilate into other, more familiar particles”. Should there be a “not” in there? Alternatively if not, would it be better to say pair production rather than annihilation?

    PS: I would have bolded “If dark matter is indeed made from particles” myself. We don’t actually know that dark/vacuum energy is homogeneous.

  18. Well, Professor Strassler, that was not nice at all. You know very well who Professor Dr. Pavel Kroupa of Germany is… He is not just “Mr. Kroupa,” as per your comment… Maybe this is another proof of the effect of mainstream prejudice practiced in science, to the degree of down-grading the opposing views!!

    1. As I said, you should read both arguments, for and against, and consider them carefully. Mainstream “prejudice”, as you call it, is usually based on good counter-arguments. You’d better read the counter-arguments before getting too excited about what Professor Kroupa has to say.

      (As for titles: “Mr”, “Doctor”, “Professor” — this is silliness. If an argument is right it doesn’t matter what a person’s title is. Some of Einstein’s most important work was done before he had his doctorate; same for ‘t Hooft and plenty of others. And lots of professors write lots of garbage. Unfortunately you have to learn which professors you can trust.)

  19. Thanks for that Matt, I wondered why the press release brought SUSY into the discussion at all. Excellent blog, by the way.

  20. No Matt, I’m not citing the (controversial) claims of that paper. Just look at its introduction: it gives a very simple outline of the “standard model” prediction, and it explains why the positron fraction is expected to decrease with energy (according to the standard scenario). This has nothing to do with the author’s claims. The other paper I mentioned (0809.5268) gives many more details, within standard-framework propagation calculations, but it is a bit more complicated. More on this subject is in 0701517, a nice review.

    The incontrovertible fact is that we expect(ed) a decreasing positron fraction because of CR diffusion! This has nothing to do with the kinematics of the collisions p+ISM -> (…) -> e+ that you mentioned. For instance, the B/C ratio also decreases with energy (for the same reason: diffusion), although in the reaction C+ISM -> B the kinetic energy per nucleon is conserved…

  21. I don’t think I know enough to even ask intelligent questions. I will read around some and get back to you. Many thanks.

  22. Ian: I never said what you mention. My point is: where is the end of speculative fine-tuning of models, such that it becomes impossible to falsify them?

    1. This depends on what we discover. No one has to speculatively fine-tune the geography of the earth any longer, even though someone in 1450 might have asked the same question as to whether we’d ever really know the correct map of the planet. Be patient. Knowledge does not come overnight.

  23. V nice post, many thanks. An intriguing possible clue, if not a smoking gun; I’ll be very interested to see the future results.
    The press release seems to imply that if dark matter is indeed responsible, only SUSY dark matter could account for the signal, which I haven’t heard before (and which offers the possibility of a double whammy). Is that correct, or is it just the way it’s phrased?

    1. In fact, if dark matter is responsible, it is very unlikely that standard supersymmetric (SUSY) dark matter could account for such a large signal; you’d need at least one extra force added on. [See http://arxiv.org/abs/0810.0713 for a classic example, which followed the PAMELA results almost immediately (it was also based on an ephemeral result from ATIC); the basic reasoning of why the signal is too large and of the wrong form to be standard SUSY still holds.] And certainly this data will not be interpretable as evidence for or against supersymmetry.

      It is very easy to make models of dark matter that have nothing to do with supersymmetry at all. [In fact, you can bludgeon supersymmetry as a theory, take out most of the superpartner particles, leave in one or two of them, and get a perfectly nice theory of dark matter with no remnant of supersymmetry in the particle spectrum or in the equations.] The link that you read about, in the press release and in many scientific talks and media articles, between supersymmetry and dark matter is basically a superficial argument, a theoretical conflation. In fact, as I often point out to my students, the theory that gives a dark matter candidate is supersymmetry plus a symmetry called “R-parity”. A theory of supersymmetry without R-parity has no dark matter candidate. A theory with R-parity but without supersymmetry typically has a dark matter candidate. So it is really R-parity, not supersymmetry itself, that provides the dark matter candidate.

      The press release reads as though it was written by experimentalists who know how to talk the talk but do not understand the range of dark matter models that are actually out there in the literature. The theorists who work on dark matter all know this, however.

  24. If the data appeared to be still increasing at the previous rate, they would have announced that. It would have been something clear to announce. Instead they announced that the rate of increase had slowed or stopped and didn’t include the final bins. This suggests to me that the data (including that not announced) shows some sort of drop off and that they are collecting more data so that they can announce which hypothesis the drop off fits.

    1. It’s not impossible. But I think you are over-interpreting. Notice how large the uncertainties are in the highest two energy bins. The uncertainty in the next bin, if they included it, would be significantly larger, so big that you wouldn’t be able to tell a rapid increase from a sharp drop-off. In short, even if you are right that they think they see a drop-off, they can’t yet be sure that it’s a real effect… you can make the estimate yourself just by projecting the decrease in the number of positrons (even if the positron fraction increases, the number of positrons is decreasing — see the table in their paper) and projecting the increase in the statistical and systematic errors. Furthermore, even when they have more data, there is no way they will soon be able to tell which hypothesis (if any) the drop-off fits. Even in the next few years, the uncertainties will be far too large for an unambiguous interpretation. Be prepared to be patient. The data set will only be twice as big 2 years from now, three times as big 4 years from now, etc.

    1. I believe their data is too meager to allow for a clear interpretation; a sharp drop-off, a gradual drop-off, and no drop-off look the same when you have 40-50% uncertainties, which is what they will have in the next bin. We just have to be patient; they need to make sure they know how well they can distinguish electrons from positrons in the next higher bins, and they also need something like four times as much data. I really don’t think we’ll have anything easily interpretable anytime soon.
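    To see why patience is unavoidable here, a back-of-envelope sketch of how counting uncertainties shrink with data may help. The starting count below is hypothetical, chosen only so that the bin begins near the ~40-50% uncertainty level mentioned above; the 1/sqrt(N) scaling is the general rule for statistical errors.

```python
import math

# Back-of-envelope sketch: relative statistical uncertainty on a counted
# quantity scales as 1/sqrt(N). The starting count is hypothetical, chosen
# so the bin starts near the ~40-50% level discussed above; the growth of
# the dataset with time follows the "twice as big in 2 years" pacing.

n_now = 5  # hypothetical positron count in the next energy bin
for extra_years, scale in [(0, 1), (2, 2), (4, 3), (8, 5)]:
    n = n_now * scale
    rel = 1.0 / math.sqrt(n)
    print(f"after {extra_years} more years (dataset x{scale}): "
          f"~{n} events, statistical uncertainty ~{rel:.0%}")
```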

  25. What about Planck’s detected-clusters contradiction (re: Shaun’s “Trenches of Discovery”)?
    Where is the end of speculative model fine-tuning?

    1. aa.sh: I don’t think you can be reading Shaun’s blog very carefully if you believe he is pointing to the discrepancy of cluster numbers as evidence against dark matter. Putting arguments in other people’s mouths like that is pretty poor.

  26. According to Professor Pavel Kroupa, Planck does not definitively prove DM or DE… I believe what he said simply because there are theories that are consistent with the data while not speculating about any dark physics.

  27. I just read Professor Pavel Kroupa’s papers on (the crisis of dark matter)…
    With so many failures in the SMoC, I find it very hard to neglect the probability that MOND theories could be nearer to reality…
    The mainstream cosmologists’ bias effect may be really huge, so in reality the public has almost lost all confidence in the possibility of knowing the facts. It is a time of no solid ground to stand upon… true quicksand.

    1. What we are living in is a time of both discovery and narrowing down. You may prefer that a single experiment deliver results that immediately prove one theory, something closer to the traditional ‘eureka moment’ we like to reduce science to, but nature isn’t always so kind.

      In the last 20 years we have made so many amazing discoveries: exoplanets, exploring distant worlds, pinning down the age of the universe. In some areas all we have done is narrow down our options, but that too is progress.

  28. Reblogged this on In the Dark and commented:
    Here’s a refreshingly hard-nosed take on the recently-announced results from the Alpha Magnetic Spectrometer (which were rather excessively hyped, in my opinion…)

  29. Matt, you wrote that the positron fraction is expected to decrease because e+ have lower energy than the e- (actually protons) that were needed to produce them. No, the reason is related to cosmic-ray diffusion. Very simple arguments are given in 0903.2794 (more refined in 0809.5268): it shows how the e+/e- ratio, according to the “standard” scenario, is expected to follow the energy dependence of the diffusion coefficient in the galaxy.

    1. Well, now you’re citing a controversial paper; this paper makes claims that I know aren’t widely accepted. I can’t incorporate such things into my posts. If you have a consensus review or introduction to consensus views on the subject, by a well-known expert, I’ll be happy to read that.

  30. The Fermi data are not consistent. For a single point, 2 sigma is not serious. But here, all the Fermi data points are systematically higher by some factor. Let’s simplify: those data are wrong.

    1. That’s what I’m saying, Tom — you are neglecting the *correlated* errors. There is an overall normalization error, which moves all points together up or down.

      This happens all the time in LHC contexts. Getting the normalization of your data right depends, for example, on measuring the collision rate of the LHC. If you have the collision rate off by four percent, ALL your data points move up or down by four percent… and that is still a 1 sigma effect, independent of how many data points you have, because of the correlation between the various points. Looking at the uncertainty bars as though they are uncorrelated will lead you to the wrong conclusion regarding what the experimentalists did and did not promise you about their data.

      In short, I believe the FERMI folks, in their discussion of their uncertainties, stated that you should always have understood the FERMI data as allowing all points to move up or down together by some amount. This has to do with the way their measurement was done (it wasn’t part of FERMI’s primary goals, and the technique used was unconventional.)
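      A toy numerical version of the correlated-versus-uncorrelated point may help (my own sketch, with made-up numbers rather than FERMI’s actual error budget):

```python
# Toy illustration of correlated vs. uncorrelated errors (made-up numbers,
# not Fermi's actual error budget). Suppose all N points sit 4% above the
# prediction, with a 4% normalization uncertainty.

N_POINTS = 20
SHIFT = 0.04     # every point is 4% high
NORM_ERR = 0.04  # 4% normalization uncertainty

# Wrong: treat the normalization error as independent, point by point.
chi2_naive = N_POINTS * (SHIFT / NORM_ERR) ** 2
print(f"naive chi^2 = {chi2_naive:.0f} for {N_POINTS} points "
      "(looks like a huge discrepancy)")

# Right(er): one shared nuisance parameter rescales all points together,
# with a Gaussian prior of width NORM_ERR. Setting it equal to SHIFT
# absorbs the whole offset at a cost of (SHIFT/NORM_ERR)^2 = 1 unit of
# chi^2: a 1-sigma effect, no matter how many points there are.
chi2_correlated = (SHIFT / NORM_ERR) ** 2
print(f"chi^2 with a correlated normalization = {chi2_correlated:.0f} (1 sigma)")
```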

  31. An interesting summary of both current and previous results. Looking forward to more data from AMS.

  32. Hi Matt,
    AMS should be the Alpha Magnetic Spectrometer. Does it have any muon-detection capability?

  33. If dark matter has only gravitational interactions, the cross-section for annihilation production of an electron-positron pair will be extremely small, perhaps unobservable! Perhaps weak interaction is not ruled out. Is that right? What do you think the order of magnitude of the cross-section would be?

    1. If dark matter has only gravitational interactions, then annihilation will almost never occur. If dark matter has other interactions, but nothing except gravitational interactions with ordinary matter, then even if annihilations occur, electrons and positrons won’t be produced.

      Weak interaction cross-sections are somewhat too small to produce the number of electrons and positrons that are seen by PAMELA, FERMI and AMS, at least without generating other effects that should have been observed. So probably some enhancement of the cross-section would be necessary — assuming the excess positrons observed by these experiments are due to dark matter, which is not certain.

  34. When you say: “Despite what you may read, we are no closer to finding dark matter than we were last week. Any claims to the contrary are due to scientists spinning their results (and to reporters who are being spun)”, I very much like your sincere posture! It’s a rare gesture. Congratulations.

    1. I’m not the only one; also check out Resonaances blog. I suspect Sean Carroll at “Preposterous Universe” has the same opinion. And on my Facebook page where I chat with scientists, everyone says the same thing.

  35. Agreed: the Fermi data were ignored by the media; but they are NOT consistent with the new AMS data. I think it is fair to say that the Fermi measurement is heavily affected by undetected systematics (or, in other words, it is “wrong”).

    1. Within the error budget, they agree with PAMELA and AMS within two standard deviations. That is, they are not wrong; I believe you are not accounting properly for their correlated uncertainties, within which they do agree.

  36. Yes Matt, on point (2) I meant diffusion, which is known to steepen the cosmic-ray spectra. Propagation effects are the same for e+ and e-, but the electrons come directly from some primary sources (say, SNRs), so their initial spectrum is the SNR source spectrum. By contrast, positrons come from collisions of equilibrium protons which, in turn, have already experienced diffusion processes. Thus, the e+ “source spectrum” is steeper than the e- source spectrum… and the reason is diffusion.
    That’s why the positron fraction, like the B/C ratio, is expected to be decreasing with energy.
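    For readers following this exchange, the power-law bookkeeping behind the diffusion argument can be made explicit. The sketch below is a cartoon with assumed indices, not a propagation calculation; the point is simply that secondaries inherit one extra power of softening from their already-propagated parents.

```python
# Cartoon power-law bookkeeping for the diffusion argument (energy losses
# ignored; the spectral indices are assumed, illustrative values).
#   - diffusive escape softens any species' spectrum by one power of delta
#   - secondary positrons are made by protons that were ALREADY softened
#     once, and are then softened again while they propagate themselves.

GAMMA = 2.2  # assumed common injection index for protons and electrons
DELTA = 0.5  # assumed diffusion-coefficient index, D(E) ~ E^delta

electron_index = GAMMA + DELTA             # primaries: softened once
positron_index = (GAMMA + DELTA) + DELTA   # secondaries: softened twice

print(f"equilibrium e- spectrum ~ E^-{electron_index:.1f}")
print(f"equilibrium e+ spectrum ~ E^-{positron_index:.1f}")
print(f"positron fraction ~ E^-{positron_index - electron_index:.1f}, "
      "falling with energy and tracking D(E), just like B/C")
```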

  37. The AMS result looks very exciting. A plateau at 50 GeV to 80 GeV, a small peak at 90 GeV and a big peak at 125.(6) GeV. Looking at the graph I would tend to disregard the PAMELA data at >80 GeV, and the error bars from FERMI give a feeling of shooting in the dark. It is early days, but what can you conjecture?

    1. You are seeing all sorts of structures that aren’t present. If you look at the AMS paper, they claim to see no such structures. Why not? Because when you have as much data as AMS has right now, statistical fluctuations will generate fake structures of the size you claim to see.

      The reason scientists employ statistics methods is precisely so they can try to distinguish fake structures that are due to not having enough data from real structures. According to AMS (and consistent with what my eyes see), there is no structure in the data that is statistically significant other than the overall rising positron fraction and a tendency for it to flatten out at high energy.

      There is no reason to disregard PAMELA and FERMI, which have large uncertainties, but are certainly not shooting in the dark. And as you see, PAMELA especially got the right answer; their data is perfectly consistent, within its quoted uncertainties, with AMS.
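      If you want to convince yourself how easily limited statistics manufacture apparent structure, here is a small Monte Carlo sketch with hypothetical bin counts (nothing to do with AMS’s actual numbers):

```python
import math
import random

# Monte Carlo sketch: Poisson fluctuations around a perfectly smooth (here,
# flat) expectation routinely produce bumps and dips that look like
# structure. Bin counts are hypothetical, not AMS's data.

random.seed(1)
N_BINS, MEAN = 15, 25  # hypothetical number of bins and expected counts/bin

def poisson(mu):
    """Knuth's algorithm; fine for modest mu."""
    threshold = math.exp(-mu)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

counts = [poisson(MEAN) for _ in range(N_BINS)]
pulls = [(c - MEAN) / math.sqrt(MEAN) for c in counts]
print("counts per bin:", counts)
print(f"biggest fake 'peak': {max(pulls):+.1f} sigma; "
      f"biggest fake 'dip': {min(pulls):+.1f} sigma, from a smooth truth")
```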

  38. Matt: “There’s the potential for false negatives. Dark matter may be made of particles that don’t annihilate or decay. …
    Meanwhile, false positives are possible too: … ; pulsars have been suggested as a source.”

    In addition to the many good reasons you have listed in your article, there is another viewpoint from which to address this dark matter issue.

    LHC, Planck and AMS are great gadgets, and they produce great facts (data). However, these gadget facts are differentiated truths. Integrating them is more important than those truths themselves. Yet, there are many different ways of integration.
    a. Hodgepodge integration — the similar truths are mixed together to form a hodgepodge, such as with the discovery of electron, proton, quark, …, neutrino, … to construct the Standard Model.

    b. Mission-based integration — for particle physics, it has, at least, two types of mission.
    i. Public mission — to explain:
    1. How did life arise?
    2. How did universe arise?

    ii. Physicist mission:
    1. How to unify the quantum and determinism.
    2. How to encompass the gravity.

    However successful the Standard Model is as gadget facts, it is junk if it fails at the above missions. On the other hand, if a particle theory is a mission-completed theory, it will definitely be Nature’s physics, as it has integrated many gadget truths and fulfilled the required missions.

    Mission-criteria epistemology (based on integrated facts) is more powerful than any epistemology based on gadget facts (which are only differentiated truths). Mission-completion is the only criterion for a particle theory.

    Can this AMS data make the SM/SUSY mission-complete?

  39. In view of all this, and in view of the fact that it was all well-known long before this, what actually is the point of this experiment?

    1. a) First, the experiment is late, partly because of being dependent on the space shuttle and the space station. Had AMS been on schedule, it would have beaten its competitors and made these discoveries, and you wouldn’t ask this question. Losing out on a discovery is a common cost of running late. The point of the experiment is therefore less sharp than it would have been back in 2006.

      b) Second, the fact that the experiment already has made much better measurements than its predecessors gave it the possibility of discovering something the others hadn’t seen. Clearly that was part of the point, even though the experiment was late. The fact that there’s nothing obviously new in their current data is something that couldn’t have been known until today.

      c) Third, over time AMS *will* be able to do things previous experiments can’t do at all. So discoveries are still possible. And that’s clearly another point.

      But you’re right to ask how this experiment lost so much of its advantage over the past decade. That’s not a question I can answer.

  40. Re: “AMS finds that the positrons seem to come equally from all directions”
    Wouldn’t one expect high energy positrons from dark matter annihilation to have a pronounced peak towards the galactic center?

    1. This is complicated. The galaxy’s magnetic field scrambles the trajectories of charged particles, sending them in all directions. Nearby pulsars that might produce positrons might be so close (relatively) to Earth that perhaps the magnetic field hasn’t managed to do this.

    2. I think that simulations of dark matter distributions that give rotation curves (angular velocity vs. distance from centre) like those seen for galaxies suggest that dark matter does not congregate particularly at galactic centres. Sorry I can’t think of a reference off-hand for this, but I think there is something in Sparke and Gallagher, and probably other standard texts on galaxies. However, I am sure that there is a strongly-held contrary opinion somewhere in the literature.

      Regards,

      David Perkin.

      1. No, indeed, dark matter does congregate at the center. Yes, there is a large halo of dark matter extending outward far beyond the stars of the galaxy, but there is a relatively sharp concentration of dark matter at the center of the galaxy. And the rate for annihilation is proportional to the square of the density of dark matter particles (since two such particles have to find each other.) So the annihilations (if they occur) are much more concentrated at the center of the galaxy than the dark matter itself. (They may also be found in dwarf galaxy cores, and possibly in scattered dark matter clumps.)

          1. The main assumption is that DM does have gravitational interactions with “standard” matter, so it should be expected that in the presence of gravitational pulls it will behave much like “standard” matter, congregating around strong gravitational fields, like those at the centers of galaxies.
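          To make the density-squared point concrete, here is a minimal sketch using the standard NFW parametrization of a dark matter halo; the scale radius and the two sample radii are assumed, illustrative values. Note how much more sharply the annihilation rate is concentrated than the density itself.

```python
# Minimal sketch of why annihilations concentrate toward the galactic
# center even more than the dark matter itself: the rate per unit volume
# scales as the density SQUARED. The NFW profile is a standard
# parametrization; the scale radius and sample radii are assumed values.

R_S = 20.0  # assumed NFW scale radius, in kpc

def rho_nfw(r_kpc, rho0=1.0):
    """NFW density profile: rho(r) = rho0 / ((r/rs) * (1 + r/rs)^2)."""
    x = r_kpc / R_S
    return rho0 / (x * (1.0 + x) ** 2)

r_inner, r_sun = 1.0, 8.0  # kpc: inner galaxy vs. roughly the Sun's radius
density_ratio = rho_nfw(r_inner) / rho_nfw(r_sun)
rate_ratio = density_ratio ** 2  # two particles must find each other

print(f"density(1 kpc) / density(8 kpc)      ~ {density_ratio:5.1f}")
print(f"annihilation-rate ratio (density^2)  ~ {rate_ratio:5.1f}")
```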

  41. Matt: “Despite what you may read, we are no closer to finding dark matter than we were last week. Any claims to the contrary are due to scientists spinning their results (and to reporters who are being spun).”

    Amen. It is nice to have a cool-head in this news-hyped world.

    1. Thanks. The thing that is really pissing me off (and surely not only me) is how many stories in the news media are completely neglecting the earlier work of PAMELA and FERMI — as though they never existed — even though theorists have been discussing their work actively, and writing hundreds of papers about their results, for four years!

  42. Actually, the plot already seems to have a small drop just at the mass of the W particle… Perhaps in three years they will be able to detect the W and Z bosons 😀

    1. 🙂 — well, if they manage to detect W and Z bosons out in space, that will be a huge discovery! Don’t they have lifetimes of 10^(-24) seconds?! 😉

        1. Of course a W produced in the initial process (whatever it is) will decay… to electrons, in the end, thus giving an extra contribution to the spectrum of positrons.

  43. Hi Matt. You wrote: “most positrons are expected to produced only as a by-product — when the high-energy electrons hit something out in space — which means the positrons would generally have lower energy than the electrons that were needed to produce them”.
    In my opinion you are wrong twice:

    1. Most positrons are produced by *proton* collisions, if any. Not electrons.
    2. The positron spectrum is steeper due to diffusion processes in the turbulent magnetic fields: it has nothing to do with the kinematics of their production.

    1. You are right on point 1; I was simplifying to keep the post short, but indeed, I should have said that I was doing so.

      On point 2 — I don’t yet understand what you are saying. Diffusion in the turbulent magnetic field would affect positrons and electrons the same way — so presumably you mean something other than what you said.

  44. So there was news some time ago that there is an excess of gamma rays from the galactic center at 130 GeV, attributed potentially to dark matter. Do we have a match with this result? It looks to me like probably not.

    Thanks for all the info you gathered here!

  45. Prof. Ting, in today’s NASA press conference, suggested that a sharp drop in the positron fraction above 350 GeV would be a signal that indicated dark matter, while a slow drop-off would suggest another source, such as pulsars. Do you agree?

    1. Dark matter need not give such a sharp cut-off; Ting’s assuming a simple model for the annihilation which is not very likely to be correct. It is true, however, that pulsars are very unlikely to give a sharp cut-off. So yes, a sharp cut-off would likely be from dark matter, but I doubt that’s what will be seen — and it may not be clear, even with 10 times the data (as they’ll have 20 years from now), just how sharp the cutoff is anyway. This is where the ability of the experiment to do accurate energy and charge measurements for the highest energy particles will be critical.
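    For the visually minded, here is a cartoon of the two shapes being contrasted (my own illustration; the spectral forms and the 350 GeV scale are assumed for the example, not fits to anything): a kinematic edge from annihilation versus a gradual, pulsar-like roll-off.

```python
import math

# Cartoon of the two spectral shapes being contrasted (illustrative forms
# and parameters, not fits to any data): annihilating dark matter gives a
# hard kinematic edge at the dark matter mass, since a positron cannot
# carry away more energy than that; a pulsar-like source rolls off slowly.

M_DM = 350.0   # hypothetical dark-matter mass-energy, in GeV
E_CUT = 350.0  # hypothetical pulsar cutoff energy, in GeV

def dm_edge(e_gev):
    """Toy spectrum with a hard cutoff at the dark matter mass."""
    return e_gev ** -2.0 if e_gev < M_DM else 0.0

def pulsar(e_gev):
    """Toy spectrum with a gradual exponential roll-off."""
    return e_gev ** -2.0 * math.exp(-e_gev / E_CUT)

for e in (100, 300, 349, 351, 500):
    print(f"E = {e:3d} GeV: DM edge {dm_edge(e):.2e}, pulsar {pulsar(e):.2e}")
```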

  46. One question concerning dark matter: as you explained here on this site, particles are excitations of fields, but can’t fields also support other kinds of excitations which, while not being particles, still exert gravitational influence?

    1. I’m cautious answering your question because I’m not sure what you’re thinking of. Some types of fields with appropriate types of forces can have particle-like excitations (such as magnetic monopoles) that aren’t particles in the simplest sense but still behave like particles in many ways. In principle dark matter could be made of such things (though not specifically magnetic monopoles.) But there might not be much observational difference between these particle-like things and simple, garden-variety particles. Other sorts of excitations of fields don’t generally give the right gravitational effects… but maybe there are examples I’m forgetting about. And it isn’t yet established that dark matter isn’t made from black holes, as far as I know.

      1. Ok, thanks. I was thinking about anything that cannot be localized, but mostly excitations with a different mass profile: not a sharp localized peak where all the mass is located, but some other function, for example a peak with a long tail, or a Gaussian, etc.
