London, it seems, has much better weather this week than my home town of New York. Too bad I had to spend most of the day at a desk, working on my 20-minute PowerPoint presentation for tomorrow’s scientific meeting, on how to improve and expand the searches for supersymmetry at the Large Hadron Collider (LHC) so that they can cover variants of the theory that the standard searches for supersymmetry don’t address very well. (You can read a bit more about what we do and don’t know right now about less popular variants of supersymmetry by looking at this page and this one; this sequence of articles is still quite incomplete, thanks in part to certain pesky neutrinos.)
I expect an intense day tomorrow. The presentations will be short, leaving a lot of time for discussion. It’s one of my favorite forms of scientific workshop — one where an exchange of ideas can actually lead to new policy and strategy.
Meanwhile, I have been getting great questions from non-experts in response to my “Summary and Open Space for Questions” post regarding the OPERA experiment’s early-arrival neutrinos. A number of people have prefaced their comment with “this is probably a dumb question but…” Well, I have to say that I have scarcely seen a dumb question yet among the ones I’ve been receiving. A dumb question, by my definition, is one whose answer you could have figured out for yourself. That’s not to be confused with an honest question, one that stems from a lack of knowledge. And a smart question, in my book, is one that gets right to the heart of the matter. Some of you who know rather little about the science are asking such smart questions that I have had to go think about the answer for a while before figuring out what I should say.
Of course there’s been a lull in my answers since yesterday, as I’ve been far too busy with other things. But answers will come!
One more thought: many of you are asking questions of the form “What do these faster-than-light neutrinos mean? Is it a sign of extra dimensions? Are they tachyons? Will Einstein’s theory require major modification?” Well, first of all, probably they mean nothing, because probably the experiment is wrong. And if it is right, it is hard to know what it means, because the OPERA measurement provides us with only one piece of information: the early arrival time of neutrinos produced in pion decay carrying an energy of around 20 GeV. Yes, we know a bit more from other experiments that don’t show a similar effect, but still the information is very limited right now. Historically, when a breakthrough has occurred, there has usually been a lot of information, either experimental or theoretical in nature, available for scientists to use in making the big step. And even then that step often took a couple of decades or more! So personally I think we’re getting far ahead of ourselves here. We humans are not very smart, and we often require a lot of hints from nature before we get the point.
11 thoughts on “London, LHC, and Neutrino Questions”
I like the legal terms (a legal court battle: Prosecution vs. Defense, where the decision comes down to weighing each side’s evidence; picture: “balance scale”)
“smoking gun evidence” [ aka irrefutable evidence, hard data ]
“preponderance of evidence” [ strong circumstantial case ]
Physics (& Science in general) moves in a conservative fashion, weighing the evidence (the set of data-points) & building theories. It’s an iterative approach, where data-points & theories are finalized/honed down. Call it CONVERGENCE. It takes TIME, and doesn’t happen overnight! Typically, it’s a long drawn-out process.
“Slow & Steady WINS the race”
— Tortoise VS Hare
[ Hare jumps the gun, gets out front, burns out…tortoise keeps a steady pace, finishes, & wins ]
It’s a universal concept: “In order to Finish First… first you must FINISH.” A successful theory is a WIN, which requires perseverance & execution. Rarely does a discovery happen overnight. It’s like a fine wine: it requires TIME to mature.
I (& others) have looked at the OPERA paper; it seems to have problems. It doesn’t track “sigma” along the entire signal path (analog & digital), especially Quantization Noise. The cumulative effect of the latter can be very problematic; in Computer Graphics it can lead to flat-out SPECTACULAR failure (e.g. hidden surfaces becoming visible!!). There is also generic use of generic statistical models: independent Gaussian PDFs (which are assumptions). Sergei Petrov already pointed out:
“The fact that the authors of the OPERA paper combined statistical and systematic errors as if they were dealing with a linear system [ aX + Y ] and normal [ Gaussian ] distributions tells me that they did not question if their statistical models for estimating errors are applicable at all.”
Another thing is the TT (Target Tracker)/FPGA system used for time-stamping data. The master OPERA clock (20 MHz, derived from the atomic clock of GPS) feeds into the TT/FPGA to create an internal clock (via a PLL/Phase-Locked Loop). It is not clear if this is a source of “jitter”; we (myself & others, incl. FPGA experts) are still querying OPERA co-authors about this. If the input is asynchronous… problems.
Bottom line: the tracking of “sigma” in the signal path could be underestimating the errors, making the result invalid (buried in the noise/“sigma” of the system).
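The error-budget point in this comment can be illustrated with a short back-of-the-envelope sketch. The 60.7 ± 6.9 (stat.) ± 7.4 (sys.) ns figures are the ones quoted in the OPERA preprint; the uniform-quantization model for the 20 MHz master clock is an assumption made here purely for illustration, not a claim about OPERA’s actual electronics:

```python
import math

# OPERA's quoted early-arrival result and errors (nanoseconds)
dt_ns = 60.7
stat_ns = 6.9   # statistical error
sys_ns = 7.4    # systematic error

# Combining the two in quadrature is valid only if both are
# independent and Gaussian -- exactly the assumption being
# questioned in the comment above.
sigma_ns = math.sqrt(stat_ns**2 + sys_ns**2)
significance = dt_ns / sigma_ns
print(f"combined sigma = {sigma_ns:.1f} ns, significance = {significance:.1f} sigma")
# -> combined sigma = 10.1 ns, significance = 6.0 sigma

# RMS error from quantizing a timestamp with a 20 MHz clock:
# the clock period is 50 ns, and a quantization error uniformly
# distributed over one period has RMS = period / sqrt(12).
period_ns = 1e9 / 20e6              # 50 ns
quant_rms_ns = period_ns / math.sqrt(12.0)
print(f"single-stage quantization RMS = {quant_rms_ns:.1f} ns")
# -> single-stage quantization RMS = 14.4 ns
```

A single 14.4 ns quantization stage is already comparable to the 10.1 ns combined error, which is why the question of whether such contributions were tracked (and whether they accumulate through multiple stages) matters for a 60.7 ns effect.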
Since the following question is pretty important for my understanding of nature, and since I have already asked it twice, each time too late to be read, I will ask it again in the hope of getting an answer this time:
Can you please explain to me why processes like the one depicted in this picture do not happen: http://imageshack.us/photo/my-images/97/feynmandiagram.png/ ? The opposite process could happen right afterwards, after a very short while, so it should not be noticeable and thus should not contradict measurements. However, having such a process take place should lower the speed of a photon slightly below the speed of light. Since it probably has a low chance of happening, and since the particle-antiparticle state will quickly decay back into a photon, it should only lower photon speeds slightly. Also, I think this kind of process is compatible with the Standard Model: momentum conservation, energy conservation, spin conservation and particle symmetries should not be violated. Light would have a mass for short, probably not measurable, moments. However, this should not be an argument that is “not even wrong,” since this stuff should be observable in photon speeds. To be more specific, I predict that photons of rising energy will have oscillating speeds, slowly dropping overall. Why do I predict this? First, to explain the speed drop: the higher the photon’s energy, the higher the masses of the particle-antiparticle pairs the photon can decay into, and there will also be a larger number of such kinds of particles, giving such a process a higher chance. The oscillation will happen because every time the photon can briefly decay into a new particle-antiparticle pair, the speed of the pair will be lower; with rising photon energy it will rise too, but a speed drop will occur each time a new particle-antiparticle energy (plus the energy needed for momentum conservation) is reached.
I would very much appreciate an answer. If this question is stupid, tell me so, and maybe why it is stupid. To me, however, this seems to be in line with particle-behaviour approximation methods in gauge theories.
I would be thankful if you answered this question.
I feel I have answered this question a couple of times… hmm, are the questions disappearing off the webpages now? (The system is still new to me.) The point in the end requires a calculation: the answer is, NO, the photon does not pick up a mass from quantum loop effects. This is *not* instantly obvious but it is true. There are a number of errors in your reasoning, but in the end the answer comes down to the fact that the photon interacts via conserved currents, and all the effects that you describe cancel out. The main effect of the process you describe, in the end, is on the force the photon exerts, which turns out to depend on distance in a way that is no longer exactly 1/r^2.
I’ll take a shot at asking a stupid question.
Can anyone tell me why the OPERA data cannot have been caused by gravity waves?
There are many possible answers, but here’s a simple one: anything (including gravity waves) that could affect space and time themselves at the level claimed by OPERA would screw up the GPS system.
Hi Matt. Thanks for the answer.
But I was thinking more about distance travelled/elapsed time on the ground than about the GPS system.
The LIGO experiment came to my mind. Pity OPERA only has one beam.
Right — my point was that LIGO is an extraordinarily sensitive instrument and can detect incredibly faint gravitational waves, whereas any gravitational waves big enough to have any effect (leaving aside the fact that it would be the wrong effect) on OPERA would be large enough that we would have detected them long ago.
Has any peer group tested the SOL particle? Has the experiment been done by any other facility? If not now… then when?
Have other tests been done yet, as of 11/19/2011?
See today’s post (11/20) and work backwards; my post http://profmattstrassler.com/2011/10/26/a-few-tidbits-from-nagoya-including-opera-news/ answers some of your questions, and look in the comment section for words about the MINOS experiment’s plans. Let me know if those posts don’t answer your questions.
The OPERA team should be congratulated not for beating the speed of light, but for being the first to measure the rest mass of a neutrino. Presumably the neutrinos were (incorrectly) considered to have been sent at the time of impact. However, the eigenfunction of the neutrino would become significant when the distance between the colliding particles is of the order of the wavelength of the neutrino at rest. This happens at a time 6E-08 seconds before the impact. The frequency ν (nu) of the neutrino is the reciprocal, 1/(6E-08) per second. From mc^2 = h ν (nu), the rest mass of the neutrino can be calculated to be approximately 2E-41 grams.
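As a quick numerical check of the arithmetic in the comment above (a sketch only, taking the quoted τ = 6×10⁻⁸ s at face value): the quoted 2×10⁻⁴¹ g actually follows if one uses the reduced Planck constant ħ = h/2π, i.e. mc² = ħ/τ; using h itself, as the formula as written would suggest, gives a value about 2π times larger.

```python
# Back-of-envelope check of the rest-mass estimate quoted above.
h = 6.626e-34                        # Planck constant, J*s
hbar = h / (2 * 3.141592653589793)   # reduced Planck constant, J*s
c = 2.998e8                          # speed of light, m/s
tau = 6e-8                           # quoted time before impact, s

nu = 1.0 / tau                       # frequency, Hz

# Using E = hbar * nu reproduces the comment's ~2e-41 g:
m_hbar_g = hbar * nu / c**2 * 1000.0   # kg -> g
# Using E = h * nu instead gives a value 2*pi larger:
m_h_g = h * nu / c**2 * 1000.0

print(f"m (with hbar) = {m_hbar_g:.2e} g")   # ~ 2.0e-41 g
print(f"m (with h)    = {m_h_g:.2e} g")      # ~ 1.2e-40 g
```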