I like this. Most of the time the reporters overstate the research, the scientists keep the data secret, and the general public is left scratching their heads.
Second, they compared the neutrino results with light results. Third, they asked for the results to be replicated in some other environment. This is normal scientific procedure.
Please also note the number and variety of authors on this work. They will make sure that there is no overstating, understating, secrecy, or head scratching.
I am expecting one of two things to happen:
a. A systematic error in the measurements, which is most likely IMO.
b. A modification of physics, similar to how relativity was a modification of Newtonian physics.
Most of the time reporters misunderstand the research and then oversell it in order to get their story read. The general public is usually left scratching their heads because of this bizarre layer of clueless journalists between them and the scientists. Yes, some scientists are hard to understand, but there's no hope in trying to understand particle physics filtered through the soft spongy grey matter of a liberal arts major! Scientists are quite often educators who are used to speaking to a wide range of audiences. Journalists would do far far better to introduce the subject and then let the scientist explain it in their own words. I have no idea why this is not done more.
As for the FTL neutrinos at CERN... I'd hold on to skepticism for now. The reason they're asking for help finding the glitch instead of proudly proclaiming that they've found tachyons (FTL particles) is because this would be a truly bizarre result that would upend a century of physics that has, to date, proved remarkably accurate in its predictions. If this discovery proved true it would literally force us to rewrite physics from ~1900 onwards. Physicists are not afraid of this. Quite the contrary! It would touch off a bonanza of research! There would be so much work to do that the field would literally explode. It's just a lot more probable that there really is a glitch somewhere and we simply haven't found it yet.
This actually would not be the first time that tachyons were "found". A group in the 70's found statistically significant evidence for them preceding cosmic showers. They were so sure of their results that they published. Then, a year or two later, they discovered that it was just a glitch in their electronics! These guys were small fry compared to CERN, but CERN's accelerator is so complex that glitches could be very hard to find. The scientists at CERN are clearly out of ideas and mean just what they say: They don't believe their results and they want help finding the glitch that's probably causing them. It's extremely irresponsible to report this story as if FTL particles have really been found before CERN has had a chance to consult with outside scientists and other facilities have had a chance to try reproducing their results.
Publish != tell the media. Publish != share raw data.
I have yet to encounter a group in my field that refused to share raw data when asked for it. They're usually happy to have another set of eyes look for a meaning they didn't publish (this could lead to another paper they might not have to work very hard for!). Again, is it really secret if all you have to do is ask for it? Corresponding author email addresses are on papers for a reason!
I will say that it is unusual for a group of scientists as prominent as the ones at CERN are to go to the media before publishing, but they are clearly very unsure of their results and would rather have a lot of eyes looking for bugs before they commit to publishing their results. This, of course, won't accomplish much if they try to keep secrets, as you are suggesting.
In fact, here's a challenge for you. Go ask CERN for their FTL neutrino data. I bet they'll be happy to give you a copy. They'll probably even have a grad-student to spare to explain some of it to you, although his or her time will probably be at a premium for the next few weeks!
About the 70's group, can you share the link to the article?
On the flip side, I would be certain that certainty is a particular kind of hell, if having such knowledge didn't condemn me to it. Excuse me while I proceed to consume my tail: http://en.wikipedia.org/wiki/Ouroboros
I so, so dig your 'the more we learn, the less we know' dialogue.
10: Question Reality
20: Verify Results
30: GOTO 10
if theory_of_everything_appears or strong_ai_appears:
    raise Exception("rethink loop")
In scientific studies, a question is asked and an answer is given. That answer then leads to newer questions, further probing the ultimate. One can approach, but never completely know.
In the Occult, one accepts a maxim or answer as an answer. Questioning hinders or derails the pursuit of what is desired. I am speaking of the ultimate goal of the study of the esoteric: understanding of self.
(You mentioned an occult icon, thereby opening this line of commentary)
I forget the source, but they later tested their setup with random input and found it predicted tachyons from that as well.
Um, no. There are plenty of data being released, and in even more cases the scientists are happy to give you the data if you ask. It's very dependent on discipline, though. In astronomy, pretty much all data obtained with publicly funded facilities (e.g. including all large space instruments) are accessible by anyone after a proprietary period of 1 year.
About the later date: I think it's reasonable that those that put together the proposal and thought up the project have a chance at getting their results out first. Otherwise, why would you go through the effort of planning and executing observations if you could just poach someone else's data as it comes down the pipe? Think of it as a 1-year patent on the research project.
Some argue this isn't good enough (and I have some sympathy for those arguments). But it's very different to a deliberate action to hide or obscure information.
That said, it's very true that this is worlds different from maliciously withholding data.
Go down to your local university library. Go over to one of the computers (or hop on the wifi). Read the article. Email it to yourself. Not that hard now was it?
This is not ideal, but it's not that you _cannot_ access it, it's just not convenient.
Which is less cautious and more of the standard fare of not-very-well-reported science.
In a better world, that would be true.
In this world, it's not.
Most of the climate science data that Phil Jones and others have worked with, and most of their models are secret.
It is THIS that makes me keep at least one ear open for the "anti-" side in the climate debate. Real science SHOULD be conducted out in the open.
For example, the three models developed by NASA: http://data.giss.nasa.gov/modelE/ar4/
Direct links to each model + code + datasets:
http://www.giss.nasa.gov/tools/modelE/ - with nightly code snapshots here: http://simplex.giss.nasa.gov/snapshots/
http://aom.giss.nasa.gov/ - with direct access to code here: http://aom.giss.nasa.gov/code.html and here: http://aom.giss.nasa.gov/code4x3.html
http://www.giss.nasa.gov/tools/modelii/ - seemingly work on this has now been merged with the ModelE simulation.
Lots of other code and data is available if you take the time to find it ... and then considerably more time to understand it.
Still, in the larger context, validating someone's data is a basically worthless activity. It's far more important to go out and collect new data, build a new model, run a new simulation, and report those results than it is to try to shortcut the process. The data deluge is making it way too easy for people to validate bad science using the same bad data / model as someone else. Fundamental assumptions in statistics break down when you use the same 10 coin flips to substitute for the 10 million coin flips that cost more than you're willing to pay for.
PS: The failure to collect significant quantities of new data is a hallmark of fringe / bad science. If you really think global warming is a joke, or that you can be healthier by eating Cocoa Puffs, then go collect that data, do your analysis, and see if someone can poke holes in your process.
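A toy simulation of the coin-flip point above, entirely my own sketch (fair coin, fixed seed, illustrative only):

# Re-"validating" the same 10 flips never gets you closer to the
# truth, while collecting fresh flips does.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(10)]
# Analyzing the same sample 1000 times always yields this one estimate:
print(sum(flips) / 10.0)
# Pooling 10,000 fresh flips converges toward the true rate of 0.5:
fresh = [random.random() < 0.5 for _ in range(10000)]
print(sum(fresh) / 10000.0)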
The larger model is based around how the extra heat retained gets distributed around the world, and everyone knows this is far less accurate. The people building it are well aware it's highly limited, but by trying to balance the errors in both directions they hope to create a good estimate. The problem is that when you start tweaking the heat-distribution equations, it's easy to find errors in every direction, because they are greatly simplified, so it's easy for someone to introduce systematic errors by only correcting issues in one direction.
Which is why you provide a high-level description, but when you release your source code you open yourself up to a lot of fruitless debate. Sort of like someone saying that, based on your methodology, the rocks are really 1/1,000,000th of a year older, which means your science is a joke and less valid than my consistent theory that god created the world 8 weeks ago. But if you actually build a model from scratch under similar assumptions, it's going to have similar outputs and error bars, or it will look really biased.
I don't think they're "secret". I think they assume anyone they haven't heard of who is asking for the info is a troll.
Business-sponsored science is very often secret, but please don't suggest, as the parent commenter disingenuously does, that scientists are the root of data secrecy.
There have now been six separate inquiries into the misguided, corporation-driven, politically motivated, and frankly dumb charges that climate science data is faulty or is associated with some kind of scientific misconduct. All six separate inquiries indicated it was not:
Moreover, scientific organizations are overwhelmingly making changes that make their data even more accessible:
First of all, "climate science" != "science as a whole".
Second, if any scientific field ends up in the crosshairs of a well-funded politically motivated movement working against it at any cost, then it will become hostile towards skeptics and eventually start stonewalling them. This happened in biology re: critics of evolution (in the anti-evolution camps there are tons of complaints about censorship, refusal to address criticisms, provide data, share models and code, etc.), and whereas it's always possible to address criticisms, the other side is always willing to spend more money to poke new "holes" in the theory, and you'll just never win.
"Skeptics" like to portray the ultimate goal of science as if it's to continually try to poke holes in existing theories, because progress is made when theories are disproven. But that's not science. Science is when you mix a desire to poke holes in theories with a reasonable threshold of acceptance, a willingness to step back and say "Okay, we've spent a lot of energy coming up with arguments as to why this is wrong, and they've been addressed satisfactorily, so this is probably right, let's see if this takes us anywhere new rather than getting stuck arguing the point".
Climate skeptics lack the willingness, even in theory, to accept that enough is enough. I don't believe for one moment that there is, even in principle, any piece of evidence that would lessen the amount of criticism against AGW. I don't believe they would give in, because we've already seen that no amount of evidence reduced the amount of anti-evolution noise, and it's by and large the same sources of funding and political motivation that are behind both movements.
Now, it very well may be the case that climate scientists also lack the willingness to admit that they're wrong (actually, I think this is pretty likely), in which case they're not doing science, either, and that's bad. I'm totally in agreement there.
But they're taking exactly the approach to the skeptics that they would (and arguably, should) be taking if they were doing proper science. So the way they deal with skeptics is not a point of evidence either for or against the honesty of their scientific investigations, it's merely an inevitable result of the politicization of their field.
If you can't make an argument without resorting to name calling in the first line, please excuse me while I disregard anything at all you have to say. kthxbye.
This has happened in other fields.
In this case, it's "this fact that we've believed to be completely fundamental to physics for the past century appears to be wrong," and if you stated that without asking for people to check your work carefully because you are puzzled too, people are going to laugh at you if it turns out that you were wrong somehow. I'm not sure how you really could report this without downplaying it; it's just too incredible a claim for people to believe if you did anything close to overstating it.
Inherently impossible: a troubling myth that is made all the more troubling by the fact that people believe it.
Impartiality is gone the moment the camera is aimed over here as opposed to over there.
Until it steps on the state's toes.
I doubt funding can ever be divorced from control, even if it is indirect. The BBC knows where its next meal comes from.
That said, the GP does have a point. The only thing preventing the possibility of closed-door deals going on between the BBC and political parties is good faith. Ergo, if it's not already happening or hasn't already happened, which is unlikely, there's no reason it can't happen in future.
Note: despite Australia's high exposure to UK content, I don't know much about the British media landscape. I haven't read the BBC's charter (nor the ABC's, for that matter).
It's a tradeoff of different goals. On one hand, you get a news agency which can focus on news, not entertainment as would be dictated by a commitment to profit. On the other hand, that news agency is more susceptible to decisions by government. Consider: what determined the BBC's spending cuts?
I think publicly-owned news organisations are a great idea. The ABC, BBC, and NPR usually provide great news coverage. They have no need to sensationalize content, and that serves the public to no end.
Political parties could equally make backroom deals with private media companies as well. It would also be considerably less risky, because they wouldn't have to violate the Royal Charter.
The problem with backroom deals concerning the license fee is that they have to remain secret, yet affect how the news is reported without drawing suspicion. Any control would have to be very subtle.
A government inquiry into the BBC's reporting would be even worse, as it would be a clear violation of the BBC's independence.
That's why you don't reduce it right away. You threaten to reduce it if they're impolitic enough to pursue certain courses. You make this fairly clear to them. The actual decisions come from the BBC's own leadership, acting on such knowledge.
> A government inquiry into the BBC's reporting would be even worse, as it would be a clear violation of the BBC's independence.
Freedom's just another word for nothing left to lose. The only independence the BBC can have under its funding model is if it decides to risk going downhill and no longer being able to do the good things it's doing now. A reduction in license fees would accomplish that, meaning that if you can wave that stick around, the BBC will waive its independence and allow investigations.
And the next day there's a headlining report on government corruption.
I'm not quite seeing how the government can threaten the BBC, when they have so much to lose if those threats become public.
Plain and simple, this is most likely due to a systematic error in their experiment that isn't being properly taken into account. The result would tear apart well established theories that have been tested time and time again in thousands of different ways. Of course that doesn't mean that they're absolutely right but it does mean that any contradictory result has to be initially taken with a grain of salt (kudos to the article writer for doing this). It's easy to mess up a calibration in such a complicated system and 60 nanosecond errors could potentially pop up. It will be interesting to see the results from other labs but I would advise against getting your hopes up for any new physics.
Could a physicist clarify whether neutrinos can travel at different speeds in a vacuum like any 'normal' particle, or whether they are 'fixed speed' like photons? We could well have had several waves, with the bulk arriving at c-epsilon and no one bothering to check outside the expected time frame.
If neutrinos have mass, they can travel at any speed <c depending on their energy, like any other massive particle. Of course, the upper mass limit is quite low, so even with very little energy they will go very fast.
(Edit: In fact, the arrival interval of those neutrinos puts an upper limit on the neutrino mass, because the larger the mass, the more different the speed and the more dispersed the arrival of them at Earth.)
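A minimal sketch of that energy dependence, with illustrative numbers of my own (a ~1 eV rest energy and a 10 MeV neutrino, not OPERA's values):

import math

def beta(rest_energy_ev, total_energy_ev):
    # v/c for a relativistic particle: sqrt(1 - (m*c^2 / E)^2)
    return math.sqrt(1.0 - (rest_energy_ev / total_energy_ev) ** 2)

# A ~1 eV neutrino at 10 MeV (supernova-scale energy):
print(1.0 - beta(1.0, 10e6))  # ~5e-15 below c, utterly negligible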
It could be some phenomenon similar to what we see in emission spectra, where a pure source can emit photons at multiple precise energy levels, except that in the case of particles with mass (apparently neutrinos have mass) this would result in a difference in speed. Not a big enough difference to be noticed over 700 km, but enough to make a 4-year difference over 170 kly.
Of course this is still wild speculation without much background.
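For scale, a quick back-of-envelope, assuming the OPERA fractional excess simply scales with distance (168,000 ly is the usual SN 1987A figure):

early_s = 60e-9                           # reported early arrival, seconds
baseline_s = 732e3 / 299792458.0          # light travel time over 732 km
fractional_excess = early_s / baseline_s  # ~2.5e-5
distance_ly = 168000                      # approx. distance to SN 1987A
print(fractional_excess * distance_ly)    # ~4.1 years early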
That's what makes this so exciting!
Failing that, I'll at least wait three years for the "Ohhh yeah, so we checked again and it turns out that [simple explanation X]".
On the offchance that this is a startlingly new breakthrough in physics, I don't lose much by waiting three more years before I believe it.
If you assume nothing travels faster than light and you have a burst of neutrinos a couple of years before an event, you'll look for other explanations or maybe simply ignore the readings. Generally they expect the neutrino burst a couple of hours before the photons arrive.
Now that it's suspected that a flavor of neutrinos can travel faster than light, maybe there's going to be some data digging to find correlations...
The universe is full of mysteries, and it's highly likely that Einstein's theory, although verified and correct in many cases, is incomplete, not to say totally incorrect, in other cases, just as Newton's theory was.
Let's not assume anything.
I very much doubt the neutrinos' path was evacuated to the same degree of vacuum as outer space. It is possible that neutrinos travel at the same speed as light in a vacuum, but much faster when there's other material in the way.
A similar thing could be happening here.
I bet you there is a systematic error, but I don't think it's this one.
My general rule with physics is that if I can think of it, then a real physicist will laugh at its triviality.
The common example is the beam of a lighthouse. Suppose a lighthouse revolves once per minute. At one lightyear distance from the lighthouse, the angular velocity of the beam is 2*Pi lightyears per minute, which is much faster than the speed of light. However, this is not a problem: the beam is not a physical entity. There is no single particle actually travelling faster than the speed of light. It's just some construct in our minds to which we assign that velocity.
Something similar is happening in the experiment described by your parent and possibly in the experiment in the linked article.
Actually, that's the linear velocity. The angular velocity would be 2*pi radians per minute and is independent of distance.
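A quick sketch of the numbers with toy values of my choosing (one revolution per minute, a one-light-year radius):

import math

omega = 2 * math.pi / 60.0  # one revolution per minute, in rad/s
d = 9.4607e15               # one light year, in metres
c = 299792458.0
# Linear sweep speed of the beam spot, as a multiple of c:
print(omega * d / c)        # ~3.3 million times c, yet nothing moves FTL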
But it's pretty easy to check that everything is alright with group velocity. If there is zero neutrino production at t=0 (in the reactor's frame of reference), and then the production jumps up, there can be no doubt about the group velocity.
Bizarre effects relative to superluminal phase velocity only appear when you start generating particles before t=0 and you mess with the medium in between. Then the "superluminal" illusion is created by quanta created before t=0.
TLDR: I find it bizarre that seasoned professionals could be fooled by phase velocity or a similar effect.
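For anyone curious, a minimal illustration (arbitrary units of my own) of how phase velocity exceeds c while group velocity stays below it for a massive particle:

import math

# Relativistic dispersion: w(k) = sqrt((c*k)^2 + w0^2), w0 ~ m*c^2/hbar
c, w0, k = 1.0, 1.0, 2.0
w = math.sqrt((c * k) ** 2 + w0 ** 2)
v_phase = w / k          # > c
v_group = c * c * k / w  # dw/dk, < c
print(v_phase, v_group, v_phase * v_group)  # the product is exactly c^2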
Yes, thank you. Why did it take me so many words? :)
In the early 1980s, first measurements of neutrino speed were done using pulsed pion beams (produced by pulsed proton beams hitting a target). The pions decayed producing neutrinos, and the neutrino interactions observed within a time window in a detector at a distance were consistent with the speed of light. This measurement has been repeated using the MINOS detectors, which found the speed of 3 GeV neutrinos to be 1.000051(29) c. While the central value is higher than the speed of light, the uncertainty is great enough that it is very likely that the true velocity is not greater than the speed of light. This measurement set an upper bound on the mass of the muon neutrino of 50 MeV at 99% confidence.
The value looks awfully like what we have in front of us today, but the uncertainty was too big to investigate further.
EDIT: actual paper: http://arxiv.org/abs/0706.0437
OPERA needs an intense and energetic beam of muon neutrinos traveling a distance of hundreds of kilometers to look for the appearance of oscillated tau neutrinos. A beam of this type is generated from collisions of accelerated protons with a graphite target, after focusing the particles produced (pions and kaons in particular) in the desired direction. The products of their decays, muons and neutrinos, continue to travel in generally the same direction as the parent particle. Muon neutrinos produced in this way at CERN cross the Earth's crust, reaching OPERA after a 732 km journey.
I didn't mean to try to 'sell' anything, though I agree that my comment was needlessly biased here. Revised it accordingly.
What does the "(29)" in "1.000051(29) c" mean?
So 1.000051(29) would be equivalent to 1.000051 +- 0.000029.
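A throwaway helper, entirely my own sketch, that only handles this simple "value(digits)" form (no exponents or asymmetric errors):

def expand(compact):
    # "1.000051(29)" -> (1.000051, 2.9e-05): the parenthesized digits
    # are the uncertainty in the last digits of the value.
    value, unc = compact.rstrip(")").split("(")
    decimals = len(value.split(".")[1])
    return float(value), int(unc) * 10 ** (-decimals)

print(expand("1.000051(29)"))  # (1.000051, 2.9e-05)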
For example, they measured 1.00005129c, but maybe their uncertainty is +-0.0000001c. The last digits are unreliable, but they are the measurements they got.
EDIT: below, it's claimed that those digits ARE the uncertainty. That quite possibly is it; my answer is from what I remember about uncertainty in my first-year physics course.
edit: sorry if that's incorrect in this case. "1.000051(29)" is one of the formats of indicating repeating decimals, per http://en.wikipedia.org/wiki/Repeating_decimal#Notation
Would someone with some knowledge of physics care to break down the ramifications of this (if it's not some sort of measurement error)?
Special Relativity holds only for a space with a static and continuous metric (to preview the complexity of the theory otherwise, try playing with Maxwell's equations where integration happens over non-static volumes/surfaces :). The expansion of space doesn't fit the premises of Special Relativity, and thus there is no contradiction. Special Relativity is a pretty artificial theory which applies to the real world of our expanding (i.e. non-static metric) Universe only by approximation, and this approximation works only as long as the speed of expansion is slow and either the time period under consideration is small or the distances are small (but not very small, as the space-continuity premise of SR seems to break down as we get to Planck distances).
In this particular case of the neutrinos, I think that if they integrated the Earth's gravitational time dilation variations along the neutrinos' path, it would be a correction on the scale of around this billionth of a second.
Isn't that effect in the reference frame of the neutrinos and not the measuring instrument (I'm not too sure about this)? And also, wouldn't it mean that the neutrinos take more time instead of less time?
Consider what we know about the Universe but can't fully explain so far. We know that space-time is dominated by a "dark energy" that may be due to background quantum vibrations or some as yet completely unknown phenomenon. We know that most of the mass of the Universe is bound up in dark matter which at present seems most likely to be a weakly interacting massive particle, specifically neutralinos (a supersymmetric particle). Neutralinos are thought to be their own anti-particles, they could have been responsible for bizarre "dark stars" in the early Universe and for an ongoing gamma ray flux due to neutralino-neutralino annihilations. Moreover, most of the dark matter in the Universe would essentially be the cosmic neutralino background, an echo from the big bang analogous to the cosmic neutrino background and cosmic microwave background. There is already tantalizing experimental evidence that such dark matter particles exist, from multiple sources.
Additionally, recent evidence seems to hint strongly at the existence of strange matter. Unlike ordinary quark-based matter such as protons, neutrons, and other baryons and hadrons, strange matter would consist of up, down, and strange quarks in a more-or-less liquid-like state, instead of individual quarks being bound up in 2- or 3-quark composite particles. It is possible that some of the so-called "neutron stars" out there are in fact strange-matter stars. It's also possible that such matter is stable in effectively arbitrarily sized fragments, down to individual tons, grams, or even atomic-nucleus scales. Who knows what kind of post-nuclear physics we could perform if we got our hands on such particles. One possibility would be to use them to create energy via fusion. Tossing particles at a strangelet with sufficient speed should cause them to dissolve and fuse with the strangelet, releasing more energy in the process than conventional fusion.
And that hardly touches the surface. There are even more bizarre things out there like micro black holes, the matter vs anti-matter chirality, the possibility that superconductors serve as mirrors for gravitational waves, and so much more. There's plenty of star trek out there waiting for us.
For example, there is a certain speed where, if you exceed it, you are able to violate causal time relationships. I can't think of any experiments that would validate this. However, there is also the fact that, theoretically, if you attempt to accelerate matter to the speed of light its mass will increase infinitely. So if you accelerate it a little bit, its mass should increase a little bit, and you should be able to confirm the speed of light through an experiment where you measure infinitesimal increases in mass during large acceleration.
So my comment is that if he just broke the speed at which light travels, then everything is fine. But if he broke the speed at which you are able to violate causality, or the speed at which the mass of an object is infinite, then our entire understanding of physics is likely to be invalid.
Related reading - tachyon pistols
So you can't just use c for "the highest speed any particle can have in vacuum".
Photons with a non-zero rest mass would break a lot of theories.
Plus, the "speed of light" is not merely an experimental result coming out of an interferometer. It's also a theoretical result, e.g. from Maxwell's equations - that's the one referred to by special relativity.
There is a good description of what is going on in this Stack Exchange post:
It explains why saying "c is the speed of light" makes sense, because when we say light is traveling more "slowly" through a material, we are including the time spent interacting with the material, being absorbed and re-emitted.
I'm bristling a little at your statement that "the speed of light is not constant". Imagine two men walking at the same speed from A to B. But one of them is walking in a straight line, while the other is zig zagging. It would be fair to say that the one walking in a straight line is travelling from A to B faster, even though they are both moving at the same speed. The speed of light is a constant, it is just that light travelling through a medium doesn't necessarily spend all of its time travelling in one direction.
I think this is wrong. Regardless of the medium, the speed of light is always constant. It seems to slow down because the photons are getting absorbed and re-transmitted by atoms. But the speed of light is always the same regardless.
"Light traveling within a medium is no longer a disturbance solely of the electromagnetic field, but rather a disturbance of the field and the positions and velocities of the charged particles (electrons) within the material. The motion of the electrons is determined by the field (due to the Lorentz force) but the field is determined by the positions and velocities of the electrons (due to Gauss' law and Ampere's law). The behavior of a disturbance of this combined electromagnetic-charge density field (i.e. light) is still determined by Maxwell's equations, but the solutions are complicated due to the intimate link between the medium and the field.
Understanding the behavior of light in a material is simplified by limiting the types of disturbances studied to sinusoidal functions of time. For these types of disturbances Maxwell's equations transform into algebraic equations and are easily solved. These special disturbances propagate through a material at a speed slower than c called the phase velocity."
As another commenter pointed out, you're splitting hairs. The "speed" of light and how fast it is "moving" depend on how you define those terms. The second paragraph of the Wikipedia speed-of-light article has it right: "The speed at which light propagates through transparent materials", which does change.
Here is a relevant piece:
When light enters a material, photons are absorbed by the atoms in that material, increasing the energy of each atom. The atom will then lose energy after some tiny fraction of time, emitting a photon in the process. This photon, which is identical to the first, travels at the speed of light until it is absorbed by another atom, and the process repeats. The delay between the time that the atom absorbs the photon and the time the excited atom releases a photon causes it to appear that light is slowing down.
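For the practical upshot, a minimal sketch using the standard c/n rule for the effective propagation speed (textbook round numbers for n, nothing from this thread):

c = 299792458.0
for name, n in [("vacuum", 1.0), ("water", 1.33), ("glass", 1.5)]:
    # Effective speed through the medium, in m/s:
    print(name, c / n)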
If the photon is traveling less D over the same amount of T, I am ok with saying the velocity is lower, and it has slowed down.
Nobody would say that they slowed down if they increased their speed as they went through a turn.
And as for the "c" in E=mc^2: doesn't this suddenly make "c" an unknown constant? Doesn't the fact that "c" changes suddenly change the values of the other variables in that equation as well? That seems pretty fundamental to me...
So if there's something faster, it changes our understanding of photons but not the existence of this fundamental maximum speed.
As you note, our efforts to measure c may have been off due to measuring the wrong thing, but I don't know the ramifications of a small % change in c.
(I'm not a physicist)
(Another basic assumption, this time for general relativity, is the equality of inertial and gravitational mass, which is not a self-evident thing. However, so far no difference has been found; see http://en.wikipedia.org/wiki/E%C3%B6tv%C3%B6s_experiment.)
I don't know whether changing c by this amount would break many experimental results. Adding a rest mass to photons sounds potentially revolutionary.
Photons speed up and slow down routinely, depending on what medium they're traveling through. c, as it is used in the equations of relativity, is currently believed to be equal to the speed of light in a vacuum. But, with my limited knowledge of GR, my understanding is that gaika is correct and that the rest of the theory can still stand if this equality is broken.
A photon's instantaneous speed is always the speed of light.
Do they really? As far as I know, their speed is always constant in any medium. They just seem to slow down because they get absorbed and re-transmitted. That is where the loss of velocity comes from. When traveling between one atom and another, which is a vacuum, they are always traveling at the speed of light.
The post in question: http://news.ycombinator.com/item?id=2943950
(I agree with your sentiment, however.)
As per Wikipedia (http://en.wikipedia.org/wiki/Faster-than-light):
"In special relativity, while it is impossible in an inertial frame to accelerate an object to the speed of light, or for a massive object to move at the speed of light, it is not impossible for an object to exist which always moves faster than light."
Print out 90% of physics articles from the past century.
Get big trash can.
Put printout in trash can.
Of course I'm exaggerating, but if the story is true, this is going to be big.
The speed of light is the main blocker for things like time travel and a hard limit on communication speed.
It doesn't matter that it is only 0.0025%; what matters is that it can be done (if it is confirmed). As a (very poor) analogy, the first CPUs were probably about 0.0025% the speed of modern ones, yet look where we are today, in no small part thanks to them. The point is, if true, this may well open a whole new realm in physics, and who knows what we will find there.
What I am most excited about is the possibility of traveling faster than the speed of light. Conventional theory dictated that the more mass you have, the more fuel you would need to break the light-speed barrier; but the closer you approach the speed of light, the more fuel you would need, increasing your mass infinitely and putting you in a perpetual null loop.
But if that equation changes, and we know that there are particles that travel faster than the light-speed limit, we may have to re-examine this theorem. Especially with new power sources being discovered at the atomic and quantum levels: the splitting of those bonds, if harnessed, yields promising potential. Not to mention the existence of antimatter, which releases catastrophic amounts of energy when in contact with matter. We can't seem to find any right now, so we're limited in how much we can make, which is a minuscule amount. The limits on production seem to be at the physics level as opposed to the technological one. But who knows? If the speed of light is up for discussion, almost anything can be on the table.
All this reminds me of the beginning of Mostly Harmless: "One of the problems has to do with the speed of light and the difficulties involved in trying to exceed it. You can't. Nothing travels faster than the speed of light with the possible exception of bad news, which obeys its own special laws. The Hingefreel people of Arkintoofle Minor did try to build spaceships that were powered by bad news but they didn't work particularly well and were so extremely unwelcome whenever they arrived anywhere that there wasn't really any point in being there."
I doubt the experiment will be repeated, but it sure would be awesome if it broke Einstein.
Using ion propulsion and gravitationally assisted slingshot trajectories, you'd get to Proxima Centauri about 173 days 12 hours sooner, which sounds pretty cool if you ignore that that's 173 days of a 19,000 year trip.
By nuclear pulse propulsion (EDIT: invented, but still theoretical, thanks adrianN) taking 85 years, you'd get there about 18 hours sooner.
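A rough check of those figures, assuming trip times shrink by the same ~2.46e-5 fraction as the OPERA result (my rounding comes out slightly below the 173 days quoted above):

excess = 60e-9 / (732e3 / 299792458.0)  # OPERA fractional excess, ~2.46e-5
print(19000 * excess * 365.25)          # ~170 days saved on the slow trip
print(85 * excess * 365.25 * 24)        # ~18 hours saved on the fast trip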
More likely, it's experimental error.
.0025% is meaningless compared to the scale of the earth, but over hundreds of light years it's huge.
If we can send information even a little to the past, we can then send it a little more to the past, and by induction we can send information anywhere in time.
So we have infinite power computers. Singularity starts here.
edit: OK, this blows my mind (from https://secure.wikimedia.org/wikipedia/en/wiki/Tachyon#Speed):
It has been argued that we can avoid the notion of tachyons traveling into the past using the Feinberg reinterpretation principle which states that a negative-energy tachyon sent back in time in an attempt to challenge forward temporal causality can always be reinterpreted as a positive-energy tachyon traveling forward in time. This is because observers cannot distinguish between the emission and absorption of tachyons. For a tachyon, there is no distinction between the processes of emission and absorption, because there always exists a sub-light speed reference frame shift that alters the temporal direction of the tachyon's world-line, which is not true for bradyons or luxons. The attempt to detect a tachyon from the future (and challenge forward causality) can actually create the same tachyon and sends it forward in time (which is itself a causal event).
Yet, we see news like "Kardashian hubby's bad first impression" on the first page of sites like Yahoo "news", disappointing...
Almost certainly this is a measurement error or some other mistake. If a couple of years from now nobody found an error and a bunch of independent groups reproduced the results, then it's time for capital news!
I am impressed that their measuring instruments have a precision that is (statistically significant-ly) smaller than a billionth of a second.
Now, a 10 Gbit/s link probably does operate with a period of less than 1 ns (1 billionth of a second).
That's a whole lot of time delay, not a "tiny fraction" as the reports say. Light travels about 18 m (~60 feet) in that time. A modern CPU will have processed more than 100 instructions in that time interval. So you don't actually need a CERN-quality clock to measure it.
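The arithmetic, for anyone who wants to check (rough numbers):

c = 299792458.0
print(c * 60e-9)   # light covers ~18 m in 60 ns
print(1e9 / 10e9)  # a 10 Gbit/s link's bit period: 0.1 ns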
So if this turns out to be due to systematic errors of various kinds, I'm wondering what other measurements from the lab will be cast into doubt as a consequence!
Exciting result and possibly exciting times ahead for physics.
"The variable speed of light (VSL) concept states that the speed of light in a vacuum, usually denoted by c, may not be constant in most cases."
"In 1998, Magueijo teamed with Andreas Albrecht to work on the varying speed of light (VSL) theory of cosmology, which proposes that the speed of light was much higher in the early universe, of 60 orders of magnitude faster than its present value."
Is it really crazier than nothing being able to travel faster than light? It sure as hell isn't crazier than particles flitting into and out of existence at the subatomic level all the time. The fact is that most things science has uncovered, particularly in the last 100 years, have been nothing short of hysterical. And for this, dear researchers, I thank you.
They pass through matter as if it weren't there AND could be faster than light?
Aliens must have been stupid to use radio waves.
The idea of cosmological redshifts would need revision, and thus our understanding of the universe. But at least c remains unassailable...
Edit: other than the obvious reason of "linkbait", I mean.
Or maybe he was just thinking of ludicrous speed.
It's always disheartening when science gets reported to the media at such an early stage of discovery (i.e. the point where it hasn't been criticised and the lead investigator himself says he isn't so sure). It just opens the flood gates for lazy news outlets to make up moronic headlines like: "Roll over Einstein: Law of physics challenged"
Easily the best biography I have ever read. If you don't like biographies, this will change your mind.
(5 nanoseconds) / ((732 km) / c) = 0.000204776269 percent
(60 nanoseconds) / ((732 km) / c) = 0.00245731523 percent
((60 nanoseconds) / ((732 km) / c)) * c = 16 479.1646 miles per hour
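Those three results reproduce straightforwardly; a quick script in case anyone wants to tweak the inputs:

c = 299792458.0
baseline = 732e3 / c                   # light travel time over 732 km, s
print(5e-9 / baseline * 100)           # ~0.000205 percent
print(60e-9 / baseline * 100)          # ~0.00246 percent
print(60e-9 / baseline * c * 2.23694)  # ~16,479 mph (m/s to mph)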
The speed of light in a vacuum is a known quantity; it isn't changing. This is about relativity being void.
For example: when muon-neutrinos transform into tau-neutrinos, in that moment they could "tunnel" through space-time in some fashion that appears as faster-than-light travel. Or something happens to their probability waves to make them go from existing at point A, to point B.
edit: More questions. Since photons are massless, they should be unaffected by gravity except in the sense of following the curvature of spacetime. Could the non-zero mass of the neutrino mean that, if you changed the experiment so it fired away from the Earth, the neutrinos would travel "slower" than photons?
Understatement of the century.
Besides, the scientific calculation of speed is based on the assumption that the velocities of transmission of all the colors are the same. Namely, if we call "W" the length of the wave of any given color in the ether, "V" the velocity of transmission of that color, and "N" the number of vibrations or waves of that color per unit of time, then the formula connecting these is W = V/N. But in order to calculate the rates of vibration, science assumes that the various Vs of the different colors are all equal to one another. This assumption is false, and it has been proven that the velocities of red and blue light are different. This is an important phase in human history. If this is true, then the age of the stars has just begun.