Speed-of-light experiments yield baffling result at LHC (bbc.co.uk)
938 points by pmjoyce on Sept 22, 2011 | 239 comments



A news article where both the scientists and the reporter understate the claims, publish the data, and ask other teams to please prove them wrong.

I like this. Most of the time the reporters overstate the research, the scientists keep the data secret, and the general public is left scratching their heads.


This seems to be solid work. First, the group proved that the neutrino speed can be accurately measured in OPERA:

http://arxiv.org/abs/1106.2822

http://iopscience.iop.org/1367-2630/13/5/053051

Second, they compared the neutrino results with light results. Third, they ask to replicate the results in some other environment. This is a normal scientific procedure.

Please also note the number and variety of authors on this work. They will make sure there is no overstating, understating, secrecy, or head scratching.


Phil Plait, the Bad Astronomer, goes into some of the technical challenges the scientists have to overcome in this experiment, in layman's terms.

http://blogs.discovermagazine.com/badastronomy/2011/09/22/fa...


Wow, and he's extremely reserved about the whole thing - usually he's like a cold shower, raining on the press's parade. I like the SN1987A counter-example, though - it does raise a few questions. Not even remotely conclusive, but it makes it more puzzling.


Well, normally the press or the PR people representing the scientists blow things all out of proportion. In this case it seems everyone is being pretty reserved, and the claims, while extraordinary, are already heavily couched in caveats.

I am expecting one of two things to happen:

a. A systematic error in the measurements, which is most likely IMO.

b. A modification of physics, similar to how relativity was a modification of Newtonian physics.


Scientists generally do not keep data secret. They try to publish as much as they can actually! This is how you get nice things like funding and tenure!

Most of the time reporters misunderstand the research and then oversell it in order to get their story read. The general public is usually left scratching their heads because of this bizarre layer of clueless journalists between them and the scientists. Yes, some scientists are hard to understand, but there's no hope in trying to understand particle physics filtered through the soft spongy grey matter of a liberal arts major! Scientists are quite often educators who are used to speaking to a wide range of audiences. Journalists would do far far better to introduce the subject and then let the scientist explain it in their own words. I have no idea why this is not done more.

As for the FTL neutrinos at CERN... I'd hold on to skepticism for now. The reason they're asking for help finding the glitch instead of proudly proclaiming that they've found tachyons (FTL particles) is because this would be a truly bizarre result that would upend a century of physics that has, to date, proved remarkably accurate in its predictions. If this discovery proved true it would literally force us to rewrite physics from ~1900 onwards. Physicists are not afraid of this. Quite the contrary! It would touch off a bonanza of research! There would be so much work to do that the field would literally explode. It's just a lot more probable that there really is a glitch somewhere and we simply haven't found it yet.

This actually would not be the first time that tachyons were "found". A group in the '70s found statistically significant evidence for them preceding cosmic ray showers. They were so sure of their results that they published. Then, a year or two later, they discovered that it was just a glitch in their electronics! These guys were small fry compared to CERN, but CERN's accelerator is so complex that glitches could be very hard to find. The scientists at CERN are clearly out of ideas and mean just what they say: they don't believe their results and they want help finding the glitch that's probably causing them. It's extremely irresponsible to report this story as if FTL particles have really been found before CERN has had a chance to consult with outside scientists and other facilities have had a chance to try reproducing their results.


Scientists generally do not keep data secret. They try to publish as much as they can actually!

Publish != tell the media. Publish != share raw data.


When we publish papers, we usually include sample raw data as part of an online supplement or appendix whenever the journal permits it (most people never look at these). Not all journals do permit this, and storage constraints sometimes make it difficult to provide a worthwhile sample anyways. We do supply raw data to anyone who asks for it. If that counts as keeping our data secret, then so be it.

I have yet to encounter a group in my field that refused to share raw data when asked for it. They're usually happy to have another set of eyes look for a meaning they didn't publish (this could lead to another paper they might not have to work very hard for!). Again, is it really secret if all you have to do is ask for it? Corresponding author email addresses are on papers for a reason!

I will say that it is unusual for a group of scientists as prominent as the ones at CERN are to go to the media before publishing, but they are clearly very unsure of their results and would rather have a lot of eyes looking for bugs before they commit to publishing their results. This, of course, won't accomplish much if they try to keep secrets, as you are suggesting.

In fact, here's a challenge for you. Go ask CERN for their FTL neutrino data. I bet they'll be happy to give you a copy. They'll probably even have a grad-student to spare to explain some of it to you, although his or her time will probably be at a premium for the next few weeks!


That's a fair point, and my comment was a bit lazy. All I wanted to say was that publishing findings, publicising findings, and sharing the raw data aren't exactly the same thing and don't necessarily go together.


Yes, mankind would literally need to rewrite physics! It will be so exciting to be a part of it!

About the '70s group, can you share the link to the article?


am i the only one that has come to love the scientific method simply because it creates more questions than it answers? there is just so much that we do not and cannot understand. the more we learn, the less we know. fantastic.

on the flipside, i would be certain that certainty is a particular kind of hell, if having such knowledge didn't condemn me to it. excuse me while i proceed to consume my tail http://en.wikipedia.org/wiki/Ouroboros


+1 here! Questions are fun. I didn't learn my name till people around me bugged me thousands of times a day with the questions -- 'what is your name, you cute little thing?'.

I so, so dig your 'the more we learn, the less we know' line.


The Scientific Method could be boiled down to:

10: Question Reality

20: Verify Results

30: GOTO 10


let's try without GOTO:

    while anyone_cares:
        improve_model()
    
        if theory_of_everything_appears or strong_ai_appears:
            raise Exception("rethink loop")


10: Question (the currently accepted model of) Reality

20: Verify Results

30: GOTO 10


Line numbers. Awww yeah!


This is generally the reason that in esoteric studies one does not question a master.

In scientific studies, a question is asked and an answer is given. That answer then leads to newer questions, further probing the ultimate. One can approach, but never completely know.

In the Occult, one accepts a maxim or answer as an answer. Questioning hinders or derails progress toward what is desired. I am speaking of the ultimate goal of the study of the esoteric: understanding of self.

(You mentioned an occult icon, thereby opening this line of commentary)


http://www.nature.com/nature/journal/v248/n5443/abs/248028a0...

I forget the source, but they later tested their setup with random input and found it predicted tachyons from that as well.


Most of the time ... the scientists keep the data secret

Um, no. There are plenty of data being released, and in even more cases the scientists are happy to give you the data if you ask. It's very dependent on discipline, though. In astronomy, pretty much all data obtained with publicly funded facilities (e.g. all large space instruments) are accessible by anyone after a proprietary period of 1 year.


There are also releases and updates of genomic data. It stays pretty current as far as I know: http://en.wikipedia.org/wiki/International_HapMap_Project


Maybe not secret then, but to be released at a later date. Or released in one of those journals that most of us cannot access.


No one releases data in journals, at least not in astronomy. There are dedicated sites for this, e.g. http://archive.stsci.edu/

About the later date: I think it's reasonable that those that put together the proposal and thought up the project have a chance at getting their results out first. Otherwise, why would you go through the effort of planning and executing observations if you could just poach someone else's data as it comes down the pipe? Think of it as a 1-year patent on the research project.


The DataCite people (http://datacite.org/) are trying to figure out how to create "data papers" and "data journals" that would let publishing data earn a "peer-review" status that would help people advance towards tenure.


Yeah, this would probably be an important incentive.


There are often cases of the data literally being measured in terabytes or petabytes. Releasing the raw data isn't always feasible.


The mother of one of my friends works with the Zooniverse[1] project, a citizen science effort. She was saying that one of their biggest problems is how to distribute their data to the public. I naïvely suggested BitTorrent, figuring the sets were in the 10s of gigabytes at most. She laughed and said that that wasn't a great way to distribute terabytes of data every few weeks.

[1]: http://www.zooniverse.org/


I also assumed that the storage/bandwidth/computational costs of particle accelerator data sets would make open access practically useless.


Most journals can be accessed by those working professionally in the field.

Some argue this isn't good enough (and I have some sympathy for those arguments). But it's very different to a deliberate action to hide or obscure information.


I can see why people might not like this, but personally I would tend to think this is actually a good thing. Just imagine if climate deniers or intelligent design proponents had open access to every scientific paper they might want - imagine the howling din of useless distraction that would result. More generally, and less politically charged, I think it cuts down on the amount of aimless speculating and ill informed inquiry that an actual expert has to deal with.

That said, it's very true that this is worlds different from maliciously withholding data.


>Or released in one of those journals that most of us cannot access

Go down to your local university library. Go over to one of the computers (or hop on the wifi). Read the article. Email it to yourself. Not that hard now was it?

This is not ideal, but it's not that you _cannot_ access it, it's just not convenient.


Uh... and what if you're in India? Uhh... apply for a visa at your local consulate, hop onto a plane, go down to the library... not very convenient, innit? I'm sorry, but academic data walled up in closed journals (I'm sorry, you live in a cocoon where you think everyone has a local university library) is malicious withholding of information. I don't imply that malice exists on the part of the scientists publishing, but on the part of the larger system. Smug statements that data available only to experts will save the 'wise ones' from useless questions and let them do work are, I think, ridiculous. Anyhow, just my personal opinion.


Interlibrary Loan. There are places in the world that don't have internet either you know.


Indeed. Compare that with the other article about this currently on the first page of HN: http://hosted.ap.org/dynamic/stories/E/EU_BREAKING_LIGHT_SPE...

Which is less cautious and more of the standard fare of not-very-well-reported science.

sigh


At least they didn't use the phrase "perverse and often baffling" in this one.


You've got to admit that, if confirmed, these results would raise new and troubling questions...


Haha, I can't be the only one that got that.


Ah, Gladwell.


That's the article that's at the top of reddit right now. I came to Hacker News to see if there was a better article here since the reddit one didn't seem particularly well written.



Scientists rarely keep the data secret. It's the basis of peer review. There's a reason why open access journals (PLoS One) are getting so popular: scientists want to be known and publish. In my experience, ego is a currency much more valuable than money in academic circles.


In Physics, published raw data is not the basis of peer review. We're not there yet.


> Scientists rarely keep the data secret. It's the basis of peer review.

In a better world, that would be true.

In this world, it's not.

Most of the climate science data that Phil Jones and others have worked with, and most of their models are secret.

It is THIS that makes me keep at least one ear open for the "anti-" side in the climate debate. Real science SHOULD be conducted out in the open.


IPCC AR4 models all listed with lots of information and links (although some now dead): http://www-pcmdi.llnl.gov/ipcc/model_documentation/ipcc_mode...

For example, the three models developed by NASA: http://data.giss.nasa.gov/modelE/ar4/

Direct links to each model + code + datasets:

http://www.giss.nasa.gov/tools/modelE/ - with nightly code snapshots here: http://simplex.giss.nasa.gov/snapshots/

http://aom.giss.nasa.gov/ - with direct access to code here: http://aom.giss.nasa.gov/code.html and here: http://aom.giss.nasa.gov/code4x3.html

http://www.giss.nasa.gov/tools/modelii/ - seemingly work on this has now been merged with the ModelE simulation.

Lots of other code and data is available if you take the time to find it ... and then considerably more time to understand it.


The models are secret... wait, seriously??


Short answer is no, long answer is sort of. The way in which data / code is released is a complex issue. However, if you're willing to wait a reasonable amount of time AND pay for the costs involved, you can get just about everything produced.

Still, in the larger context, validating someone's data is a basically worthless activity. It's far more important to go out and collect new data, build a new model, run a new simulation, and report those results than it is to try and shortcut the process. The data deluge is making it way too easy for people to validate bad science using the same bad data / model as someone else. Fundamental assumptions in statistics break down when you use the same 10 coin flips to substitute for the 10 million coin flips that cost more than you're willing to pay for.

PS: The failure to collect significant quantities of new data is a hallmark of fringe / bad science. If you really think global warming is a joke or that you can be healthier by eating Cocoa Puffs, then go collect that data, do your analysis, and see if someone can poke holes in your process.


Well the model is the thing that tells us what effect a rise in temperature/CO2 would actually have, no? Seems common sense to me that this should be open so that people can judge it. You could program a model to say anything regardless of the data.


When it comes to global warming you have two models, one of which is what happens after a fraction of a second when you have various levels of CO2 in a column of air in daylight or darkness. That's a well understood thermodynamics problem and rarely debated, but you will have little trouble finding out the specific details on this issue.

The larger model is based around how the extra heat retained gets distributed around the world, and everyone knows this is far less accurate. The people building it are well aware it's highly limited, but by trying to balance the errors in both directions they hope to create a good estimate. The problem is that when you start tweaking the heat distribution equations it's easy to find errors in every direction, because the equations are greatly simplified, so it's easy for someone to introduce systematic errors by only correcting issues in one direction.

Which is why you provide a high-level description; when you release your source code you open yourself up to a lot of fruitless debate. Sort of like someone saying that based on your methodology the rocks are really 1/1,000,000th of a year older, which means your science is a joke and less valid than my consistent theory that god created the world 8 weeks ago. But if you actually build a model from scratch under similar assumptions, it's going to have similar outputs and error bars, or look really biased.


After about 20 million requests for their data and models by people who were motivated by god to disprove climate science, they got a little less open.

I don't think they're "secret"; I think they assume anyone they haven't heard of who is asking for the info is a troll.


And it's unfounded statements like this that contribute to ignorance and useless "debate." They do not keep their models "secret," it's just difficult for them to release homebrew software not written by software engineers to the general public and get much use from the exercise without a lot of handholding. Difficult != secret.


No. In this world, it is true that scientists rarely keep the data secret, more often than not, silly conspiracy theorist views notwithstanding.

Business-sponsored science is very often secret, but please don't paint scientists as the root of data secrecy, as the parent commenter disingenuously attempts to do.

There have been now six separate inquiries into the misguided, corporation driven, politically-motivated and frankly dumb charges that climate science data is faulty or is associated with some kind of scientific misconduct. All six separate inquiries indicated it was not:

http://en.wikipedia.org/wiki/Climatic_Research_Unit_email_co...

Moreover, scientific organizations are overwhelmingly making changes that make their data even more accessible:

http://www.thefreelibrary.com/Data+without+the+doubts%3A+the...


I don't know about scientists in general, but at least in climate science it is not true that scientists rarely keep the data secret. You can review an untold number of cases of this on climateaudit.org, and the justifications by the stonewallers are extremely poor. Given the high profile and sweeping consequences of their work, this attitude is giving science a black eye.


Given the high profile and sweeping consequences of their work, this attitude is giving science a black eye.

First of all, "climate science" != "science as a whole".

Second, if any scientific field ends up in the crosshairs of a well-funded politically motivated movement working against it at any cost, then it will become hostile towards skeptics and eventually start stonewalling them. This happened in biology re: critics of evolution (in the anti-evolution camps there are tons of complaints about censorship, refusal to address criticisms, provide data, share models and code, etc.), and whereas it's always possible to address criticisms, the other side is always willing to spend more money to poke new "holes" in the theory, and you'll just never win.

"Skeptics" like to portray the ultimate goal of science as if it's to continually try to poke holes in existing theories, because progress is made when theories are disproven. But that's not science. Science is when you mix a desire to poke holes in theories with a reasonable threshold of acceptance, a willingness to step back and say "Okay, we've spent a lot of energy coming up with arguments as to why this is wrong, and they've been addressed satisfactorily, so this is probably right, let's see if this takes us anywhere new rather than getting stuck arguing the point".

Climate skeptics lack the willingness, even in theory, to accept that enough is enough - I don't believe for one moment that there is, even in principle, any piece of evidence that would lessen the amount of criticism against AGW. I don't believe they would give in because we've already seen that no amount of evidence reduced the amount of anti-evolution noise, and it's by and large the same sources of funding and political motivation that's behind both movements.

Now, it very well may be the case that climate scientists also lack the willingness to admit that they're wrong (actually, I think this is pretty likely), in which case they're not doing science, either, and that's bad. I'm totally in agreement there.

But they're taking exactly the approach to the skeptics that they would (and arguably, should) be taking if they were doing proper science. So the way they deal with skeptics is not a point of evidence either for or against the honesty of their scientific investigations, it's merely an inevitable result of the politicization of their field.



> He is the proprietor of the Climate Audit blog (or, more accurately, "Climate Fraudit")

If you can't make an argument without resorting to name calling in the first line, please excuse me while I disregard anything at all you have to say. kthxbye.


Maybe they do that because of the number of mistakes and basic misunderstandings of the science involved? If you don't like the name calling just look at the actions of the man, as outlined in the article. He is not a credible commentator.


The vast majority of the world's petroleum geologists keep their data secret.


But are they "doing science" or just doing a job?


Presumably, they are in private employment?


In related news most gold prospectors keep their gold strikes secret until they lodge the claim....


Are they concerned with the possibility of data falsification?

This has happened in other fields.


Well, it is one of the most surprising results in the past century. It's not like this is an "x cures cancer/AIDS/the common cold/obesity" which reporters love to overstate, and which can easily be overstated because of how vague things like "obesity" and "cures" can be.

In this case, it's "this fact that we've believed to be completely fundamental to physics for the past century appears to be wrong," and if you stated that without asking for people to check your work carefully because you are puzzled too, people are going to laugh at you if it turns out that you were wrong somehow. I'm not sure how you really could report this without downplaying it; it's just too incredible a claim for people to believe if you did anything close to overstating it.


It's called impartial news, possible only as a public service.


> impartial news

Inherently impossible, a troubling myth that is made all the more troubling by the fact people believe it.

Impartiality is gone the moment the camera is aimed over here as opposed to over there.


The beauty of state-funded news, no?


> The beauty of state-funded news, no?

Until it steps on the state's toes.


The BBC is indirectly funded, but not controlled by the UK government. It's possible that the government could threaten to cut the BBC's budget by reducing the cost of the TV license, or not raising it along with inflation, unless it changed its news coverage. However, a tactic like this would go over about as well as a proposition to repeal the first amendment would in the United States.


> The BBC is indirectly funded, but not controlled by the UK government.

I doubt funding can ever be divorced from control, even if it is indirect. The BBC knows where its next meal comes from.


So if you were the UK government, and you knew the BBC was going to publish something you didn't want them to, how exactly would you get them to stop?


I'm Australian, and we have the ABC. So, assuming you're British, we're coming from the same place.

That said, the GP does have a point. The only thing preventing the possibility of closed-door deals going on between the BBC and political parties is good faith. Ergo, even if it's not already happening or hasn't already happened (which is unlikely), there's no reason it can't happen in future.

Note: despite Australia's high exposure to UK content, I don't know much about the British media landscape. I haven't read the BBC's charter (nor the ABC's, for that matter).

Edit:

It's a tradeoff of different goals. On one hand, you get a news agency which can focus on news - not entertainment, as would be dictated by a commitment to profit. On the other hand, that news agency is more susceptible to decisions by government. Consider: what determined the BBC's spending cuts?

I think publicly-owned news organisations are a great idea. The ABC, BBC, and NPR usually provide great news coverage. They have no need to sensationalize content, and that serves the public to no end.


> The only thing preventing the possibility of closed-door deals going on between the BBC and political parties is good faith. Ergo, even if it's not already happening or hasn't already happened (which is unlikely), there's no reason it can't happen in future.

Political parties could equally make backroom deals with private media companies as well. It would also be considerably less risky, because they wouldn't have to violate the Royal Charter.

The problem with backroom deals concerning the license fee is that they have to remain secret, yet affect how the news is reported without drawing suspicion. Any control would have to be very subtle.


Ensure a reduction in TV license fees, perhaps, or schedule a big, expensive inquiry into whether the BBC is systematically biased in its reporting. After all, the people pay for it so the people have a right to know, regardless of how many millions of pounds it costs the BBC.


Reducing the TV license fee at the same time the BBC is going to report something unfavourable to the current government might seem just a little suspicious, don't you think?

A government inquiry into the BBC's reporting would be even worse, as it would be a clear violation of the BBC's independence.


> Reducing the TV license fee at the same time the BBC is going to report something unfavourable to the current government might seem just a little suspicious, don't you think?

That's why you don't reduce it right away. You threaten to reduce it if they're impolitic enough to pursue certain courses. You make this fairly clear to them. The actual decisions come from the BBC's own leadership, acting on such knowledge.

> An government inquiry into the BBC's reporting would be even worse, as it would be a clear violation of the BBC's independence.

Freedom's just another word for nothing left to lose. The only independence the BBC can have under its funding model is if it decided to risk going downhill and no longer being able to do the good things it's doing now. A reduction in license fees would accomplish that, meaning if you can wave that stick around the BBC will waive its independence and allow investigations.


> You threaten to reduce it if they're impolitic enough to pursue certain courses. You make this fairly clear to them.

And the next day there's a headlining report on government corruption.

I'm not quite seeing how the government can threaten the BBC, when they have so much to lose if those threats become public.


If this result is valid then it would mean that neutrinos are capable of traveling .0025% faster than light. There was a supernova that we observed in 1987 (SN 1987A) that occurred 168,000 light years away from us. A neutrino burst was observed in 3 different labs about 3 hours before the light was observed. If neutrinos are capable of traveling .0025% faster than the speed of light, then why would we observe these neutrinos at a time that is consistent with them traveling at almost exactly the speed of light (slightly less, due to their finite mass)? A difference of .0025% would correspond to the neutrinos arriving 4 years earlier! This is the first experimental contradiction to this result that pops into my head, but there are probably many more.
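A quick back-of-the-envelope check of that arithmetic (a sketch in Python; the 168,000 ly distance and the 0.0025% excess are the figures above):

    # Sketch: how early would neutrinos 0.0025% faster than light
    # have arrived from SN 1987A, 168,000 light years away?
    distance_ly = 168000.0    # distance to SN 1987A, in light years
    excess = 2.5e-5           # 0.0025% faster than c

    light_travel_years = distance_ly    # light takes 168,000 years
    neutrino_travel_years = distance_ly / (1.0 + excess)
    print(light_travel_years - neutrino_travel_years)   # ~4.2 years, not ~3 hours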

Plain and simple, this is most likely due to a systematic error in their experiment that isn't being properly taken into account. The result would tear apart well established theories that have been tested time and time again in thousands of different ways. Of course that doesn't mean that they're absolutely right but it does mean that any contradictory result has to be initially taken with a grain of salt (kudos to the article writer for doing this). It's easy to mess up a calibration in such a complicated system and 60 nanosecond errors could potentially pop up. It will be interesting to see the results from other labs but I would advise against getting your hopes up for any new physics.


Um... is there any trace of neutrino bursts before that? I'm not sure anyone bothered checking, since such a correlation would have been very hard to justify. A neutrino burst 4 years before a supernova... so what?

Could a physicist clarify whether neutrinos could travel at different speeds in vacuum like any 'normal' particle, or are 'fixed speed' like photons? We could well have had several waves, with the bulk arriving at c-epsilon and no one bothering to check outside the expected time frame.


Well, the point is that there was a neutrino burst coincident with SN87a. So if there was one 4yr earlier, too, then you'd have to explain how the same event produced two bursts of neutrinos with different speeds, while within the burst they apparently had exactly the same speed.

If neutrinos have mass, they can travel at any speed <c depending on their energy, like any other massive particle. Of course, the upper mass limit is quite low, so even with very little energy they will go very fast.

(Edit: In fact, the arrival interval of those neutrinos puts an upper limit on the neutrino mass, because the larger the mass, the more different the speed and the more dispersed the arrival of them at Earth.)
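(For reference, the standard relation behind that edit - a sketch, valid in the ultra-relativistic limit E >> mc^2 - is that a neutrino of mass m and energy E trails light over a baseline L by

    \frac{v}{c} \approx 1 - \frac{m^2 c^4}{2E^2}
    \qquad\Longrightarrow\qquad
    \Delta t \approx \frac{L}{c} \cdot \frac{m^2 c^4}{2E^2}

so a tight arrival spread over 168,000 light years forces m to be tiny.)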


Maybe the neutrinos that appeared 3 hours before the event are from another event altogether (one that occurred even further away, whose light particles haven't even reached us yet)?


Given that the flux goes as 1/r^2, and the sum total of the neutrinos detected in association with 87a was something like 17, a SN in any external galaxy would be undetectable.


I'm no physicist, so I am just guessing.

It could be some phenomenon similar to what we see in emission spectra, where a pure source can emit photons at multiple precise energy levels, except that in the case of particles with mass (apparently neutrinos have mass) this would result in a difference in speed. Not a big enough difference to be noticed over 700 km, but enough to make a 4-year difference over 170 kly.

Of course this is still wild speculation without much background.


Neutrinos released in type II supernova explosions are thermal, so they should have a broad spectrum unless pretty much everything we think we know about SNe is wrong.


I don't think the claimed potential result is that neutrinos always travel exactly 0.0025% faster than c, but that they can - perhaps under very specific and special circumstances. I agree with your skepticism and think we should approach this slowly, but I don't think that your particular example disproves or contradicts the claim.


The result would tear apart well established theories that have been tested time and time again in thousands of different ways.

That's what makes this so exciting!


No, I'll wait until it's replicated in another lab before I get excited.

Failing that, I'll at least wait three years for the "Ohhh yeah, so we checked again and it turns out that [simple explanation X]".

On the off chance that this is a startlingly new breakthrough in physics, I don't lose much by waiting three more years before I believe it.


If you're not looking for something, it's hard to find it.

If you assume nothing travels faster than light and you have a burst of neutrinos a couple of years before an event, you'll look for other explanations or maybe simply ignore the readings. Generally they expect the neutrinos burst a couple of hours before the photons arrive.

Now that it's suspected that a flavor of neutrinos can travel faster than light, maybe there's going to be some data digging to find correlations...

The universe is full of mysteries, and it's highly likely that Einstein's theory, although verified and correct in many cases, is incomplete, not to say totally incorrect in other cases, as Newton's theory was.

Let's not assume anything.


You forget one crucial thing in the experiment. A neutrino reacts with almost nothing. A photon, on the other hand, reacts with almost any damn thing you place in its path - metal? Yes. Mirror? Yes. Wood? Yes. Air? Yes!

I very much doubt they completely vacuumed the path of the neutrino to the same degree as it is in outer space. It is possible that neutrinos travel at the same speed as light in vacuum, but travel much faster when there's other material present in the way.


The speed they got in this experiment was faster than light would travel in a vacuum, according to the abstract.


If the neutrinos were from the same supernova, doesn't the fact that they arrived BEFORE the light still mean they traveled faster than light? Whether it should've been 4 years instead of 3 hours seems beside the point as far as the significance of the recent observation is concerned. Unless I'm missing something here.


The faster-than-light neutrinos got slowed down on their way to us by other faster-than-light neutrinos they encountered on the way?


Those neutrinos could be from anywhere in the universe. We measured them 3 hours before the light of the supernova because we were looking for them at that time.


Right, because those scientists were psychic? Neutrino observatories look for neutrinos continuously. They do not throw away data; if neutrinos are detected, they will be logged. Never in the long history of neutrino observation has there been a spike in detections similar to what was observed with SN 1987a. Moreover, that observation heavily confirms our pre-existing models of neutrino generation and behavior in Type II supernovae.


There was an experiment where an impulse of light came out the other end of a material faster than light could have traveled through vacuum in the same space. The eventual explanation was as follows: imagine that your outgoing bunch of photons looks like this: :::... The first three photons are "invisible" because there are so few of them that they are below the equipment's sensitivity. The second group of six particles is denser, and so they are visible to the equipment. Hence, the light is deemed to have "entered" the material when the second group entered, long after the first one actually did.

Having entered the material, the first group triggered a release of energy from the material, and the second group was partially consumed by the material (energizing it for the next time around). The outgoing bunch looked like this: ...::: And this time it was the first group of photons that was detected. So the apparent speed of the beam was higher than the speed of light. However, that's not because the same particles traveled faster than light, but because the peak energy of the entire bunch shifted forward during travel. If you try hard enough, the light will have "exited" the material before it has "entered".

Similar thing could be happening here.
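A toy version of that bookkeeping (a sketch in Python; the counts, the threshold, and the bin layout are all made up for illustration):

    # Toy model of the pulse-reshaping story above: a detector that only
    # registers the dense part of a pulse. Index 0 arrives first in time.
    threshold = 3

    def first_visible(pulse):
        """Index of the first bin dense enough for the detector to see."""
        return next(i for i, n in enumerate(pulse) if n >= threshold)

    pulse_in = [1, 1, 1, 6, 6, 6]   # sparse leading edge, dense tail
    pulse_out = [6, 6, 6, 1, 1, 1]  # medium shifted the energy forward

    print(first_visible(pulse_in))   # 3: "entry" is timestamped late
    print(first_visible(pulse_out))  # 0: "exit" is timestamped early

The timestamped "arrival" advances by three bins even though no individual photon moved faster than light.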


That is such a well known result in physics that the likelihood of it occurring here is almost zero. I remember first reading about it and the scientists basically said "we broke the speed of light" without looking for further explanations. These guys seem like they generally don't believe that they have broken the light barrier, but they need help figuring out what they actually did do from the broader community. Due to their very humble attitude, I bet they examined the literature and ruled this particular possibility out very early on.

I bet you there is a systematic error, but I don't think it's this one.

My general rule with physics is that if I can think of it, then a real physicist will laugh at its triviality.


What DenisM is describing is not the result of a systematic error. It is not an error at all. 'Something' is travelling faster than light in that experiment. The problem is that the 'something' isn't a physical object.

The common example is the beam of a lighthouse. Suppose a lighthouse revolves once per minute. At one lightyear distance from the lighthouse, the angular velocity of the beam is 2*Pi lightyears per minute, which is much faster than the speed of light. However, this is not a problem: the beam is not a physical entity. There is no single particle actually travelling faster than the speed of light. It's just some construct in our minds to which we assign that velocity.

Something similar is happening in the experiment described by your parent and possibly in the experiment in the linked article.


Perhaps I misinterpreted, but I think DenisM was talking about something completely different. He mentioned that because of a lack of density in the leading photons, the first piece of equipment missed the measurements while the second one picked them up. That would be a problem with the system they're measuring with, not at all to do with treating light as a single entity.


As I interpreted it, the critical part of his explanation is the fact that the higher density part shifted from the back to the front. If that change in density would be due to actual photons shifting that way, then those photons would have traveled faster than light (whereas the entire 'blob', on average, traveled exactly at the speed of light).


Could be explained if the photons leaving the source look like this: :::::..... and arrive like this: :::::.::, where the 'tip' folds back on itself to become visible, still before the original bulk of the photon cluster.


At one lightyear distance from the lighthouse, the angular velocity of the beam is 2*Pi lightyears per minute

Actually, that's the linear velocity. The angular velocity would be 2*pi radians per minute and is independent of distance.
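Plugging in the numbers makes the distinction concrete (a sketch in Python):

    import math

    # Lighthouse beam: angular velocity vs linear speed of the spot.
    omega = 2.0 * math.pi   # rad/min, one revolution per minute
    r = 1.0                 # light years from the lighthouse

    print(omega)        # ~6.28 rad/min, independent of r
    print(omega * r)    # ~6.28 ly/min for the spot, yet nothing physical moves that fast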


I used the word "similar" in the loosest sense, merely to suggest to the audience that "a faster-than-light thing has been observed" does not have to mean that the existing model is broken.


Phase velocity vs group velocity.

But it's pretty easy to check that everything is alright with group velocity. If there is zero neutrino production at t=0 (in the reactor's frame of reference), and then the production jumps up, there can be no doubt about the group velocity.

Bizarre effects relative to superluminal phase velocity only appear when you start generating particles before t=0 and you mess with the medium in between. Then the "superluminal" illusion is created by quanta created before t=0.

TLDR: I find it bizarre that seasoned professionals could be fooled by phase velocity or a similar effect.


Phase velocity vs group velocity.

Yes, thank you. Why did it take me so many words? :)


I'm glad that it did take you so many words. I remember reading about this a while back, but could never remember what it was called, or why the light particles actually didn't break the speed barrier. I really appreciate your layman's explanation, and now, thanks to your parent, I also know what the effect is called.


Think of the wagon wheels, as seen in a western movie, turning back, or moving at bizarre speeds in either direction. It's not a perfect analogy, but it's a related phenomenon.


That's called aliasing.


Well, you gave the layman's explanation. Not everyone is well versed in these specific terms.


I'm not familiar with that experiment, but it doesn't sound to me like information traveled faster than the speed of light, which is really the important part of the speed of "light."


The fun thing is that some physicists have manipulated group velocities to create time lenses.


Similar results seem to have already been obtained in the past

http://en.wikipedia.org/wiki/Neutrino#Speed

QUOTE

In the early 1980s, first measurements of neutrino speed were done using pulsed pion beams (produced by pulsed proton beams hitting a target). The pions decayed producing neutrinos, and the neutrino interactions observed within a time window in a detector at a distance were consistent with the speed of light. This measurement has been repeated using the MINOS detectors, which found the speed of 3 GeV neutrinos to be 1.000051(29) c. While the central value is higher than the speed of light, the uncertainty is great enough that it is very likely that the true velocity is not greater than the speed of light. This measurement set an upper bound on the mass of the muon neutrino of 50 MeV at 99% confidence.

/QUOTE

The value looks awfully like what we have in front of us today, but the uncertainty was too big to investigate further.

EDIT: actual paper: http://arxiv.org/abs/0706.0437


Experimental error seems likely to me once again. 10 ns at c is 3 meters; over 730 km, plus uncertainty in the time of particle generation and decay, plus detector uncertainty... I don't buy it. From the project site: http://operaweb.lngs.infn.it/spip.php?rubrique41

OPERA needs an intense and energetic beam of muon neutrinos traveling a distance of hundreds of kilometers to seek for the appearance of oscillated tau neutrinos. A beam of this type is generated from collisions of accelerated protons with a graphite target after focussing the particles produced (pions and kaons in particular) in the desired direction. The products of their decays, muons and neutrinos, continue to travel in generally the same direction as the parent particle. Muon neutrinos produced in this way at CERN cross the earth crust reaching OPERA after a 732 km journey.
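For scale, here's the arithmetic behind those numbers (a sketch; the 732 km baseline is from the project site, the 60 ns from the article):

    # How big is a 60 ns lead over the CERN -> Gran Sasso baseline?
    c = 299792458.0     # m/s
    baseline = 732e3    # m
    lead = 60e-9        # s, the reported early arrival

    flight_time = baseline / c
    print(flight_time)          # ~2.44e-3 s of total flight time
    print(lead * c)             # ~18 m of equivalent path error
    print(lead / flight_time)   # ~2.5e-5, i.e. the 0.0025% figure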


>I don't buy it

I didn't mean to try to 'sell' anything, though I agree that my comment was needlessly biased here. Revised it accordingly.


On a side note, it's pretty wild seeing history being written in Wikipedia. The BBC site has just been referenced on that part of the Neutrino article.


> ... found the speed of 3 GeV neutrinos to be 1.000051(29) c.

What does the "(29)" in "1.000051(29) c" mean?


I'm not sure, but I think it represents uncertainty over the last digits.

So 1.000051(29) would be equivalent to 1.000051 +- 0.000029.


Yes, that is what it means. :)


That can't be it because the error wouldn't straddle the speed of light. It would put the range at 1.000022c to 1.000080c, both of which exceed c, so they wouldn't have written it off as error.


But that doesn't indicate a hard error bound; this is a statistical matter, after all. Most likely it's just an expected standard deviation.


It indeed is the standard deviation.


A single standard deviation is not enough; normal procedure is to allow up to three standard deviations (or more, sometimes five), which puts the measurement in a range consistent with the models.
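For the MINOS number quoted above, the excursion comes out well under that bar (a sketch):

    # How many standard deviations above c is the MINOS central value?
    central = 1.000051   # measured speed, in units of c
    sigma = 0.000029     # one standard deviation, the "(29)"

    print((central - 1.0) / sigma)   # ~1.76 sigma: comfortably within 3 sigma of c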


That means that the last two digits of the number 1.00005129 are uncertain.

For example, they measured 1.00005129c, but maybe their uncertainty is +-0.0000001c. The last digits are unreliable, but are the measurements they got.

EDIT: below, it's claimed that those digits ARE the uncertainty. That quite possibly is it; my answer is from what I remember about uncertainty from my 1st year Physics course.


Those figures are being provided because they were measured, but they are within the expected experimental error for the device.


Repeating. It's 1.000051292929292929...

edit: sorry if that's incorrect in this case. "1.000051(29)" is one of the formats of indicating repeating decimals, per http://en.wikipedia.org/wiki/Repeating_decimal#Notation



This article (http://www.google.com/hostednews/ap/article/ALeqM5in1T5nvGNc...) mentions previous results and that they were thought to be due to measurement error, but claims that these readings showed the neutrinos arriving 60 ns earlier than light would have, with a measurement margin of error of +-10 ns.
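Taken at face value, that is a much larger excursion than the earlier results (a sketch, assuming the quoted +-10 ns margin is one standard deviation):

    # Significance of a 60 ns lead with a 10 ns margin of error.
    lead_ns = 60.0
    margin_ns = 10.0
    print(lead_ns / margin_ns)   # 6 sigma, if the margin is one standard deviation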


I desperately want to believe that this will somehow enable us to live in a Star-Trek future, even though I'm sure it won't.

Would someone with some knowledge of physics care to break down the ramifications of this (if it's not some sort of measurement error)?

Please?


It would mean that the theory of relativity is not 100% correct (in much the same way as Newtonian physics are not 100% correct). It wouldn't have any direct impact on anything (because relativity still makes correct predictions the vast majority of the time), but it would create the opportunity for new theories which can explain everything that relativity does plus this new experiment to be created. That new theory could have significant ramifications, or it could be a minor adjustment to current theory that changes nothing.


In any case, I'd like to submit my resume to the Time Travel Bureau. I am a computer collector and I am very familiar with many of the technologies time travelers from the future may want to procure. ;-)


Sorry, your future self was faster and already got the job. He also claimed to have a larger accumulated experience :-)


Why would they procure them from you rather than go get a fresh new one from the day it was made?


I'm not selling the devices - I'm selling the expertise ;-) How do you know the best time and place to buy, say, a Mattel Aquarius or an A-series Unisys mainframe on an ISA board?


Mr. Titor, is that you?


The Star Trek future, i.e. travel faster than light, will most likely happen through a "warp drive", i.e. using space expansion (contraction) - this is how natural faster-than-light movement happens in the current Universe, and it doesn't violate Special Relativity.

Special Relativity holds only for space with a static and continuous metric (to preview the complexity of the theory otherwise, try to play with Maxwell's equations where integration happens over non-static volumes/surfaces :)). The space expansion doesn't fit the premises of Special Relativity and thus there is no contradiction. Special Relativity is a pretty artificial theory which applies to the true world of our expanding (i.e. non-static metric) Universe only by approximation - this approximation works only as long as the speed of expansion is slow and either the time period under consideration is small or distances are small (but not very small, as the space-continuity premise of SR seems to break down as we get to Planck distances).

In this particular case of the neutrinos, I think, if they integrate Earth's gravitational time-dilation variations along the neutrinos' path, it would be a correction on the scale of a billionth of a second.


Correction, my bad: the estimated time-dilation variation integral along the neutrinos' path is several orders of magnitude smaller than a billionth of a second.


>I think, if they integrate Earth's gravitational time-dilation variations along the neutrinos' path.

Isn't that effect in the reference frame of the neutrinos and not the measuring instrument (I'm not too sure about this)? And also, wouldn't it mean that the neutrinos take more time instead of less time?


Haha, something like this isn't necessary to live in a star-trek future.

Consider what we know about the Universe but can't fully explain so far. We know that space-time is dominated by a "dark energy" that may be due to background quantum vibrations or some as yet completely unknown phenomenon. We know that most of the mass of the Universe is bound up in dark matter which at present seems most likely to be a weakly interacting massive particle, specifically neutralinos (a supersymmetric particle). Neutralinos are thought to be their own anti-particles, they could have been responsible for bizarre "dark stars" in the early Universe and for an ongoing gamma ray flux due to neutralino-neutralino annihilations. Moreover, most of the dark matter in the Universe would essentially be the cosmic neutralino background, an echo from the big bang analogous to the cosmic neutrino background and cosmic microwave background. There is already tantalizing experimental evidence that such dark matter particles exist, from multiple sources.

Additionally, recent evidence seems to hint strongly at the existence of strange matter. Unlike ordinary quark-based matter such as protons, neutrons, and other baryons and hadrons, strange matter would consist of up, down, and strange quarks in a more-or-less liquid-like state, instead of individual quarks being bound up in 2- or 3-quark composite particles. It is possible that some of the so-called "neutron stars" out there are in fact strange-matter stars. It's also possible that such matter is stable in effectively arbitrarily sized fragments, down to individual tons, grams, or even atomic-nucleus scales. Who knows what kind of post-nuclear physics we could perform if we got our hands on such particles. One possibility would be to use them to create energy via fusion. Tossing particles at a strangelet with sufficient speed should cause them to dissolve and fuse with the strangelet, releasing more energy in the process than conventional fusion.

And that hardly touches the surface. There are even more bizarre things out there like micro black holes, the matter vs anti-matter chirality, the possibility that superconductors serve as mirrors for gravitational waves, and so much more. There's plenty of star trek out there waiting for us.


It wouldn't break current theory; it would just mean that photons travel slower than the "speed of light" and have non-zero rest mass. The constant c in relativity would then mean the fastest speed possible, rather than the speed of photons.


I have a comment about this. C, as it is used in general relativity, is involved in a lot more than just the speed that light travels. It is also relevant to a lot of other equations, like time dilation in a gravitational field. Now, if this experiment resulted in changing our concept of c from the "speed of light" to the "speed of neutrinos", then you're probably right. But I have to imagine that c has been verified experimentally in non-light-related experiments.

For example, there is a certain speed where if you exceed it you are able to violate causal time relationships. I can't think of any experiments that would validate this. However, there is also the fact that, theoretically, if you attempt to accelerate matter to the speed of light its mass will increase infinitely. So if you accelerate it a little bit its mass should increase a little bit, and you should be able to confirm the speed of light through an experiment where you measure infinitesimal increases in mass during large acceleration.
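That mass-increase experiment amounts to measuring the Lorentz factor, which is tiny at everyday speeds (a sketch in Python, with hypothetical speeds):

    import math

    def gamma(beta):
        """Lorentz factor for speed v = beta * c."""
        return 1.0 / math.sqrt(1.0 - beta ** 2)

    print(gamma(0.001) - 1.0)   # ~5.0e-7 fractional mass increase at 0.1% of c
    print(gamma(0.999))         # ~22.4, growing without bound as v -> c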

So my comment is that if he just broke the speed at which light travels, then everything is fine. But if he broke the speed at which you are able to violate causality, or the speed at which the mass of an object is infinite, then our entire understanding of physics is likely to be invalid.

Related reading - tachyon pistols

http://sheol.org/throopw/tachyon-pistols.html


Not really. The reason for fixing c as the speed limit is that the number arises naturally as the speed of EM waves from Maxwell's equations. Briefly put, since these equations are valid in all frames, the Galilean speed-addition rule has to be wrong and c must be the same in all inertial frames.

So you can't just use c for "the highest speed any particle can have in vacuum".
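Concretely, the number falls out of the vacuum constants (a sketch): c = 1/sqrt(mu0 * epsilon0).

    import math

    mu0 = 4e-7 * math.pi          # vacuum permeability, H/m
    epsilon0 = 8.8541878128e-12   # vacuum permittivity, F/m

    print(1.0 / math.sqrt(mu0 * epsilon0))   # ~2.998e8 m/s, the speed of EM waves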


No.

Non-zero rest mass photons will break a lot of theories.

Plus, the "speed of light" is not merely an experimental result coming out of an interferometer. It's also a theoretical result, e.g. from Maxwell's equations - that's the one referred to by special relativity.


An alternative is that neutrinos and photons travel a different distance because there are extra dimensions that affect the two types of particles differently. These are the so-called space-time foam models.


Don't forget... c is the speed of light 'in a vacuum'. Light travels slower through gas, water or glass. Light has even been slowed down to walking speed in a laboratory. The speed of light is not constant. The speed of light in a vacuum is. We assume.


> Light travels slower through gas, water or glass.

There is a good description of what is going on in this Stack Exchange post:

http://physics.stackexchange.com/questions/13738/propagation...

It explains why saying "c is the speed of light" makes sense, because when we say light is traveling more "slowly" through a material, we are including the time spent interacting with the material, being absorbed and re-emitted.

I'm bristling a little at your statement that "the speed of light is not constant". Imagine two men walking at the same speed from A to B. But one of them is walking in a straight line, while the other is zig zagging. It would be fair to say that the one walking in a straight line is travelling from A to B faster, even though they are both moving at the same speed. The speed of light is a constant, it is just that light travelling through a medium doesn't necessarily spend all of its time travelling in one direction.


>>The speed of light is not constant

I think this is wrong. Regardless of the medium, the speed of light is always constant. It seems to slow down because the photons are getting absorbed and re-transmitted by atoms. But the speed of light is always the same regardless.


After much research... the concept is correct, although 'absorbed' and 'retransmitted' are not the right words to use. http://en.wikipedia.org/wiki/Slow_light

"Light traveling within a medium is no longer a disturbance solely of the electromagnetic field, but rather a disturbance of the field and the positions and velocities of the charged particles (electrons) within the material. The motion of the electrons is determined by the field (due to the Lorentz force) but the field is determined by the positions and velocities of the electrons (due to Gauss' law and Ampere's law). The behavior of a disturbance of this combined electromagnetic-charge density field (i.e. light) is still determined by Maxwell's equations, but the solutions are complicated due to the intimate link between the medium and the field. Understanding the behavior of light in a material is simplified by limiting the types of disturbances studied to sinusoidal functions of time. For these types of disturbances Maxwell's equations transform into algebraic equations and are easily solved. These special disturbances propagate through a material at a speed slower than c called the phase velocity."

As another commenter pointed out, you're splitting hairs. The "speed" of light and how fast it is "moving" depend on how you define those terms. The second paragraph of the Wikipedia speed-of-light article has it right: "The speed at which light propagates through transparent materials" - which does change.


That's splitting hairs in a way that makes you deviate from standard usage of the term. Physicists say things like "the speed of light in water is lower than the speed of light in a vacuum".


But to a lay person that statement does not mean the same thing as it does to a physicist. To a lay person it sounds as if the photons literally slow down. And I bet that a lot of people repeat this statement thinking that light actually slows down.


No, it's correct. The light actually gets slowed down in a material (anything other than a vacuum). That has nothing to do with observation; it's an actual physical effect. The speed of light in a vacuum is constant, and so is the speed of light in any particular pure material (like a pure gas). It's just that those constant speeds are different.


I say you are incorrect. Read this: http://physlink.com/Education/AskExperts/ae509.cfm

Here is a relevant piece:

When light enters a material, photons are absorbed by the atoms in that material, increasing the energy of the atom. The atom will then lose energy after some tiny fraction of time, emitting a photon in the process. This photon, which is identical to the first, travels at the speed of light until it is absorbed by another atom and the process repeats. The delay between the time that the atom absorbs the photon and the excited atom releases a photon causes it to appear that light is slowing down.


> The delay between the time that the atom absorbs the photon and the excited atom releases a photon causes it to appear that light is slowing down.

If the photon covers less distance D over the same amount of time T, I am OK with saying the velocity is lower, and that it has slowed down.


"Well, actually, no, officer, I wasn't speeding. You see, while you clocked me at 90mph [c] between toll booths [atoms], once you factor in time at the booth, you'll see that I am actually driving much more slowly."


But equating decrease in velocity with "slowing down" would be confusing for most laypeople, at least.

Nobody would say that they slowed down if they increased their speed as they went through a turn.


"Material" is made of smaller things, which I think the GP is getting at. The actual photons that travel from electron to electron and such don't get slowed down; they effectively travel through a vacuum that is the tiny spaces inside molecules and atoms.


Isn't the speed at which photons travel, by definition, the speed of light?

And as for the "c" in e=mc^2, doesn't this suddenly make "c" an unknown constant? Doesn't the fact that "c" changes suddenly change the values of the other variables in that equation as well? That seems pretty fundamental to me...


As I understand it, Einstein's work rests on there being a fundamental maximum 'speed' and it seemed to him as though the speed of photons was that limit, so 'speed of light' became synonymous with this maximum. But it doesn't necessarily have to be so.

So if there's something faster, it changes our understanding of photons but not the existence of this fundamental maximum speed.

As you note, our efforts to measure c may have been off due to measuring the wrong thing, but I don't know the ramifications of a small % change in c.

(I'm not a physicist)


Yes, it is -the- basic assumption of special relativity. And no, it doesn't just change our understanding of photons; it changes everything, since it's all connected. All theories I've looked at so far (at least in their more advanced versions) incorporate relativity.

(Another basic assumption, this time for general relativity, is the equality of inertial and gravitational mass, which is not a self-evident thing. However, so far no difference has been found; see http://en.wikipedia.org/wiki/E%C3%B6tv%C3%B6s_experiment)


I think what he's getting at is that there's this value "c" that's really important to physics appearing in equations like e=mc^2 and determining the absolute upper bound on speed, and by the way, we used to assume that photons traveled at c, rather than their actual rate of 99.9975% of c.

I don't know whether changing c by this amount would break many experimental results. Adding a rest mass to photons sounds potentially revolutionary.


Einstein based his theory on the maximum speed at which information can propagate. That's always been assumed to be the speed of light (photons). It may be possible that something else can propagate information faster (e.g. neutrinos). There would still be an upper speed limit, but it wouldn't be the one we thought it was :)


Isn't the speed at which photons travel, by definition, the speed of light?

Photons speed up and slow down routinely, depending on what medium they're traveling through. c, as it is used in the equations of relativity, is currently believed to be equal to the speed of light in a vacuum. But, with my limited knowledge of GR, my understanding is that gaika is correct and that the rest of the theory can still stand if this equality is broken.


This isn't technically correct. Photons always travel the same speed but in certain materials they are absorbed and emitted by atoms, causing their apparent speed to slow down.

A photon's instantaneous speed is always the speed of light.


>>Photons speed up and slow down routinely, depending on what medium they're traveling through

Do they really? As far as I know their speed is always constant in any medium. They just seem to slow down because they get absorbed and re-transmitted; that is where the apparent loss of velocity comes from. When traveling between one atom and another, which is a vacuum, they are always traveling at the speed of light.


It wouldn't break current theory for photons to have nonzero rest mass?


Some days ago I saw some people here saying that billions of dollars were wasted on the LHC. I would like to know their opinion about these new findings.

The post in question http://news.ycombinator.com/item?id=2943950 .


This wasn't the LHC. It was a non-LHC experiment at CERN.

(I agree with your sentiment, however.)


Agreeing still with the sentiment. However, if this is correct, the data from the LHC will constrain the new theories, which will be very important.


I would personally want more money going into scientific research (NASA, CERN, Fermilab, and others) than into war and supplying ammunition to soldiers and whatnot.


Just for reference, Einstein's Special Theory of Relativity does not strictly prohibit the existence of particles that travel faster than the speed of light; it only prohibits accelerating an object in an inertial frame to the speed of light:

As per Wikipedia ( http://en.wikipedia.org/wiki/Faster-than-light )

"In special relativity, while it is impossible in an inertial frame to accelerate an object to the speed of light, or for a massive object to move at the speed of light, it is not impossible for an object to exist which always moves faster than light."


True, but such a particle would allow messages to be sent backwards through time (including, presumably, the message "Hey, don't send this message"), which leads to potentially nasty causality violations. Great Scott!


Is it the BBC's policy not to capitalize acronyms? It's CERN, not Cern, and they did it multiple times.


Discussion by people who actually know stuff: http://blog.vixra.org/2011/09/19/can-neutrinos-be-superlumin...


Superluminal Travel is a sexier name.


If I calculated correctly, that's only about .0025% faster than c. Practical implications?


> Practical implications?

Print out 90% of physics articles from the past century. Get a big trash can. Put the printout in the trash can.

Of course I'm exaggerating, but if the story is true, this is going to be big.

The speed of light is the main blocker for things like time travel, and a hard limit on communication speed.

It doesn't matter that it is only 0.0025%; what matters is that it can be done (if it is confirmed). As a (very poor) analogy, the first CPUs were probably about 0.0025% the speed of modern ones, yet look where we are today, in no small part thanks to them. The point is, if true, this may well open a whole new realm in physics, and who knows what we will find there.


This is it in a nutshell. While the ramifications are hard to predict at this moment, they are huge. E=mc^2 probably won't affect the practical physics we deal with on a daily basis, but as far as theoretical physics goes, the impact is enormous: warping of the space-time fabric; wormholes (their existence, creation, and artificial stabilization); how fast the universe is speeding apart; our calculations for where things are in our solar system and beyond; and a whole host of other things physics relies on the formula to calculate.

What I am most excited about is the possibility of traveling faster than the speed of light. Conventional theory dictates that the closer you approach the speed of light, the more energy you need, and that energy effectively adds to your mass, so breaking the light-speed barrier would require infinite fuel; a perpetual null loop. But if that equation changes, and we know there are particles that travel faster than the light-speed limit, we may have to re-examine this theorem. Especially with new power sources being discovered at the atomic and quantum levels: the splitting of those bonds, if harnessed, yields promising potential. Not to mention the existence of antimatter, which releases catastrophic amounts of energy when in contact with matter. We can't seem to find much of it right now, so we're limited in how much we can make, which is a minuscule amount; the limits on production seem to be at the physics level as opposed to a technological one. But who knows: if the speed of light is up for discussion, almost anything can be on the table.


Excellent question. It doesn't seem like "much faster", but if the speed of light isn't the limit, what is the limit?

All this reminds me of the beginning of Mostly Harmless: "One of the problems has to do with the speed of light and the difficulties involved in trying to exceed it. You can't. Nothing travels faster than the speed of light with the possible exception of bad news, which obeys its own special laws. The Hingefreel people of Arkintoofle Minor did try to build spaceships that were powered by bad news but they didn't work particularly well and were so extremely unwelcome whenever they arrived anywhere that there wasn't really any point in being there."

I doubt the experiment will be repeated, but it sure would be awesome if it broke Einstein.


And if there is no limit? (e.g. just practical limits, such as energy availability for our current technology/knowledge)


It should be impossible to accelerate a neutrino to 100% of the speed of light. It would take an infinite amount of energy. So getting it to go even faster is really impressive :)


> Practical implications?

Using ion propulsion and gravitationally assisted slingshot trajectories, you'd get to Proxima Centauri about 173 days 12 hours sooner, which sounds pretty cool if you ignore that that's 173 days of a 19,000 year trip.

By nuclear pulse propulsion (EDIT: invented, but still theoretical, thanks adrianN) taking 85 years, you'd get there about 18 hours sooner.
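A rough first-order check of both numbers, for anyone following along (the 0.0025% excess is the figure quoted elsewhere in this thread; trip durations as stated above):

  # Time saved over a trip of T years if top speed rises by a fraction eps:
  # to first order, saved ~= T * eps (exact: T * eps / (1 + eps)).
  eps = 0.0025 / 100  # the 0.0025% excess reported in this thread

  for label, years in [("ion + slingshot", 19000), ("nuclear pulse", 85)]:
      saved_days = years * 365.25 * eps
      print("%s: ~%.1f days sooner" % (label, saved_days))
  # prints ~173.5 days and ~0.8 days (about 18-19 hours), matching the parent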


Nuclear pulse propulsion has actually been invented; we just haven't built it yet.


I'm not smart enough to explain it properly, but basically, if information can be transmitted faster than the speed of light, it makes time travel possible. Read "Black Holes and Time Warps" by Kip S. Thorne for a 624-page explanation of why.

More likely, it's experimental error.

.0025% is meaningless compared to the scale of the earth, but over hundreds of light years it's huge.


http://en.wikipedia.org/wiki/Time_travel#Via_faster-than-lig...

If we can send information even a little to the past, we can then send it a little more to the past, and by induction we can send information anywhere in time.

So we have infinite power computers. Singularity starts here.


If you can send data a little into the past, compute something, then send the result back to the beginning, you can do an infinite amount of computation in a finite amount of time, just by reusing the same time over and over again. https://secure.wikimedia.org/wikipedia/en/wiki/Closed_timeli...

edit: OK, this blows my mind (from https://secure.wikimedia.org/wikipedia/en/wiki/Tachyon#Speed):

It has been argued that we can avoid the notion of tachyons traveling into the past using the Feinberg reinterpretation principle which states that a negative-energy tachyon sent back in time in an attempt to challenge forward temporal causality can always be reinterpreted as a positive-energy tachyon traveling forward in time. This is because observers cannot distinguish between the emission and absorption of tachyons. For a tachyon, there is no distinction between the processes of emission and absorption, because there always exists a sub-light speed reference frame shift that alters the temporal direction of the tachyon's world-line, which is not true for bradyons or luxons. The attempt to detect a tachyon from the future (and challenge forward causality) can actually create the same tachyon and sends it forward in time (which is itself a causal event).


Your mind isn't properly blown until you read the next paragraph which ends with the thought: "Although remote, the possibility of backward causality is not a real challenge to the principle of causality, but rather a novel way of understanding an additional aspect of it."



I believe you did calculate correctly, as I got .0024619%.


Headline is wrong - this experiment has nothing to do with the LHC



The paper is now up: http://arxiv.org/abs/1109.4897


Considering that neutrinos can pass through a light year of solid lead unimpeded[0], they must face a lot of challenges in determining when a neutrino has been generated and when they are finally able to detect one arriving at its destination.

[0] http://hyperphysics.phy-astr.gsu.edu/hbase/particles/neutrin...


Wow..... this is capital news! There is a flaw in our whole comprehension of physics and our basic understanding of everything that surrounds us. There might be something faster than the fastest thing we have ever imagined and based our calculations on.

Yet we see news like "Kardashian hubby's bad first impression" on the front page of sites like Yahoo "news". Disappointing...


No, no capital news.

Almost certainly this is a measurement error or some other mistake. If a couple of years from now nobody found an error and a bunch of independent groups reproduced the results, then it's time for capital news!


Bearing in mind 'spooky action at a distance', shouldn't we be expecting some kind of mechanism, some particle accounting behind the scenes, that might appear as objects or information moving faster than the speed of light?


Assuming you're talking about entanglement (an effect of quantum mechanics), then no. Since QM describes the behaviour of particles (such as the 'faster than light' neutrinos in this experiment), you can't use the behaviour of particles to describe QM.


Old CERN publication from 1998 suggesting that neutrinos could travel faster than c http://cdsweb.cern.ch/record/340078/files/9712265.pdf


"...the researchers noticed that the particles showed up a few billionths of a second sooner..."

I am impressed that their measuring instruments have a precision that is (statistically significantly) smaller than a billionth of a second.


How do you think gigabit ethernet works?


I don't have the specs, but 100Mbit operates at 31MHz. Remember, it's quad twisted pair, so 1Gbit likely operates at ~300MHz.

Now, 10Gbit probably does operate with a period of less than 1ns (one billionth of a second).


Over Cat5e, it uses all 4 pairs and 2 bits per symbol, so it's only 125M symbols per second (per pair). I was thinking of fiber, which I think also uses 2 bits per symbol so it would need 500M symbols per second.
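The arithmetic, for anyone checking along (a quick sketch; the pair and symbol counts are as stated above):

  # 1000BASE-T: 1 Gbit/s split across 4 pairs, 2 bits per symbol per pair.
  bits_per_s = 1e9
  pairs = 4
  bits_per_symbol = 2

  symbol_rate = bits_per_s / (pairs * bits_per_symbol)  # per pair
  print(symbol_rate)      # 125000000.0 -> 125 Msymbols/s
  print(1 / symbol_rate)  # 8e-09 -> an 8 ns symbol period, well above 1 ns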


The most striking thing that came across to me in the report is that the time difference is 60ns or thereabouts!

That's a whole lot of time delay, not a "tiny fraction" as the reports say. Light travels about 18m (~60 feet) in that time (quick check below), and a modern CPU will have processed more than 100 instructions in that interval. So you don't actually need a CERN-quality clock to measure it.

So if this turns out to be due to systematic errors of various kinds, I'm wondering what other measurements from the lab will be cast into doubt as a consequence!

Exciting result, and possibly exciting times ahead for physics.
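The quick check, for what it's worth (60ns is the delay from the report; the result is just c times the delay):

  # Distance light covers in the reported 60 ns discrepancy.
  C = 299792458.0   # speed of light in a vacuum, m/s
  dt = 60e-9        # 60 nanoseconds

  print(C * dt)            # ~17.99 m, i.e. roughly 18 m
  print(C * dt * 3.28084)  # ~59.0 ft, i.e. roughly 60 ft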


This seems to be somewhat related: http://en.wikipedia.org/wiki/Varying_speed_of_light

"The variable speed of light (VSL) concept states that the speed of light in a vacuum, usually denoted by c, may not be constant in most cases."

"In 1998, Magueijo teamed with Andreas Albrecht to work on the varying speed of light (VSL) theory of cosmology, which proposes that the speed of light was much higher in the early universe, of 60 orders of magnitude faster than its present value."


But for now, he explained, "we are not claiming things, we want just to be helped by the community in understanding our crazy result - because it is crazy".

Is it really crazier than nothing being able to travel faster than light? It sure as hell isn't crazier than particles flitting into and out of existence at the subatomic level all the time. The fact is that most of the things science has uncovered, particularly in the last 100 years, have been nothing short of fantastical. And for this, dear researchers, I thank you.


Ok. Call off SETI until we master neutrinos for communication.

They pass through matter as if it weren't there AND could be faster than light? Aliens must have been stupid to use radio waves.


Taking a different point of view, perhaps there's nothing superluminal about this: what if the neutrinos are indeed travelling at 'true' c, whereas light is 'tiring out' over the same distance? (http://en.wikipedia.org/wiki/Tired_light)

The idea of cosmological redshifts would need revision, and thus our understanding of the universe. But at least c remains unassailable...


Why does the link title claim that the result is at the LHC? The LHC is not involved in this story at all.

Edit: other than the obvious reason of "linkbait", I mean.


Well, the LHC is at CERN and this happened at CERN.


Tesla postulated that there are faster speeds than light-speed... so he might have been right after all.

Or maybe he was just thinking of ludicrous speed.


Advancing the Spaceballs empire! It might be happening just now!!!


Lol, maybe the speed limit is set by the neutrinos, not by the light... for now... It will be interesting to see how this develops.


The research preprint is now available: http://arxiv.org/abs/1109.4897


How does one "fire a bean from a particle accelerator near Geneva to a lab 454 miles away in Italy"??


Using neutrinos, which rarely interact.


I wonder how accurate their time measurements are. If nothing can exceed the speed of light, what makes them think they can rig together hardware that will accurately measure speed at that magnitude?


Light is not that fast. In the time it takes for a photon to travel between your computer screen and your eyes, modern CPUs will be able to execute many instructions.
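For instance (a toy calculation; the 0.5m viewing distance and 3GHz clock are illustrative assumptions, not measurements):

  # How long a photon takes to cross ~0.5 m, in CPU clock terms.
  C = 299792458.0  # m/s
  distance = 0.5   # screen-to-eye distance, an assumed figure
  clock_hz = 3e9   # an assumed 3 GHz CPU

  t = distance / C
  print(t * 1e9)       # ~1.67 ns
  print(t * clock_hz)  # ~5 clock cycles; a superscalar core can retire
                       # several instructions per cycle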


I'm very skeptical about this. It seems like every time a "supposed" scientific discovery gets published in the media before a peer-reviewed journal, it's almost always wrong. The same thing happened with cold fusion, Ida (aka Darwinius), the supposed "missing link" between human and ape, and NASA's "supposed" discovery of a bacterium growing on arsenic instead of phosphorus.

It's always disheartening when science gets reported to the media at such an early stage of discovery (i.e. the point where it hasn't been criticised and the lead investigator himself says he isn't so sure). It just opens the floodgates for lazy news outlets to run moronic headlines like: "Roll over Einstein: Law of physics challenged"


This wasn't characterized as a discovery, either by the reporter or the scientists involved. But it is really interesting. Either you didn't RTFA or I'm not sure what you're complaining about.


I agree; the few articles that I have read about this (including the BBC's) clearly mention that this is speculation and a possibility. However, there are probably people who still take it as fact after reading the article. To my mind, the media has been surprisingly careful not to claim this as a new discovery.


I'm extremely ignorant when it comes to physics, but I hope it's for real. Maybe a big science breakthrough would interest people enough to start investing more in education.


As an aside, I just finished reading Walter Isaacson's Einstein biography.

Easily the best biography I have ever read. If you don't like biographies, this will change your mind.


I wonder how they accelerated uncharged particles.


xkcd on this piece of news, spot on as usual:

http://xkcd.com/955/



Keep in mind that, given they said a few billionths of a second, the adjustment to c would be on the order of:

  (5 nanoseconds) / ((732 km) / c) = 0.000204776269 percent

Update: AP says it's 60 nanoseconds, so

  (60 nanoseconds) / ((732 km) / c) = 0.00245731523 percent

Also in different units:

  ((60 nanoseconds) / ((732 km) / c)) * c = 16 479.1646 miles per hour

Interestingly, the earth moves at about 60,000 miles per hour relative to the sun. Could this be explained by frame dragging / gravitomagnetism? This is pure speculation, prompted only by the numbers being within the same order of magnitude.
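A small Python sketch to reproduce those numbers (the 732km baseline and the 5ns/60ns delays are the figures quoted above):

  # Early arrival as a fraction of the light travel time over the baseline.
  C = 299792458.0   # speed of light in a vacuum, m/s
  baseline = 732e3  # CERN -> Gran Sasso, metres (as quoted above)

  t_light = baseline / C    # ~2.44 ms at c
  for dt in (5e-9, 60e-9):  # 5 ns and 60 ns early arrivals
      print("%2.0f ns -> %.9f percent" % (dt * 1e9, 100 * dt / t_light))
  # The 60 ns case as a speed delta, converted from m/s to mph:
  print((60e-9 / t_light) * C * 2.23694)  # ~16479 mph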


As the CERN-to-Gran-Sasso direction runs north-west to south-east, it would be interesting to see the full data: during nighttime the neutrinos would travel in the direction of the Earth's travel around the Sun, and thus be slowed in the CERN/Gran Sasso frame, while during the day they would move in the opposite direction, and thus be boosted in that frame. Could it be that they run experiments only during business hours? :)


Reading around a bit I believe they used GPS to correct for anything like that. Presumably, if there were frame dragging effects they would be measurable via the GPS satellites, although that could also be the source of an error.


This isn't about adjusting c; c will always be c.

The speed of light in a vacuum is a known quantity, and it isn't changing. This is about relativity being void.


I think the suggestion is that we've mis-measured c, and that it should be redefined as the speed these neutrinos are reaching (if it's a constant maximum).


The speed of light under the influence of a gravitational field is not c.


Even if the neutrinos arrived faster than light could travel (that distance), it does not mean the neutrinos traveled that fast, only that far...

For example: when muon-neutrinos transform into tau-neutrinos, in that moment they could "tunnel" through space-time in some fashion that appears as faster-than-light travel. Or something happens to their probability waves to make them go from existing at point A, to point B.


54% in this random ass poll say "No way (e = mc^2)" http://www.wepolls.com/p/2879014/Do-you-think-scientists-at-...


I was under the impression that polls are reserved for use in deciding issues in climate science!


Couldn't it be that the lack of interaction with the electroweak force allows neutrinos to exceed the speed of a photon in a non-vacuum environment? Or is the article saying that the neutrinos exceed c?

edit: More questions: since photons are massless, they should be unaffected by gravity except in the sense of following the curvature of spacetime. Could the non-zero mass of the neutrino mean that, if you changed the experiment to fire away from the earth, the neutrinos would travel "slower" than photons?


The article seems to be saying that the neutrinos appear to have exceeded c. If it were just neutrinos moving faster than photons through some non-vacuum environment, that would be completely non-notable; experiments have already succeeded in slowing light down to almost nothing, after all.


"And of course the consequences can be very serious."

Understatement of the century.


Well, Helena Blavatsky was right after all; science must finally accept not just the existence of matter but of Spirit as well.

Besides, the scientific calculation of speed is based on the assumption that the velocities of transmission of all the colors are the same. Namely, if we call "W" the wavelength of any given color in the ether, "V" the velocity of transmission of that color, and "N" the number of vibrations or waves of that color per unit of time, then the formula connecting these is W = V/N. But in order to calculate the rates of vibration, science assumes that the various Vs of the different colors are all equal to one another. This assumption is false, and it has been proven that the velocities of red and blue light are different. This is an important phase in human history; if this is true, then the age of the Stars has just begun.


wat


Does this mean I have to buy a new textbook? BUT they are so expensive!!!


We're witnessing an increased vibrational rate of photons; it is, for those who are ready, a beginning of a higher density experience.


Good to hear, I've always wanted to be able to gain weight without looking fatter.


I bet Fox News and the Republican Party leadership in general are going to somehow spin this into their anti-climate-change, anti-science, anti-intellectual, pro-religion messaging.


If all climate research was as honest as this it might be harder to put a spin on it.


and so it begins :)



