
Quick math and Google says no:

9,000 mph = 0.00001342 c

The threshold for relativistic effects is 0.01 c, according to this paper:


EDIT: Rereading that paper, they're focused at the particle level, which may or may not make this irrelevant. I know anecdotally that GPS satellites have to take relativity into account. Geostationary satellites move at 1.9 miles per second, which is 6,840 mph, quite a bit lower than our centrifuge. That being said, the precision required for GPS means that very small changes due to relativistic effects have a rather large impact. In short, where there is motion, there is relativity. Is it enough to measure here? Possibly; sticking a small microcontroller in there and having it report its time would be interesting. Is it enough to matter? The answer is relative.
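For a sense of scale, the low-speed approximation γ − 1 ≈ v²/2c² gives the special-relativistic clock lag at centrifuge speed. A minimal sketch (just the figures above, nothing device-specific):

```python
# Back-of-envelope: how much would a clock moving at 9,000 mph lag a
# stationary one? Uses the low-speed approximation gamma - 1 ~ v^2 / (2 c^2).
MPH_TO_MS = 0.44704
c = 299_792_458.0

v = 9_000 * MPH_TO_MS            # ~4,023 m/s
beta = v / c                     # ~1.342e-5, matching the figure above
slowdown = beta**2 / 2           # fractional time dilation
per_day_us = slowdown * 86_400 * 1e6

print(f"v/c = {beta:.3e}")
print(f"clock lag: {per_day_us:.1f} microseconds per day")
```

A few microseconds per day: far beyond a hobby microcontroller's crystal (which drifts by seconds per day), but trivially measurable with an atomic clock.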

I believe that in GPS satellites the relativistic effects are due in greater part to the difference in gravity between the surface of the Earth and the satellite, more so than to special relativity effects.

According to a reference[1] from Wikipedia[2], the effect would be +45 μs/day from general relativity and −7 μs/day from special relativity, combining into a 38 μs/day drift.

So although you are technically correct (45 > 7), both effects are significant.

[1] http://www.astronomy.ohio-state.edu/~pogge/Ast162/Unit5/gps.... [2] https://en.wikipedia.org/wiki/Global_Positioning_System#Hist...
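Those figures are easy to sanity-check with a few lines of Python. This is a sketch under a circular-orbit, weak-field approximation, using standard textbook constants, not a full relativistic treatment:

```python
# Back-of-envelope check of the 45/7/38 microsecond-per-day GPS figures.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0      # speed of light, m/s
R_earth = 6.371e6      # mean Earth radius, m
r_gps = 2.656e7        # GPS orbital radius, m (~20,200 km altitude)
day = 86_400.0         # seconds

v = (GM / r_gps) ** 0.5                                      # orbital speed, ~3.87 km/s
sr_us = (v**2 / (2 * c**2)) * day * 1e6                      # SR: orbiting clock runs slow
gr_us = (GM / c**2) * (1 / R_earth - 1 / r_gps) * day * 1e6  # GR: weaker gravity, clock runs fast

print(f"special relativity: -{sr_us:.1f} us/day")
print(f"general relativity: +{gr_us:.1f} us/day")
print(f"net drift:          +{gr_us - sr_us:.1f} us/day")
```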

Relativity has almost no effect on GPS: at its worst it would cause an accuracy degradation of less than 1 cm, and there are considerably bigger sources of error in GPS than relativity, with various corrections for them. Shadows, echoes, atmospheric effects and the Doppler effect have a considerable effect on GPS accuracy; relativity does not.

Relativity has a massive effect on GPS. If unaccounted for, the error would increase by 11km per day.

EDIT: I might be wrong on the “11km” part. See reply


No it doesn't: http://www.physicsmyths.org.uk/gps.htm. This would only be an issue if you had substantial drift between different satellites; the 38 microsecond delay is irrelevant since all satellites in the constellation have a similar delay.

It's also less relevant when solving with the more common method, which is time-tagging at the receiver. Relativity might be a bigger pain if you are doing time-tagging at the transmitter, but since the clocks of different satellites are not synchronized and can differ by as much as 1 ms, that delta would be more of an issue than relativity.

If you are using normal GPS accuracy, which is about 15 m, you don't need to apply relativistic corrections in the receiver; if you are solving for almanac data, you do not need to do relativistic corrections in the receiver either. You only really do relativistic corrections when you are using centimeter-level-accuracy GPS, and even then you have considerably bigger corrections to do.

Yes, relativity was accounted for when GPS was designed, but it by no means would send you to China, not even to the wrong town.

What GPS needs to work is for the receivers to be able to calculate a proper geometric range delay, or at least to be constantly and consistently wrong when calculating the delay for the different satellites in the cluster.


Discounting relativity would reduce accuracy, but the error would be constant and consistent, so it would not present that much of an issue; it's also by far not the biggest offender when it comes to corrections.
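That cancellation can be illustrated with a toy 2-D position fix. The satellite positions and receiver location below are made up for illustration; the point is that a clock error common to every satellite (here 38 μs, about 11.4 km of range) is absorbed entirely into the receiver-clock unknown, leaving the position solution untouched:

```python
import math

# Toy 2-D fix with three "satellites". Every pseudorange carries the SAME
# 38 us clock error (~11.4 km of range). A bias common to all satellites is
# absorbed into the receiver-clock unknown, so the position comes out right.
C = 299_792_458.0
sats = [(20_000_000.0, 5_000_000.0),
        (-15_000_000.0, 12_000_000.0),
        (3_000_000.0, -22_000_000.0)]
true_pos = (1_000.0, 2_000.0)
bias_m = 38e-6 * C                       # ~11,392 m of common range error

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

pseudo = [dist(s, true_pos) + bias_m for s in sats]

def solve3(A, y):
    """Solve a 3x3 linear system by Gaussian elimination (no pivoting;
    the geometry here is well-conditioned enough for a demo)."""
    A = [row[:] for row in A]
    y = y[:]
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * b for a, b in zip(A[j], A[i])]
            y[j] -= f * y[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (y[i] - sum(A[i][k] * x[k] for k in range(i + 1, 3))) / A[i][i]
    return x

# Gauss-Newton iteration; unknowns are (x, y, receiver clock bias in metres)
est = [0.0, 0.0, 0.0]
for _ in range(10):
    A, r = [], []
    for s, p in zip(sats, pseudo):
        d = dist(s, est)
        A.append([(est[0] - s[0]) / d, (est[1] - s[1]) / d, 1.0])
        r.append(p - (d + est[2]))
    step = solve3(A, r)
    est = [e + ds for e, ds in zip(est, step)]

print(f"position error: {dist(est, true_pos):.6f} m")
print(f"recovered bias: {est[2]:.1f} m (true {bias_m:.1f} m)")
```

The 38 μs shows up entirely in the recovered clock bias (~11.4 km), which is plausibly where the popular "11 km per day" figure comes from; it just doesn't leak into the position.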

Thank you. Is it fair to say that the 38 µs correction done in the satellites is more for GPS's other use, that is, as a source of accurate timekeeping for receivers? It sounds like positioning is not affected the way I thought, but the designers felt the need to engineer a solution to the relativity problem nonetheless.

There are different corrections for relativity. One is the "one-time adjustment" that all clocks got, which brings them on average closer to Earth clock rates; but it's not 100% constant, since general and special relativity do not affect all satellites in the same manner, yet they all got the same clock correction.
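The part that a constant factory offset can't capture, the periodic term that depends on where the satellite is in its slightly elliptical orbit, is left to the receiver. A sketch of that term as given in the GPS interface spec (IS-GPS-200); the orbit values below are typical numbers, not from a real ephemeris:

```python
import math

# Receiver-side periodic relativistic clock correction from IS-GPS-200:
#     dt_r = F * e * sqrt(A) * sin(E_k)
F = -4.442807633e-10    # -2*sqrt(mu)/c^2, in seconds per sqrt(metre)
e = 0.01                # orbital eccentricity (GPS orbits are nearly circular)
sqrt_A = 5153.6         # square root of the semi-major axis, sqrt(m)

def dt_rel(E_k):
    """Relativistic clock correction in seconds at eccentric anomaly E_k (rad)."""
    return F * e * sqrt_A * math.sin(E_k)

worst_ns = abs(dt_rel(math.pi / 2)) * 1e9   # peak magnitude over one orbit
print(f"peak correction: {worst_ns:.1f} ns, roughly {worst_ns * 0.3:.0f} m of range")
```

A couple of tens of nanoseconds, i.e. metres of range: visible at centimeter-level work, lost in the noise at 15 m accuracy, which matches the point above.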

The other "relativistic" correction is done in the receiver and is needed for the geometric range delay calculation, basically taking into account that light doesn't travel in a straight path. We've kind of known that since the early days of classical optics: Fermat's principle, or the principle of least time.

Now while it's true that we use the "relativistic" version of this in our calculation, solving the classical version, or even ignoring it entirely, would, as long as we account for the other effects, still allow you to find a Starbucks.

Since GPS was initially intended for military use, it had other uses, such as timekeeping, which can be used to synchronize communications, encryption and other things. Those errors can be cumulative, whilst navigation errors often are not, since the drift would be more or less uniform across all the satellites you see.

Yes, putting in a microcontroller and checking its time after a device-lifetime of use would be quite an interesting experiment.
