It must be mission critical to spread awareness, since their budget cuts are getting worse, and one way to counter that is to get people excited about space exploration. They now put considerable effort into PR: involving people in the mission, sharing most of the information, and making professional videos. I believe this is why we see Twitter accounts for most astronauts, satellites, rovers and other NASA properties. NASA's Ustream channel has been very popular lately, and they have a presence on every major social network. And Curiosity was named through an essay competition that NASA organised for school students.
NASA has been doing this for some time now; Sojourner and the MERs (Spirit & Opportunity) were also named through student competitions.
I'm impressed with the reception technology that allows 'direct to home' communication at all given the power levels available.
If a transmitter and a receiver is in space, what's the furthest distance the receiver can go away from and have a 1 watt signal still above the noise floor? Assume the transmitter is a perfect omnidirectional.
About 9200 miles.
I was assuming absolute worst conditions, at 1 watt PEP.
With hardware noise reduction, we can easily drop to 0.1 watt and still keep everything else consistent.
Next, if we choose a directional antenna, we can also greatly magnify directed output. No sense in directing energy into the ground, is there?
We can also have receiver systems (dish arrays) that can provide great gains in reception and transmitting.
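A back-of-the-envelope version of this link budget. Everything here is hedged: the X-band frequency, 290 K noise temperature, 1 Hz detection bandwidth and unity-gain antennas are all my illustrative assumptions, not figures from the thread:

```python
import math

# Hedged sketch: free-space range at which an isotropic 1 W transmitter
# just reaches the receiver's thermal noise floor. The X-band frequency,
# 290 K noise temperature and 1 Hz bandwidth are illustrative assumptions.
K_BOLTZMANN = 1.380649e-23   # J/K
C = 299_792_458.0            # m/s

def max_range_m(p_tx_w, freq_hz, temp_k, bw_hz):
    """Distance where received power equals kTB (0 dB SNR), using the
    Friis equation with unity-gain antennas on both ends."""
    wavelength = C / freq_hz
    noise_w = K_BOLTZMANN * temp_k * bw_hz
    return (wavelength / (4 * math.pi)) * math.sqrt(p_tx_w / noise_w)

# 1 W beacon at 8.4 GHz, 290 K receiver, 1 Hz detection bandwidth
d = max_range_m(1.0, 8.4e9, 290.0, 1.0)
print(f"{d / 1000:.0f} km")   # tens of thousands of km under these assumptions
```

Shrinking the detection bandwidth or cooling the receiver extends the range, which is one reason deep-space links run at such low data rates.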
I think NASA should receive big kudos for taking that real risk.
Hmm. If storage continues to improve as fast as it has been, once the space infrastructure is there, this might be cost effective. We have a lot of data to send.
In particular, we are going to need a local cache of the internet on Mars. At the average Earth-Mars distance, the one-way latency is about 225 million kilometers / the speed of light = 12.5 minutes.
Vast data barges, roaming the solar system, keeping everyone in sync....
Building an interplanetary Internet is going to be very, very fun.
On a related note, never underestimate the bandwidth of a truck filled with...
Well, I was going to say Blu-ray discs. But the cheapest Blu-ray discs I can find are 3 cents per gigabyte, and 3 TB hard drives are easily had at 5 cents per gigabyte. If the data only needs to go 60 miles, then 120 drives ($18,000) in a truck making the trip in an hour would move 360 TB, an effective rate of about 800 Gbps...
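A quick sanity check of the truck math; the one-hour trip time for 60 miles is my assumption, the drive count, size and price are from the comment:

```python
# Sanity check of the sneakernet arithmetic. The one-hour trip time
# for ~60 miles is an assumption; drives, size and price are givens.
drives = 120
tb_per_drive = 3
cost_per_gb = 0.05
trip_seconds = 3600                          # ~60 miles by truck, roughly an hour

total_tb = drives * tb_per_drive             # 360 TB
total_cost = total_tb * 1000 * cost_per_gb   # $18,000
gbps = total_tb * 8000 / trip_seconds        # TB -> gigabits, per second

print(total_tb, total_cost, round(gbps))     # 360 18000.0 800
```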
Ouch. I'll go with a megabit connection. Once upon a time that sort of thing worked for bulk data ;)
0 latency on funny cat videos from Uranus.
I predict somewhere today at Google an engineer just got an idea for his 20% project. "Red Cache"
Straight snapshots would probably be the only viable option, though. Maybe make the policy: once a decade, do a complete update; and every year, update any webpage accessed by colonists in the past year, plus the top 5-15% of webpages by popularity on Earth.
And during the year, the top 5% of webpages, and any webpage with specifically Mars relevant content would be updated "wirelessly."
The numbers would change constantly, depending on bandwidth and costs and whatnot. But the question "how to build the best asynchronous shadow Internet given such and such constraints" is fascinating.
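A toy version of that selection policy; the page names, popularity scores and cutoff fraction are all made up for illustration:

```python
# Toy sketch of the sync policy described above; names, scores and
# the cutoff fraction are invented for the example.
def pages_to_sync(earth_popularity, colonist_accessed, top_fraction=0.05):
    """Top slice of Earth pages by popularity, plus everything the
    colonists actually touched in the past year."""
    ranked = sorted(earth_popularity, key=earth_popularity.get, reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return set(ranked[:cutoff]) | set(colonist_accessed)

popularity = {"news": 900, "cats": 800, "wiki": 700, "forum": 300, "blog": 50}
accessed = {"blog", "wiki"}
print(sorted(pages_to_sync(popularity, accessed, top_fraction=0.2)))
# ['blog', 'news', 'wiki']
```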
-- A message from your friendly local hacker wizard
It’s not like that rover is driving around autonomously. It has some autonomy, but it can’t decide what’s interesting and what’s not, what has to be investigated and what doesn’t.
This means if you're already in Mars orbit you need fuel(TM) to escape. One might consider a slingshot around Deimos or Phobos but I don't think that'd be helpful enough to justify its cost (could be wrong, but I've never seen it suggested in any Mars->Earth mission plan).
(assuming 8-bit bytes)
But in a pedantic sort of way, the computation should add extra bits for ECC, since the drive already has ECC bits embedded in it; a 1 TB drive holds 8 Tbits of user data but physically stores more, probably closer to 9-10 terabits.
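The arithmetic, with the ~12.5% ECC/format overhead being an illustrative guess (real sector formats vary):

```python
# Rough physical-bit count for a 1 TB drive; the 12.5% ECC/format
# overhead is an illustrative guess, not a spec figure.
user_bits = 1 * 8e12                 # 1 TB (decimal), 8-bit bytes
physical_bits = user_bits * 1.125    # add assumed ECC overhead
print(physical_bits / 1e12)          # 9.0 terabits
```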
Edit: Did the research. MRO's antenna can handle 0.5 to 4 megabits per second, depending on the distance between the planets. Wow, having to wait for orbiter passes is a huge bottleneck. http://mars.jpl.nasa.gov/mro/mission/communications/commxban...
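At those rates, even a single image takes a while; the 100 MB image size here is an illustrative assumption, not an MRO figure:

```python
# Transfer time for one image at the quoted relay rates; the 100 MB
# image size is an illustrative assumption.
image_bits = 100 * 8e6                           # 100 MB in bits
for mbps in (0.5, 4.0):
    minutes = image_bits / (mbps * 1e6) / 60
    print(f"{mbps} Mbps -> {minutes:.0f} min")   # 27 min and 3 min
```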
The current orbiters are primarily for imaging, so it makes sense that they're as low as possible. If you were always going to have several surface missions, it might make sense to have higher orbiters primarily tasked with communication relay.
One thing is that they didn't know Gale Crater was where they were going to land until pretty soon before launch. There were a number of other sites. And if they chose one further from the equator, that satellite wouldn't necessarily be in the right place.
Apparently NASA had plans for launching an orbiter specifically as an optical communications hub for Mars but scrapped it in 2005:
"As of 2005, NASA has canceled plans to launch the Mars Telecommunications Orbiter in September 2009; it had the goal of supporting future missions to Mars and would have functioned as a possible first definitive Internet hub around another planetary body. It would use optical communications using laser beams for their lower ping rates than radiowaves."
(OT: Note the picture at the bottom, the astronaut is one of ours: Austrian Space Forum, a private space R&D org.)
2. Set up an 80-meter dish in your back yard, with a team of NASA RF experts.
With step (2), the authorities will zero in on you in a second, so the emphasis is definitely on the cyber security of the ground infrastructure and not on SSL-over-space-link-communication.
Although the process seems a bit intimidating - http://deepspace.jpl.nasa.gov/dsn/features/dsnbuilt1.html
On the other hand, there's always a moron out there to crash the party. So even if I wouldn't use encryption, I wouldn't be surprised if they use it.
Encryption typically doesn't make messages longer.
EDIT: removed specific type of data
Edit: my very rudimentary understanding of information theory tells me that there's no "average" or "optimized" data, only varying amounts of entropy (AKA compressibility). My limited knowledge of radio and electrical transmission says that transmitted data is encoded to make effective use of the transmission medium (e.g. 8b/10b encoding, 8-to-14 modulation, TMDS).
So, if by "optimized" data you mean "high entropy" data, encryption should have no effect on that. Or, if by "optimized" you mean encoded for the transmission channel, you would run your encryption process before the channel encoding process.
Since TMDS is a superset of 8b/10b, it's a fair assumption that it has around the same overhead.
Now I know we're only talking encoding here, not encryption but still, I wonder:
What kind of encoding are they using? Because the smaller the encoding is, the weaker the encryption is (unless I've got this wrong...).
Now if all my assumptions are correct could it be possible that they use an encoding that makes encryption useless?
8b/10b has exactly 25% overhead; this is the difference between baud (what the line is capable of transmitting, which is on the 10b side) and bits per second (the decoded 8b data). If you have a 10 kilobaud line, you'll have an effective data rate of 8 kbit/s.
Encryption is done before the encoding stage, the encoding scheme doesn't care and isn't affected by the encryption.
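The rate arithmetic from that example, in two lines:

```python
# 8b/10b rate arithmetic: every 10 line symbols carry 8 data bits.
def effective_bps(line_baud, data_bits=8, code_bits=10):
    return line_baud * data_bits / code_bits

print(effective_bps(10_000))    # 8000.0 bit/s from a 10 kilobaud line
print(10 / 8 - 1)               # 0.25 -> the "exactly 25% overhead"
```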
Maybe I thought this was more primitive than what it really is :/
My exposure to this stuff is limited to a few exercises in a Matlab class years ago, plus what I read on HN and Wikipedia. Based on what I understand, from a data transmission perspective, there is no difference between efficiency and consistency. There is a theoretical maximum amount of data that can be transmitted using a given medium, bandwidth, and error rate. The only goal is to get as close to that maximum as possible.
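That theoretical maximum is the Shannon capacity, C = B * log2(1 + S/N); the 1 MHz bandwidth and SNR points below are illustrative, not figures for any real link:

```python
import math

# Shannon capacity of an AWGN channel: C = B * log2(1 + S/N).
# Bandwidth and SNR values are illustrative.
def shannon_capacity_bps(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

# 1 MHz channel: capacity grows only logarithmically with SNR
for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(f"{snr_db} dB -> {shannon_capacity_bps(1e6, snr) / 1e6:.2f} Mbps")
```

Note the logarithm: past a point, extra transmit power buys very little extra data rate, which is why bandwidth and noise temperature matter so much.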
Also, there's no shame in admitting you're wrong. In fact, every opportunity to be wrong is an opportunity to learn.
I'm not ashamed of being wrong. If that were the case, I'd have been ashamed my whole life, heh.
When I said consistency I was referring to low-error rate in the signal. I should have said that but couldn't find the correct word for it.
Efficiency meant efficiency in the bandwidth usage.
So in that sense, there's always a tradeoff between efficiency (use 100% of the bandwidth to transmit real data) and consistency (make sure that 100% of the sent data is error-free)
Now, when I first wrote my comment, I had the feeling that the communication consisted of very basic commands, like a 2-bit integer for movement direction (00 up, 01 down, 10 left, 11 right). So if you used such a scheme to send instructions to the rover, there would be an obvious overhead (encoding 2 bits into 4, 10, etc.).
But now that I really think about it, the instructions sent to the rover are probably very complex, which increases entropy, and that makes my first assumption wrong. Is this clear? I hope it is ;) If not, I can rephrase!
1) plain data initial stream.
2) compress it.
3) encrypt it or/and sign it.
4) add error-checking or even error-correction information.
The reason you do encryption (3) after compression (2) is that encrypted output looks random, and random data doesn't compress, so encrypting first would ruin the compression rate of the stream.
If you were to do it in the reverse order (first encryption, then compression), the resulting message could indeed be bigger.
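A small demonstration of why the ordering matters; the XOR one-time pad here just stands in for a real cipher, purely to show the effect:

```python
import os
import zlib

# Steps (1)-(3) above, sketched. The XOR one-time pad stands in for
# real encryption only to show the ordering effect; use a real cipher
# (e.g. AES) in practice.
def otp(data: bytes, pad: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, pad))

payload = b"MOVE NORTH 2M; " * 200          # redundant "command" stream
pad = os.urandom(len(payload))              # random keystream

good = otp(zlib.compress(payload), pad)     # compress, then encrypt
bad = zlib.compress(otp(payload, pad))      # encrypt, then compress

print(len(payload), len(good), len(bad))    # bad ends up bigger than payload
```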
For example, a very naive implementation might create a thumbnail from every pixel where x or y is odd, and then, when/if you want the original image, fetch the data required to put it back together (a similarly sized "thumbnail" holding the even pixels). With this naive approach the thumbnail would likely be much larger than you want, and it would not help at all :)
The orbiter can talk to earth at up to 6Mbps depending on how far away we are at that time.
6Mbps at that distance is a staggering feat of engineering.
Basically, TCP/IP doesn't scale to meet the constraints of deep-space networks (high latency, asymmetric data rates, intermittent connectivity). It just breaks down. That's why there's a lot of active research in DTN (delay/disruption tolerant networking and ad-hoc networking protocols). ION (Interplanetary Overlay Network) is the JPL implementation of the DTN protocols, and has been flown in a demonstration capacity on some extended-mission/end-of-life Deep Space missions. Much of it has been open sourced at this point.
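The core store-and-forward idea can be sketched as a toy; this relay class is my own illustration, not ION's actual API, and real DTN stacks add custody transfer, routing and bundle lifetimes on top:

```python
from collections import deque

# Toy store-and-forward relay illustrating the DTN idea; real stacks
# (e.g. ION's Bundle Protocol) add custody transfer, routing and
# bundle lifetimes on top of this.
class RelayNode:
    def __init__(self):
        self.stored = deque()    # bundles held while no link is up
        self.delivered = []

    def receive(self, bundle):
        self.stored.append(bundle)

    def contact_window(self, max_bundles):
        """Forward what fits in this pass; keep the rest for next time."""
        for _ in range(min(max_bundles, len(self.stored))):
            self.delivered.append(self.stored.popleft())

relay = RelayNode()
for i in range(5):
    relay.receive(f"bundle-{i}")

relay.contact_window(max_bundles=3)    # a short orbiter pass
print(relay.delivered)                 # ['bundle-0', 'bundle-1', 'bundle-2']
print(list(relay.stored))              # ['bundle-3', 'bundle-4']
```

The key contrast with TCP/IP: nothing here assumes an end-to-end path exists at any given moment, only that contacts eventually occur.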
DTN book for anyone interested in the area.
Anyway, he compared the energy used to transmit from the probe to that of a flower petal falling from a height of six feet.
As Mars and Earth orbit the sun, the distance between them is constantly changing. The minimum distance from Earth to Mars is about 35,000,000 miles, and the maximum is about 249,375,000 miles.
Since light travels at about 670,000,000 miles per hour in a vacuum, we can calculate the minimum and maximum round-trip latency:
Minimum round-trip latency
= (2 * 35,000,000 miles) / (670,000,000 miles per hour)
= (0.104478 hours) * (60 minutes per hour)
= 6.2687 minutes
Maximum round-trip latency
= (2 * 249,375,000 miles) / (670,000,000 miles per hour)
= (0.744402 hours) * (60 minutes per hour)
= 44.6642 minutes
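The same arithmetic in code:

```python
# Round-trip light-time between Earth and Mars (miles, miles per hour).
C_MPH = 670_000_000

def round_trip_minutes(distance_miles):
    return 2 * distance_miles / C_MPH * 60

print(f"{round_trip_minutes(35_000_000):.2f} min")    # 6.27 at closest approach
print(f"{round_trip_minutes(249_375_000):.2f} min")   # 44.66 at maximum distance
```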
But the one-way latency can be as low as about 3 minutes or as high as about 22 minutes, depending on the distance from Earth to Mars, which of course varies as they orbit the sun.