
Mars rover camera project manager explains 2MP camera choice - k33l0r
http://www.dpreview.com/news/2012/08/08/Curiosity-interview-with-Malin-Space-Science-Systems-Mike-Ravine
======
Sharlin
It should be noted, as well, that the Hazcams and Navcams are "build-to-print"
exact copies of the ones that flew on the MERs in 2003. So most of the images
we've been seeing thus far are actually taken with tech much older than the
Mastcams, probably designed around the turn of the millennium. And there's
nothing wrong with that -- they are very reliable and do exactly what is
needed.

Source: <http://www-robotics.jpl.nasa.gov/publications/Mark_Maimone/fulltext.pdf>

------
asmithmd1
The mast camera is actually two cameras, with the zoom of each fixed at a
different value:

<http://msl-scicorner.jpl.nasa.gov/Instruments/Mastcam/>

James Cameron lobbied to make zoom lenses for the cameras so they could be set
at the same zoom level to create stereo pairs for 3D images.

JPL had "problems designing the lens without using wet lubricants which would
require battery-sapping heating"

Difficult trade-offs must have been made everywhere - who knows where the
extra battery power that the zoom lens gave up ended up going.

~~~
ghshephard
Given that the plutonium power core puts off 2000 watts of heat in order
to generate 125 watts of electricity, I find it surprising that they don't
have enough heat to keep wet lubricants sufficiently warm. But it's an
awfully big rover, and it does get pretty cold on Mars...

2000 watts over 14 years comes to about 245,000 kilowatt-hours.
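
That back-of-envelope figure checks out. A quick sanity check, using only the
numbers from the comment above:

```python
# Thermal energy output over the design life, from the figures above:
# ~2000 W of heat over a ~14-year mission.
HOURS_PER_YEAR = 24 * 365
thermal_watts = 2000
years = 14

kwh = thermal_watts * HOURS_PER_YEAR * years / 1000
print(round(kwh))  # 245280, i.e. roughly 245,000 kWh
```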

~~~
hexagonal
Remember, it's got a 14 year lifetime because the half-life of Pu-241 is 14.1
years. At the end of that time, half of its mass will be gone, and it'll be
producing half as much heat. It won't produce a steady 2 kW for 14 years and
then _bing,_ go out like a lightbulb. Power produced will slowly fall over
time.
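
The falloff follows simple exponential decay. A minimal sketch using the 2 kW
and 14.1-year figures from the comment above (for what it's worth, the MMRTG's
fuel is actually Pu-238, whose half-life is 87.7 years, so the real decline is
much slower; the decay law is the same either way):

```python
# Exponential decay of RTG thermal power: P(t) = P0 * 2**(-t / half_life).
def thermal_power(p0_watts: float, half_life_years: float,
                  t_years: float) -> float:
    """Thermal power remaining after t_years."""
    return p0_watts * 2 ** (-t_years / half_life_years)

print(thermal_power(2000, 14.1, 14.1))  # 1000.0 W: half the initial output
print(thermal_power(2000, 14.1, 7.0))   # ~1418 W partway through
```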

~~~
eru
You mean, half of the plutonium mass will be gone. It will be almost
completely replaced by other elements. Though I don't know how much energy
they give off as radiation, or whether they are stable.

------
radarsat1
They should eventually release a series of regular-quality images taken
sequentially as the camera pans left and right, and have a competition to see
who can squeeze the most resolution out of them using image enhancement and
fusion techniques. There are lots of interesting stochastic methods out there
(e.g. "compressed sensing") that have application for this kind of thing. It
would make a great academic competition.

~~~
eco
During one of the press conferences Ravine mentioned they had hundreds of
images of a single spot on the ground taken by MARDI, the descent camera,
after it had landed and that a member of his team was working on using those
to build a super-resolution image of the surface. I like the idea of a
competition though.

------
danso
I think the explanation is satisfactory, but how often, if at all, do other
components of such projects change, up until lift off? I imagine engines and
other propulsion parts won't be upgraded much after the original spec, even if
such components even remotely followed the same curve of improvement as
digital chips do. But what about other types of sensors used by the rover?
What is the most recent piece of tech used by the rover?

In any case, it's kind of surprising that by 2004, NASA engineers wouldn't
have proposed a solution that anticipated vast improvements in digital sensor
technology, so that something, in 2009-2010, could be "dropped in" (relatively
speaking, not literally) as a replacement.

Of course such a design feature is going to take way more planning and
resources than it would for the holiday consumer camera lineup... but a) this
is NASA, home to some of the best engineers around. And b) while panning-and-
stitching is always a solution, doesn't that carry additional operational risk
of its own? Additional panning requires additional mechanical movement and
attention to moving parts.

~~~
arrrg
Last-minute changes are made. Or at the very least I know of one such change
in the case of Curiosity.

The dust covers over the hazcams (the hazard avoidance cameras on the belly of
Curiosity) were added at the last minute. Here is the engineer who implemented
them writing about the covers:
[http://forum.nasaspaceflight.com/index.php?topic=29612.msg93...](http://forum.nasaspaceflight.com/index.php?topic=29612.msg938852#msg938852)

Basically, the Phoenix lander (which landed on May 25th, 2008) kicked up more
dust than expected. They were worried and did a review for Curiosity, but
found that only the hazcams (because of their location on the belly) were in
danger, so they decided to add dust covers to them (and also kept the covers
transparent to see whether there really was a dust problem – as you could see
from the first photos with the covers still attached, there very clearly was).

My guess is that stuff gets changed and updated a) if there is money and
resources left or b) if the mission is in danger if you don’t change
something.

The 2MP sensor is very clearly good enough. Any update in resolution would
give you diminishing returns – so something like that gets pushed back.

~~~
mturmon
It's all about risk. It's not about "leftover money".

Cost, schedule, risk. They are the fundamental resources for a project like
this. Cost and schedule are more familiar than risk.

~~~
arrrg
So if it’s about cost it’s about leftover money. That’s one and the same. (One
just doesn’t sound as professional as the other.)

~~~
mturmon
You offered two possible explanations for the use of a 2MP camera, (a) money,
(b) risk. I was saying that, in this case, it did not really have to do with
money.

My second sentence above was trying to point out that many people
underestimate how important risk is to a space mission.

~~~
arrrg
Oh, no, I wasn’t claiming that this change had anything to do with money. In
this case it was clearly related to risk. I was just trying to make a general
point.

~~~
mturmon
Thanks, understood.

------
nitrogen
They can generate low-res, closeup 3D images of the ground using the stereo
hazcams and manual colorization. It's far from what stereo zooming mast
cameras would have given them, but I still think it looks cool:
<http://nitrogen.posterous.com/curiositys-view-of-mars-in-pseudocolor-3d>

And yes, I'll probably keep posting this image in nearly every thread that
mentions Curiosity's cameras until NASA starts giving us more color imagery.

------
ericcholis
2004 technology aside, this is called the "Good Enough Factor". Obviously
every super geek at NASA wants a 20 MP camera with full zoom, 100-year
lifetime, 3D, blah blah blah. But they are working within very exact
specifications and budgets. So you have to opt for the option that satisfies
the Good Enough Factor.

What will get us closest to what we actually want, without totally breaking
spec and screwing with the time and monetary budget?

~~~
H_L
Everyone is missing the point here. The problem isn't that these cameras are
old tech using outdated CCD sensors. If these were the best images we've
seen to date of Mars, you're right, there'd be a "Good Enough Factor" — the
best images we've seen from a NASA mission, but not as good as modern DSLRs.
Understandable.

The problem that I think a lot of people are missing is that Viking 1, from
1976, took higher-quality pictures of Mars.
[http://f.cl.ly/items/0k2w2d1C1O3w3e0t300f/NASAQualityDegreda...](http://f.cl.ly/items/0k2w2d1C1O3w3e0t300f/NASAQualityDegredation.png)

~~~
danielweber
You are comparing the very first images from Curiosity with the final
processed images from Viking. Also, the hazcams on Curiosity are there only
to help Curiosity see where it is going. Whatever design trade-offs they make,
they need to make the priority "don't break the 2.5-billion-dollar rover
that is supposed to last for over a decade."

If you use a low-res camera to take 1000 images of the same thing, you can use
software to make a high-res image from those.
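
A minimal sketch of that stacking idea, in the averaging-only form (true
super-resolution additionally exploits sub-pixel shifts between frames, which
this toy example ignores):

```python
import random

# Averaging many noisy exposures of the same static scene beats any single
# exposure: noise averages out while the signal stays put.
random.seed(0)

true_pixel = 100.0  # "real" brightness of one pixel in the scene

def noisy_reading() -> float:
    """One exposure's reading of that pixel, with Gaussian sensor noise."""
    return true_pixel + random.gauss(0, 10)

single = noisy_reading()
stacked = sum(noisy_reading() for _ in range(1000)) / 1000

print(abs(single - true_pixel))   # error of one frame, often several units
print(abs(stacked - true_pixel))  # stacked error, typically well under 1
```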

------
rwhitman
I would like to see this result in a frustrated James Cameron building his own
Mars probe for filmmaking

~~~
hugs
The lack of an on-board microphone could/should do that, too. :-) Pictures are
great, but full 30fps video (with audio!) would greatly enhance our
feeling of what it's like to be on the surface there.

~~~
brudgers
Mars has a very thin atmosphere, so a microphone would capture very little
data if any. _Star Wars_ aside, there isn't much sound in the universe.

~~~
waterlesscloud
Mars is also windy, and microphones can be made to be very sensitive. Plus,
there might be some interest in hearing the dust from one of the common dust
storms hitting the rover.

Not a high priority in terms of data, but potentially interesting nonetheless.

~~~
brudgers
It appears that NASA already sent a microphone to Mars.

And you can listen to it:

[http://mars.jpl.nasa.gov/msp98/lidar/microphone/mic_found.ht...](http://mars.jpl.nasa.gov/msp98/lidar/microphone/mic_found.html)

[Edit] Never mind, _Mars Polar Lander_ crashed.

[http://en.wikipedia.org/wiki/Mars_Polar_Lander#Communication...](http://en.wikipedia.org/wiki/Mars_Polar_Lander#Communications_loss)

~~~
InclinedPlane
NASA sent microphones to Mars twice, but neither was used, for technical
reasons (one because the vehicle crashed).

------
summerdown2
This is a serious lack of vision for such an iconic organization. Good science
and technical ability, doubtless. But someone skimped on a real public
relations goldmine here.

------
gaius
Because he's an actual scientist not an Internet rasterbator.

------
kghose
I really liked the article - it was very coherent and easy to follow.

BTW, is there a full-res movie of the descent? I've only seen the 'thumbnails'
stitched together of the heat-shield and the parachute.

~~~
johno215
Not yet. Only 20 or so of the full frames of the descent have been downloaded
so far. Will probably be a day or more before they are all down.

You can see the frames that are down here:
<http://mars.jpl.nasa.gov/msl/multimedia/raw/?s=0>

~~~
kghose
Thanks for the pointer. This really brought home to me the 'baud rate' issue
of interplanetary communication.

Right now, I'm so used to streaming movies at home that I forget the
challenges of communication across such vast distances.

Perhaps our children and grandchildren will be using a faster interplanetary
internet and will recall these days with the same whimsy as we now recall our
twisted-pair modem days.

------
ck2
It would be neat if one day they sent rovers in modules - one nuclear power
supply to last dozens if not a hundred years, then a mini-robot sent to
replace the modular camera with a 15MP one after they upgrade the bandwidth
from 2 Mbps to 10 Mbps and then one day 100 Mbps.

That 15-30 minute ping time is a problem that cannot be overcome unless they
find a faster-than-light wave that can be used for transmission - or a way to
manipulate quantum entanglement for communications.

~~~
danielweber
I think the problem is that, if you are going to make a second launch, it's
nearly always better to land some place different. We will accumulate a bunch
of data on Gale Crater over the next several years, but we have essentially
_zero_ data on a whole bunch of very interesting parts of Mars.

~~~
pavel_lishin
With enough supplementary shipments, and enough patience, a rover could stroll
all over the planet.

------
jbattle
Can anyone explain what the key limits on bandwidth are when communicating
from Mars? Is it just a matter of higher bandwidth requiring higher energy
consumption?

~~~
sliverstorm
It's wireless transmission. Your bandwidth is a function of the frequency of
your communication channel, and long-distance transmission generally uses low-
frequency signals (and as a result, low speed).

One way to get around this is to use a multi-channel link, that is, to
communicate over several frequencies simultaneously. This is more
difficult, both technically and because you must find unused frequencies.
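
For reference, the hard ceiling on any such link is Shannon's channel
capacity, which depends on both the channel bandwidth and the received
signal-to-noise ratio; for deep-space links the SNR term (set by transmit
power, antenna gain, and distance) is usually the binding constraint. A toy
calculation with made-up numbers, not actual Mars-link figures:

```python
import math

# Shannon capacity: C = B * log2(1 + S/N), the maximum achievable bit rate
# over a channel of bandwidth B (Hz) at linear signal-to-noise ratio S/N.
def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# 1 MHz of bandwidth at an SNR of 0.1 (i.e. -10 dB, a weak received signal):
print(shannon_capacity_bps(1e6, 0.1))  # ~137,500 bit/s ceiling
```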

------
sigkill
I'm a bit disappointed that they didn't take this as a chance to try
cobbling together a "radiation box" and putting commercial equipment in it.
Surely, that would be a good experimental data point that could be used in
future endeavors.

~~~
kronusaturn
It sounds like they already tried something like that on ISS, and the
commercial cameras still had problems.

------
senthilnayagam
Just curious: why are they releasing only black-and-white pictures?

~~~
sp332
The sensor is black and white. This lets them collect the most accurate
information about how much light hits each part of the sensor. To take a color
image they put a color filter in front of the lens, then they can recolor the
black and white image according to the color of the filter. If they take
multiple pictures with different filters, they can reconstruct a color image.
<http://areo.info/mer/>
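
A minimal sketch of that filter-wheel reconstruction: three grayscale
exposures, each taken through a red, green, or blue filter, recombined into
one RGB image (tiny 2x2 lists of 0-255 intensities stand in for real frames):

```python
# One grayscale frame per filter; each value is how much light got through
# that filter at that pixel.
red_frame   = [[200, 180], [190, 170]]
green_frame = [[ 90, 100], [ 95, 105]]
blue_frame  = [[ 30,  40], [ 35,  45]]

# Zip the three frames together pixel by pixel into (R, G, B) triples.
rgb = [
    [(red_frame[y][x], green_frame[y][x], blue_frame[y][x])
     for x in range(2)]
    for y in range(2)
]
print(rgb[0][0])  # (200, 90, 30): the reconstructed top-left pixel
```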

~~~
hexagonal
Actually, the Mastcams on Curiosity are Bayer-pattern color sensors.

~~~
sp332
Wow, I had no idea. Thanks for the info.
[http://www.nasa.gov/mission_pages/msl/multimedia/pia15109.ht...](http://www.nasa.gov/mission_pages/msl/multimedia/pia15109.html)

------
redact207
As for the 3D factor, given the way the Curiosity can move, can't it just snap
one image, move slightly to the side and snap it again for stereo LR?
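
That works for static scenes. With a known sideways shift (baseline) between
the two shots, the standard stereo relation recovers depth from pixel
disparity; the numbers below are purely illustrative, not rover parameters:

```python
# Stereo depth from two displaced shots: depth = f * B / d, where f is the
# focal length in pixels, B the baseline between shots, d the disparity.
focal_px = 1200.0    # focal length expressed in pixels (illustrative)
baseline_m = 0.25    # sideways shift between the two exposures
disparity_px = 15.0  # how far a feature moved between the two frames

depth_m = focal_px * baseline_m / disparity_px
print(depth_m)  # 20.0 meters to the feature
```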

------
treelovinhippie
Why not change the system so that new technology is able to be fitted?

~~~
roc
The limiting factor is the testing and integration, not the new technology.

If it takes X years to do sufficient radiation and integration testing on new
technology, you can't possibly include any technology "newer" than X years
old. And trying to keep yourself open to include the latest technology you
possibly can means you need to have manpower ready to do that testing at the
last possible moment, which has scheduling implications for everything else.

So you need to prioritize which technologies you most want to be able to
integrate in their latest/greatest form. And slightly fancier pictures are
just not going to be that high on the list.

------
bertako
Please. The quality of images from the MastCam will be sufficient for the
plebs.

------
mtgx
Now that I find out that the main reason for using a 2 MP camera is that
that's what the specifications called for in 2004, I'm a lot more disappointed
than if it were just about the speed of transfer between Mars and Earth.

2 MP cameras were in high-end phones in 2004, but you'd expect _NASA_ to use
something a little more advanced than what was available in phones in 2004.
How much more would it have cost them to use a 5 MP one? $100 more if they
chose one in 2004 and stuck with it, or $10 more if the camera was added just
a year ago. So this makes me think that they just didn't think this would be
such an important factor, compared to say making the robot work.

~~~
rimantas
What do you think extra megapixels would give you? Judging the quality of a
camera by pixel count is silly.

~~~
SammoJ
In a mission to gather data, extra megapixels clearly give you extra data.
You cannot argue against that! We aren't talking amateur photography here,
where the quality of the shot is important and a decent lens beats higher
megapixel counts; we are talking about acquiring measurements of the number of
photons in a discrete spatial region. Clearly a higher-resolution sensor might
allow scientists to see something not visible in lower-resolution images. I do
admit that in this case I understand the choice, due to specification and
testing, but to suggest that extra megapixels do not give more information is
silly.

~~~
stan_rogers
Actually, the opposite is true. Photon absorption/detection is a quantum
event, and limited by probability. For a given sensor chip size (and
technology generation), fewer, larger sensels are going to provide samples
that are statistically closer to the Absolute Truth. (Averaging repeated
samples will reduce the error further.)

Using a well-corrected lens of an appropriately longer focal length, and thus
a narrower field of view, with or without panoramic stitching, will provide
_at least_ [1] the same linear resolution of a given subject, but with less
sampling error.

[1] _At least_, since apochromatic correction is easier in longer focal
length lenses, provided that no super-wide-aperture bokeh heroics have gone
into the design. Rectilinearity (the absence of barrel or pincushion
distortion) is also easier to achieve. Flare can be reduced without inducing
undue mechanical vignetting, increasing contrast.
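
A toy simulation of that sampling argument, with one labeled simplification:
Gaussian noise with sigma = sqrt(mean) stands in for true Poisson photon
counts. A sensel collecting 4x the photons should show roughly half the
relative error:

```python
import random

# Photon shot noise: relative noise scales as 1/sqrt(photon count), so a
# larger sensel (more photons per exposure) gives statistically tighter
# samples of the scene's true brightness.
random.seed(1)

def relative_error(mean_photons: float, trials: int = 20000) -> float:
    """Average relative deviation of simulated counts from the true mean."""
    errs = []
    for _ in range(trials):
        count = random.gauss(mean_photons, mean_photons ** 0.5)
        errs.append(abs(count - mean_photons) / mean_photons)
    return sum(errs) / trials

small = relative_error(1000)   # small sensel: fewer photons per exposure
large = relative_error(4000)   # sensel with 4x the collecting area

print(small / large)  # close to 2: twice the relative noise
```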

