I think there's a case to be made even for "average hue applied to the whole image": it's very easy to look at a black-and-white planet photo and assume the whole thing is made of grey rock (since our own familiar Moon is pretty much exactly that). I made that mistake myself for quite a long time. So using the average hue is one step better than that, or at least no worse.
But my impression has been that they often do something more sophisticated: they have a high-resolution greyscale image and a lower-resolution color image (maybe just a few pixels, or maybe half or a quarter of the greyscale resolution). Colorizing the higher-resolution image using the lower one seems entirely acceptable in that case. (In fact, isn't that essentially what JPEG's chroma subsampling does for most color images?)
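Concretely, I mean something like this: a minimal sketch with Pillow, assuming two hypothetical input files, that colorizes a high-resolution greyscale frame with the chroma pulled from a much smaller color frame.

```python
# A minimal sketch (the filenames are made up) of "high-res luma plus
# low-res chroma" colorization, the same trick JPEG's chroma subsampling
# relies on.
from PIL import Image

# High-resolution greyscale frame, e.g. the 150x150 "detail" image.
luma = Image.open("enceladus_grey.png").convert("L")

# Much lower-resolution color frame, e.g. the 50x50 "color" image.
color_small = Image.open("enceladus_color.png").convert("RGB")

# Split the small color frame into luma plus chroma, discard its luma,
# and upsample the two chroma channels to the detail resolution.
_, cb, cr = color_small.convert("YCbCr").split()
cb = cb.resize(luma.size, Image.Resampling.BILINEAR)
cr = cr.resize(luma.size, Image.Resampling.BILINEAR)

# Recombine: full-resolution brightness, interpolated color.
result = Image.merge("YCbCr", (luma, cb, cr)).convert("RGB")
result.save("enceladus_colorized.png")
```

The eye is far less sensitive to chroma resolution than to luma resolution, which is why JPEG gets away with storing color at a fraction of the detail resolution in the first place.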
Do I have that wrong? Or is it just that NASA isn't good about reporting the relative resolution of the color component of their "true color" images? (That would be really nice, come to think of it: "This image of Enceladus has detail resolution 150x150 and color resolution 50x50.")