I don't know a lot about image-processing algorithms, but clicking "Auto" in Google Photos tweaks basic things like exposure, contrast, highlights, shadows, vibrance etc., so the image has a lot more "punch".
* Some areas of the world are just naturally fairly flat and monochromatic, so a dynamic contrast/saturation/brightness adjustment (e.g. one that would turn the darkest pixel black, the brightest pixel white and linearly map the rest between these extremes) would not work for these areas.
* The available satellite imagery has been captured and processed in a variety of ways depending on the region, so a constant contrast/saturation/brightness adjustment might work well in some places but overcorrect in others (urban areas in the US and Europe, for example, tend to already be fairly saturated and contrasty).
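To make the first bullet concrete, here is a minimal sketch of that naive "dynamic" stretch using NumPy (the function name is my own; real tools usually clip a small percentile at each end rather than using the raw min/max):

```python
import numpy as np

def stretch_contrast(channel: np.ndarray) -> np.ndarray:
    """Linearly map the darkest pixel to 0 and the brightest to 255."""
    lo = int(channel.min())
    hi = int(channel.max())
    if hi == lo:
        # Perfectly flat tile: nothing meaningful to stretch.
        return np.full_like(channel, 128, dtype=np.uint8)
    scaled = (channel.astype(np.float64) - lo) / (hi - lo)
    return (scaled * 255).round().astype(np.uint8)

# A nearly flat, monochromatic tile (pixel values 100..103) gets
# stretched to the full 0..255 range -- amplifying what is mostly
# sensor noise, which is exactly the failure mode for flat regions.
tile = np.array([[100, 101], [102, 103]], dtype=np.uint8)
print(stretch_contrast(tile))
```

Run on a tile with real dynamic range, the same mapping is harmless; it's the flat, monochromatic areas where it blows up subtle variation into garish contrast.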
Basically, doing this well would involve a whole bunch of testing and fine-tuning. And since not even Google (the source of the imagery) seems to do this, I decided not to bother: keeping the data basically the way I receive it is easy and "honest".