
Well outside the atmosphere we've already got a plan, it's called the James Webb Space Telescope [1] :-)

One of the weird things about light is that the photons that hit the camera sensor in California are not the same photons that hit the camera sensor in New York. So you could, if you chose to, add the two pictures together, which would increase the brightness (more photons) without changing the content of the picture. The trick, of course, is figuring out which pixels in each camera sensor were getting the same (or nearly the same) photons.
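To make the "adding" concrete, here's a minimal sketch in Python (NumPy assumed), taking frames that are already registered pixel-for-pixel; the registration itself is the hard part described below, and img_ca / img_ny are hypothetical names for the two shots:

    import numpy as np

    def stack_frames(frames):
        # Sum N aligned exposures: signal grows ~N while photon shot
        # noise grows only ~sqrt(N), so the stack is effectively a
        # brighter, cleaner picture of the same content.
        acc = np.zeros_like(frames[0], dtype=np.float64)
        for f in frames:
            acc += f
        return acc

    # Hypothetical usage, assuming both images are already
    # back-projected onto the same reference plane:
    # combined = stack_frames([img_ca, img_ny])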

Since you are taking a picture of the stars, which are far enough away that parallax effects won't change their relative positions, it should be possible to map the positions of the stars in each image and combine them with the pointing vector from the accelerometer to create a projection matrix that lets you back-project the camera pixels onto an idealized geometric focal plane.
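A rough sketch of that registration step, assuming the star centroids have already been detected and matched between frames (OpenCV's RANSAC homography fit stands in here for the full projection-matrix estimation; the accelerometer's pointing vector would just seed the matching):

    import numpy as np
    import cv2

    def fit_projection(stars_src, stars_ref):
        # stars_src, stars_ref: Nx2 arrays of matched star pixel
        # positions in the source frame and the reference frame.
        src = np.asarray(stars_src, dtype=np.float32)
        ref = np.asarray(stars_ref, dtype=np.float32)
        # RANSAC rejects mismatched stars; H is the 3x3 projective
        # map from this camera's pixels onto the reference plane.
        H, inliers = cv2.findHomography(src, ref, cv2.RANSAC, 2.0)
        return H

    # cv2.warpPerspective(img, H, (w, h)) would then back-project
    # the image into the reference plane's geometry.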

Now you have a map of all of the various image pixels with respect to their projection onto this plane, and you can add like pixels together. Or generate a pseudo 5 MP image where each pixel is composed of a million sub-samples. (Sometimes I wish I could draw in this editor.)
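Something like this accumulation step, sketched under the assumption that every source pixel has already been projected into reference-plane coordinates (it's essentially the "drizzle" approach used for Hubble mosaics; the names are illustrative):

    import numpy as np

    def accumulate(samples, out_shape, scale=4):
        # samples: iterable of (x, y, value) in reference-plane
        # coordinates. Each output bin averages every sub-sample that
        # lands in it, giving the many-sub-samples-per-pixel effect.
        h, w = out_shape
        acc = np.zeros((h * scale, w * scale))
        n = np.zeros_like(acc)
        for x, y, v in samples:
            ix, iy = int(x * scale), int(y * scale)
            if 0 <= iy < acc.shape[0] and 0 <= ix < acc.shape[1]:
                acc[iy, ix] += v
                n[iy, ix] += 1
        return acc / np.maximum(n, 1)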

[1] http://en.wikipedia.org/wiki/James_webb_telescope


One hitch: the precision of the sensors (accelerometers, etc.) in these phones might not be sufficient to make this work!

I'm all for tearing down a million old phones, removing the sensors, and blasting many "micro" telescopes into space!
