
From an optical perspective, how does the argument dismissed at the start of the OP break down? Is there some reason the moon's light, gathered by a lens covering hundreds of acres, carries insufficient energy to light a fire in concentrated form?



The optical argument goes like this: Measure the brightness (in power per square meter) right at the moon's surface. No matter what you do with lenses, you can't concentrate moonlight beyond this level.

You can make an enormous lens where all the energy coming off a particular acre of moon goes through it. But you can't focus all of it onto a spot smaller than an acre, no matter what you do.
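
A minimal étendue-bookkeeping sketch of that claim (Python; the acre figure and the Lambertian assumption are just illustrative): a Lambertian patch emits into a projected solid angle of about π steradians, and no passive optic can give the image more than π, so the image can't be smaller than the source if you want all of its light.

    # Etendue (area * projected solid angle) is conserved, so squeezing all the
    # light from an acre of moon into a smaller spot would need a larger
    # projected solid angle on the image side than the pi-steradian maximum.
    import math

    a_source = 4046.86           # one acre of moon surface, in m^2 (illustrative)
    omega_source = math.pi       # projected solid angle of a Lambertian emitter
    omega_image_max = math.pi    # largest projected solid angle the image can accept

    a_image_min = a_source * omega_source / omega_image_max
    print(a_image_min)           # 4046.86 m^2 -- no smaller than the acre we started with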


Thanks for the explanations.

The current top comment here seems to agree with my original intuition, though.

https://news.ycombinator.com/item?id=18739120


As a few people pointed out, the reflection off the moon ruins the étendue, so there's no concentrating it beyond what all the rocks on the surface already experience.


Thanks again for your help. I think that led me in the right direction.

This is the optical argument for maximum concentration given conservation of étendue (the same page has an optical argument for the conservation):

https://en.wikipedia.org/wiki/Etendue#Maximum_concentration

The angle subtended by the moon is approximately 0.54 degrees, or about 0.01 radians, and the maximum concentration factor scales roughly as 1/θ², so it works out to about 10,000. The moon provides about 0.1 lux of illumination, so the maximum illumination you can achieve with optics is about 1,000 lux. The sun subtends about the same angle, and we receive 30,000-100,000 lux from it, so the maximum illumination you can achieve by concentrating sunlight optically is about a billion lux.
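
Here's a quick back-of-the-envelope check of that arithmetic as a minimal Python sketch; it just reuses the same rough small-angle estimate (concentration limit ≈ 1/θ²) and the lux figures quoted above:

    # Rough sanity check of the numbers above, using the small-angle estimate
    # C_max ~ 1/theta^2 with theta the full angle subtended by the moon.
    import math

    theta = math.radians(0.54)       # ~0.0094 rad subtended by the moon (and roughly by the sun)
    c_max = 1 / theta**2             # rough concentration limit, ~10,000

    moonlight_lux = 0.1
    sunlight_lux = 100_000

    print(round(c_max))              # ~11,000
    print(moonlight_lux * c_max)     # ~1,000 lux: the best you can do with moonlight
    print(sunlight_lux * c_max)      # ~1e9 lux: the best you can do with sunlight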

I'm willing to believe that you can't light a fire from the moon, if the intensity's a million times lower.




