The entropy argument is correct in the sense that, using radiation from a black body, we cannot use lenses to heat another body to a temperature higher than the original. It is easy to understand why: the first body has a temperature, the radiation has the same temperature, and if we apply that radiation to another object it will not heat up beyond the radiation's temperature.
The argument about the impossibility of concentrating light into a dot is also correct (although even if it were possible, we still would not get a higher temperature - the light would not be energetic enough for that). The important part is that we could concentrate light into a dot only if it consisted of parallel rays - i.e. only for an object that is infinitely far away.
The Moon-surface-temperature argument is incorrect. A body at 100 degrees Celsius does not radiate in the visible spectrum, so the light we see is not produced by the Moon's temperature. It is reflected sunlight, so the Moon's temperature doesn't matter. The Moon's surface does absorb some light, shifting the spectral composition from about 5.7 kK (the Sun's surface temperature) to about 4 kK. So we should consider the Moon to be part of the optics, not an emitter.
Hence the question is now: can we concentrate moonlight enough that the intensity at the focal point exceeds the thermal loss into the environment? (Only then will we be able to raise the temperature in that area enough for combustion - remember that the light is "hot" enough for this.) I don't have an answer for that - it needs calculation. What could be a deal breaker? Remember that the Moon is much closer than the Sun, so its rays arrive even less parallel, and the area into which we can concentrate light reflected from the Moon is even larger than the Sun's. Together with the lower intensity of moonlight, we might have trouble achieving the intensity necessary for combustion. However, a big enough lens will probably work.
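For a rough feel of the gap involved, here is a back-of-envelope sketch using typical textbook illuminance figures (the lux values are round assumed numbers, not measurements):

```python
# Back-of-envelope dilution of moonlight vs sunlight, using typical
# textbook illuminance figures (assumed round numbers).
SUNLIGHT_LUX = 100_000   # direct sunlight, clear day
MOONLIGHT_LUX = 0.25     # full moon, clear night

dilution = SUNLIGHT_LUX / MOONLIGHT_LUX
print(f"moonlight is ~{dilution:,.0f}x dimmer than sunlight")
```

A lens would need a concentration factor of roughly that size just to bring moonlight back to the level of ordinary, unconcentrated sunlight.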
And yes - I'm a physicist by training.
In short: a perfect reflector preserves etendue, but an imperfect one does not.
Thanks for pointing this out; I seem to have learnt of a new phenomenon I wasn't previously aware of: https://en.wikipedia.org/wiki/Opposition_surge
At this point I almost think an experimental measurement of temperature equilibrium would be publishable from one of the old re/frac/flectors you can put your head into.
The idea is that you can "organize" or "revert" any ray bundle with a system of non-absorbing lenses and specular reflectors, but if your reflector has billions of tiny irregularities it's not viable to build such a system (it's equivalent to an ideal diffuser, which reflects light isotropically). The ideal diffusion process is clearly not reversible by itself: if you shine a beam onto a diffuser it spreads the light; if you expose it to the same light (with reversed directions), it again diffuses it instead of reverting to the original beam. In theory the physical laws of electromagnetism are time reversible, but in practice the effort to revert some systems may be too demanding (you can even do better -- see Maxwell's demon); manipulating the physical apparatus and acquiring/manipulating information itself has a cost that surpasses any gains.
In practice, any reflector is imperfect, and therefore has nonzero emissivity, and therefore goes towards thermal equilibrium with its environment.
Sure, if we reshaped the moon into a giant mirror, we could use a lens to light a fire using the "moon".
Finally, paper burns at ~233 °C, but plenty of things ignite at lower temperatures. Hay's ignition temperature is ~130 °C, which, considering the moon reaches 127 °C and you would be concentrating extra reflected sunlight, seems very viable.
However, the earth's atmosphere blocks a lot of moonlight, so it's more likely to work in a pressurized cabin at high altitude or in a spacecraft. Even your lenses are going to be an issue.
PS: A lunar day is almost a month long. They are casting long shadows which means they are a long way from noon. https://en.m.wikipedia.org/wiki/Apollo_11#/media/File%3AAldr...
It's a bad mirror, not a radiator.
To the extent that the moon acts as a greybody under sunlight, it is correct. And like most things, the moon will be close enough to a greybody that you could use the surface temperature as a first order approximation. It isn't necessary of course, since you can easily just directly estimate the amount of scattered / re-radiated energy from the amount of sunlight falling on it, without needing to use the temperature.
>Hence the question is now - can we concentrate moon light enough so that intensity at the concentration point is higher than thermal loss into environment (only then we will be able to raise temperature in the concentration area enough for combustion - remember that light is "hot" enough for this)?
No, because the body you're trying to heat up will act as an approximate greybody, so (neglecting conduction and convection which could further lower the temperature) its temperature has to max out at the Stefan-Boltzmann temperature represented by the total radiation.
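A minimal sketch of that Stefan-Boltzmann ceiling, assuming emissivity ~1 and no conduction or convection (the ~3 mW/m^2 moonlight irradiance is an assumed ballpark figure):

```python
# Stefan-Boltzmann equilibrium ceiling: a body (emissivity ~1, no
# conduction/convection) absorbing irradiance E settles at the
# temperature where emitted power sigma*T^4 balances E.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(irradiance_w_m2):
    return (irradiance_w_m2 / SIGMA) ** 0.25

print(equilibrium_temp(1361))   # unconcentrated sunlight: ~394 K (~120 C)
print(equilibrium_temp(0.003))  # ballpark full-moonlight irradiance: ~15 K
```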
1. As you mentioned, the moon, acting as a greybody, absorbs part of the radiation and scatters the rest. In your second sentence you say that the moon's temperature is enough to describe this radiation. In your third sentence you say that you don't need the moon's temperature to do that. So which is it?
The correct answer: to describe the re-radiated energy we need the moon's temperature, but to describe the scattered light we don't. We can ignore the re-radiated part, since it is not visible light and not hot enough. We can use the scattered part, since it is visible and hot enough.
2. Not sure why it is "No" if you are agreeing with me. Also not sure what you mean by the temperature represented by the total radiation (as you mentioned in part 1, there are two parts: re-radiated, at the temperature of the moon, and scattered, at the temperature of the Sun).
To see that there is a problem with your argument, first note that the spectrum of black-body radiation dominates the spectrum of grey-body radiation (i.e. if the black-body radiation is not in the visible spectrum, the grey body's will not be visible either). Then consider that at 100 degrees C there would be no visible radiation. So the light that we see from the moon cannot be this cold. It's coming from the Sun, and it is hot enough (since it is in the visible spectrum).
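Wien's displacement law makes the spectral point concrete (constants rounded; 5772 K is the Sun's effective temperature):

```python
# Wien's displacement law: peak emission wavelength of a blackbody.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temp_k):
    return WIEN_B / temp_k * 1e9

print(peak_wavelength_nm(373))   # ~7770 nm: far infrared, invisible
print(peak_wavelength_nm(5772))  # ~502 nm: green, mid-visible
```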
When dealing with a gray body, the equilibrium temperature of the body will be equal to the effective temperature of the incoming light, which will be equal to the effective temperature of the re-radiated + scattered light, since at equilibrium energy out equals energy in. So, assuming the moon is a graybody (and most objects tend to be roughly graybodies), we can use the surface temperature of the moon in our calculations instead of the effective temperature of the light that falls on it.
Effective temperature is about total power (per the Stefan Boltzmann law), not about the color of the light (per Wien's law).
Compare https://en.wikipedia.org/wiki/Effective_temperature with https://en.wikipedia.org/wiki/Color_temperature
No, it won't, and no you can't. The OP already pointed out that the Moon is too cold for its blackbody radiation to reach the visible. All the visible light from the Moon is reflected from the Sun. The Sun's radiation's blackbody temperature is the ultimate limit, here, not the Moon's.
To have combustion we need two things: light of high enough temperature and light of high enough energy concentration. The two are not the same. All visible light has high enough temperature. The concentration, however, is the problem.
Why do we need high concentration for combustion? Because if we don't supply enough energy to compensate for heat loss, the area will never get hot enough.
By the way, an interesting corollary of this is that even with sunlight, if we have a mechanism that takes away heat fast enough (say, by blowing cold air at the area), we would not be able to reach combustion with a large mirror in direct sunlight.
An interesting demonstration of this idea is putting a paper cup full of water into a fire. It won't burn where the water is touching it and eventually the water will start to boil.
What do you think of the argument at this link, regarding maximum concentration achievable through optics?
I get that that implies a maximum concentration factor of about 10,000, for a light source subtending an angle of 0.54 degrees (the moon's angular size from Earth, and also approximately the sun's.)
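For what it's worth, the etendue bound C_max = 1/sin²θ reproduces the ~10,000 figure if the full 0.54° angular diameter is plugged in as θ; using the 0.27° half-angle (the usual convention) gives roughly 45,000:

```python
import math

def max_concentration(theta_deg):
    # Etendue limit for concentrating light from a source that
    # subtends half-angle theta (in air, 3D concentrator).
    return 1.0 / math.sin(math.radians(theta_deg)) ** 2

print(max_concentration(0.54))  # full angular diameter: ~11,000
print(max_concentration(0.27))  # half-angle: ~45,000
```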
Dylan16807 helped to prod me in this direction. https://news.ycombinator.com/item?id=18739190
Thermodynamic arguments are always good, but you really need to understand that most of them describe closed systems at equilibrium. The real world is messier, and if your understanding of thermodynamics is shaky, it's easy to reach wrong conclusions. The black- and grey-body discussion is in the same realm, so we will avoid it here.
Let's consider a piece of wood and see what it takes to combust it. We need two things: oxygen and a high enough temperature. We are keeping this piece of wood in air, so we have oxygen. The temperature measures how fast the particles composing the wood are moving. The wood consists of molecules, which in turn consist of atoms. You can imagine a molecule as a bunch of atoms connected to each other by springs (the springs are created by electromagnetic forces when atoms lend and borrow electrons). The higher the temperature, the larger the oscillations of these springs. When the temperature is high enough, some of these springs can break, and the freed atoms combine with oxygen to release energy - combustion.
Suppose light strikes the piece of wood. What happens? Photons hit molecules and can interact with an electron or proton, exchanging momentum (and energy), which means one of the atoms in a molecule gets a bump and the springs start oscillating harder.
Note that low-energy photons cannot swing the springs much. We need photons of high energy to swing a spring to a high temperature (OK, this part is oversimplified; I can expand on it if there are questions). The energy of a photon depends on its wavelength: the shorter the wavelength, the more energetic the photon. So we need photons with enough energy to break molecular bonds (springs) to be able to heat the wood up to combustion temperature. Visible light definitely has photons of such energy.
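The photon-energy point can be checked with E = hc/λ (rounded constants; "a few eV" for chemical bond energies is a typical order of magnitude, not an exact value):

```python
# Photon energy E = h*c/lambda, expressed in electron-volts.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

print(photon_energy_ev(550))    # green light: ~2.3 eV
print(photon_energy_ev(10000))  # thermal infrared: ~0.12 eV
```

Visible photons carry energies comparable to chemical bond energies; thermal-infrared photons carry an order of magnitude less.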
So we start sending light at our piece of wood, and it starts heating up. But it sits in air and is probably held by some supporting stand, so it exchanges heat with the surrounding materials. To overcome this we need to send lots of photons at the wood. How do we do this? We use a lens to collect photons from a broader area and direct them at the wood. Note that it doesn't matter where the photons came from, whether they were scattered or produced by the Sun, what the temperature of the scattering surface was, or even what the temperature of the Sun was. All that matters is whether we can collect enough high-energy photons. So we would need to calculate the energy flux of visible-light photons at the Earth's surface and see whether we could come up with a lens to focus these photons on the piece of wood with sufficient intensity.
We know that we can see the Moon in visible light, so it does send a bunch of energetic photons our way. All these photons come from the Moon's direction (so we don't care that the Moon scatters them in other directions as well). The task, then, is to find the energy flux of these photons at the Earth's surface and the size of the lens we would need to get sufficient intensity.
Whilst this is true, it is also a bit irrelevant. Gamma rays are higher energy than visible light, but you don't need to focus gamma radiation (should that be easily possible) to start a fire; you can do it with visible light, or with infrared light (which is longer wavelength, i.e. lower energy) at a high enough flux of quanta.
Equivalently, higher energy photons are higher frequency.
If the Moon was actually a giant mirror-lens thing that was the size of the Moon, but perfectly shaped to reflect and concentrate sunlight shining on it into as small an area as possible on the Earth's surface, you'd start a fire with that very easily. Probably turn a county-sized area of the crust into lava or something actually.
On the other hand, if the Moon was a perfect reflector, but a big sphere/hemisphere pointed at the sun, then it would receive a lot of high-energy radiation, but reflect it back out in every direction all over the universe. I'm not sure of the math for that offhand, but it sounds difficult to concentrate it back to something even as concentrated as direct sunlight. I suppose it would involve considering the fraction of the total solid angle the reflecting area of the Moon would shine at. Even the entire Earth's surface would be a really tiny fraction of that area, so sounds impossible to concentrate things back to fire-starting intensity.
Assuming these scenarios are basically right, then the actual Moon is a lot closer to the second than the first. So starting a fire isn't practical with any lens system you could build on Earth. Maybe you could build some massive mega-lens thing in space near the moon that concentrated enough of it back on the Earth to be near-sunlight brightness. Or just set up a more manageable sized reflector on the Moon itself, or in space, to reflect sunlight in a more concentrated way to a small area on the Earth.
I'm sure they don't want smoke in the observatory, but you could do it in a box. Then you could fill the box with variable oxygen levels (to make up for altitude) and substitute N2 with Ar/Kr/Xe to change the heat dissipation.
This is an interesting argument. Can I not reflect some sunlight off a mirror, then do the magnifying-glass-to-start-a-fire trick in daytime? Doesn't the mirror stay cool? Isn't the moon just a (poor) mirror for the sun's light?
Your mirror in sunlight works because the reflectivity or albedo of the mirror is very high relative to whatever target you're lighting on fire.
In a magical closed system where radiative heat transfer was the only factor, objects of differing reflectivity would eventually reach temperature equilibrium through black body radiation.
We aren't interested in closed systems though. The moon's temperature is set by the equilibrium between incident solar radiation and black body emission, most of which flies off into deep space making the system very open. Just like your mirror's temperature is the equilibrium of incident radiation, black body cooling, and convective cooling in Earth's atmosphere.
If the moon had a high reflectivity and/or a powerful cooling mechanism like convection, its equilibrium temperature would be far lower than the temperature of its emitted + reflected light. Unfortunately the moon's albedo is just 0.12 and black body radiation is all it's got, so the modest difference between its temperature and that of its light isn't enough to start a fire.
If this seems to fly in the face of all common experience, it's because things that look white aren't actually white in the infrared. Their emissivity is lower in the optical spectrum (where they absorb sunlight) than in the infrared (where they emit), so they are cooler than things that look black to us. In short, they aren't grey-bodies.
The situation is also more complicated because, as you say, that things we are used to also cool by other mechanisms.
According to the Stefan-Boltzmann Law, the energy put out in blackbody radiation is proportional to temperature to the 4th power. Therefore 5000-degree light reflected off an object with albedo 0.12 carries 0.12 times the energy it originally did, while a 2500-degree blackbody only puts out (1/2)^4 = 0.0625 times as much. So the "temperature of the Moon's light" should be more than hot enough to light something on fire if it is focused right.
As a sanity check, compare how much light the moon puts out as a black body in shadow with what it reflects from the Sun. As another sanity check, compare how bright the Moon is versus a fire.
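A rough version of the first sanity check, with assumed round numbers (solar constant ~1361 W/m², dark-side temperature taken as ~100 K):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0    # solar constant at the Moon's distance, W/m^2
ALBEDO = 0.12

reflected = ALBEDO * SOLAR        # ~163 W/m^2 of scattered sunlight
shadow_emission = SIGMA * 100**4  # ~5.7 W/m^2 emitted by a ~100 K dark side
print(reflected, shadow_emission)
```

The sunlit surface scatters roughly 30 times more power than the cold dark side emits, and only the scattered part is visible light, consistent with the dark limb of the Moon looking black next to the lit part.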
What am I missing here?
The moon is diffuse, so an incoming ray of sunshine is spread by the optically rough surface of the moon from an incident solid angle of 6 * 10^-5 steradians out into 2 pi steradians of the night sky, or a reduction in angular concentration by a factor of about 100,000 (totaling ~1 million after the albedo is accounted for).
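The dilution factors quoted above can be reproduced directly from the numbers given in the comment:

```python
import math

OMEGA_SUN = 6e-5         # incident solid angle of sunlight, sr (from the comment)
OMEGA_OUT = 2 * math.pi  # a diffuse surface scatters into a hemisphere, sr
ALBEDO = 0.12

angular_dilution = OMEGA_OUT / OMEGA_SUN    # ~100,000
total_dilution = angular_dilution / ALBEDO  # ~1 million
print(round(angular_dilution), round(total_dilution))
```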
It is like you are looking at the sun through a mirror so rough that the image of the sun is blurred over literally half the sky. Because this process does not create new photons, the blurred image must be far, far dimmer. This circumstance corresponds to the "most we could do" with lenses and mirrors focusing the moon, which is to fill the sky with an image of the moon/"blurred sun".
Unconcentrated moonlight corresponds to the same picture, except we only see a "cutout" disk of this blurred sun-image which is the size of the moon in the sky. Our crappy moon-mirror does not fill our vision, it is a porthole letting through only a tiny fraction of the blurry sun-image.
And of course if you imagine yourself as the ant under the magnifying glass, with your entire sky filled with moon, there is no way you could spontaneously become hotter than your moon-y surroundings.
However the key point is this. The Moon is in reality 400 times smaller than the Sun. Which means that an optimally placed lens can actually make the image of the Moon 400 times smaller, for 160,000 times as little area. If the lens is big enough that this target area is mostly losing heat through black body radiation, again it should wind up over half the temperature of the surface of the Sun. Which is more than hot enough to start a fire.
I don't understand this bit. I don't think there's any power law involved in reflection, if something has an albedo of 0.12 then it just reflects 0.12 times the incident radiation, doesn't it?
I think I agree with your overall point, though, which is that the moon isn't a black body radiator (well it is but only at a couple of hundred degrees C at most) but is just reflecting the sun's light (and those photons are hot enough to start fires).
Fiber optics could probably collect and point enough moonlight to light a match. If we call fiber a sort of flexible lens, then lenses can start fires.
If we imagine a full sphere of inward-pointing fibers, each one "looking" at the moon, then we see moon-surface in all directions from within this contraption. We are in a thermal bath of moon-temperature. We will not get hotter than the moon.
And fibers are not needed to create this circumstance. The same situation ("moon visible in all directions") could be created with a few lenses and mirrors.
The problem with sunlight reflected off the moon is that it's like having a very low transmittance ND filter in the optical path. There aren't a whole lot of photons entering the aperture, and as the material begins to heat, there are more photons and overall probably more energy going out. (I'm not giving an answer here, just some thoughts.)
Note too that I said match. Getting a tiny bit of a matchhead hot enough to decompose isn't the same as burning an ant. A matchhead on the surface of the moon would probably ignite just fine (~150° + O2).
(Well, the fiber you’re thinking of is. Fluorescent fiber is not, but that’s another story.)
> None of the fibers would "look at the moon". Fiber isnt the same as lenses.
I think you meant to ask "how is a fiber different from a lens?" instead of asserting "fiber isn't the same as lenses," which turns out to be completely wrong.
To a physicist, a lens is a shaped piece of refractive medium -- that is exactly what a fiber is. There is no magic inside the fiber; it does not add any photons. You are looking through a shaped piece of glass, through which you will see a (possibly very distorted) image of what's on the other side.
In this case, it's the moon that is on the other side. The most you could do is surround your match/ant/whatever with fibers/lenses which are showing the moon on the other side of them. And from this fact it is unavoidable that you cannot make your surrounded subject hotter than the surface of the moon. If you are trying to say anything to the contrary, I'm sorry, but you are Flat Wrong.
Any other configuration of fiber would give you less moonlight than completely surrounding the subject, so it's not going to improve the odds of starting a fire.
> Note too that I said match. Getting a tiny bit of a matchhead hot enough to decompose isn't the same as burning an ant.
I think you meant to ask "does the material or shape change anything?" instead of asserting "it isn't the same," which again, in the context of this question, turns out to be completely wrong.
The maximum temperature that can be imparted by a lens/mirror system doesn't depend on what it's focused on at all. In the long run, the ant/match/whatever will reach that maximum temperature and get no hotter. From a thermodynamics perspective, it is like putting an object in an oven at a particular temperature: it doesn't matter if you put in a brick, or a cake, or a match, or an ant; eventually they will all be 250 degrees if the oven is set to 250 degrees. In this case, the "oven" is the moon, and it's set to about 120 °C. A match combusts at about 600 °C. There will be no combusting of the match in the moon-oven. Sorry. End of story.
And if I can give some advice: if what you're hearing doesn't make sense to you, it is always safer to ask a question than to blindly assert your gut-guess of how you think it is. Someone who thinks they're an expert but spouts nonsense looks like a fool. Someone who asks a question looks curious, which is smart. These two people have the same degree of knowledge, but one comes off looking much worse than the other. That's why asking questions is better, especially when you don't know how much the people listening know.
Archers see this. Many archery sights use fiber to make nice illuminated dots, without batteries or LEDs. The light appearing out the end of the fiber is brighter than its sides, a rare practical use of "naked" optical fibers.
The above are junk fibers (little internal lensing), but you can see the effect.
No, reflection and black-body emission are different. The moon primarily produces the former.
Moonlight is not the result of the moon glowing incandescent.
The surface is very cold, its emissions as a black-body radiator are in the far infrared.
Moonlight is white light with a color temperature of several thousand Kelvin. This is reflected sunlight and has nothing to do with black-body radiators.
You could start a fire from the light of a single star if you had a big enough lens. You could also start a fire with the light from a hand mirror at the distance of the moon if it reflected the sun's light at you. But you'd need a very big lens.
That's my impression as well. A blackbody has albedo 0. The moon has an albedo of around 0.12. While I suspect you can't start a fire from moonlight in practice, I don't think the arguments in this article are correct.
I guess if you're reading that as some kind of absolutely logical argument. I read stuff like that as an abstraction which just kinda works in the messy real world. Pretty much like how the typical explanation for how a wing works turns out to be an over-simplification. It only partly works that way. Actual wings are complicated, but in the aggregate, they just get enough air molecules to go downward to net out the forces to keep the plane from going downward. It turns out that there are a lot of mechanisms contributing to this all at once. (Which is something else he discusses in that series.)
Ivanpah may be able to, I wonder if we can borrow it at night for an experiment - https://en.wikipedia.org/wiki/Ivanpah_Solar_Power_Facility
edit - And by whom. Should probably get it in early, just in case.
It's mostly not reflected; it's mostly scattered. If it were specularly reflected, there would be a bright spot on the moon (where you could see the reflection of the sun), and you'd be able to concentrate light from that to light a fire.
The only reason the moon's surface doesn't reach such high temperatures is that the moon's surface is not thermally isolated from the rest of the night sky or from the body of the moon itself (which is also not isolated from the night sky).
The more general argument based on etendue, which applies to the moon, is: you can't make the incoming light any brighter than it is on the surface of the source (which doesn't have to be the original emitter, but can be any point along the path of the light). As a corollary this happens to mean you can't really make something hotter than a rock on the moon.
Suppose the moon was actually a flat mirror. Standing on the mirror-moon, you look at the ground. In most directions you would see the darkness of space (with a few reflected stars), but in one spot you would see about ~(0.5°)^2 of solid angle of extreme brightness - the sun's reflection in the mirror. Standing on the mirror-moon, you could certainly use a magnifying glass to heat something to ignition temperature (ignoring the lack of oxygen) by making its environment that bright using the reflected light.
Similarly, if the moon was a mirror, you could certainly use the reflected light to start a fire standing on earth (provided you're lucky enough for the moon/earth/sun to line up just right so that the reflection of the sun is visible through small angle subtended by the moon from earth). It would basically appear as another sun in the sky when lined up properly.
In reality, instead of reflecting light like a mirror, the moon scatters light in all directions. Standing on the actual moon, looking at the ground, what you see is a lot of moon dirt, all of which about equally bright (ie. not very). The scattering smooshes the sun's light out in all directions, ensuring there's no visible "reflection" of the sun in any direction.
The most you can do with this scattered moon light is make the environment of an object as bright as the (mediocre) brightness of the moon rocks. But moon rocks already experience that environment of mediocre brightness, and reach only 100°C, so you won't be able to make your object much hotter than that.
I'm bothered by the modifier "much". If you are indeed talking about a physical principle, shouldn't this be an absolute limit rather than a suggestion? How much hotter does physics allow you to go? Are you sure it's not enough to allow ignition?
Along those lines, I'd assume that the surface temperature of the moon depends on the moon's shape and thermal conductivity. If I were to change the moon into an ultra-thin, highly heat-conductive hemispherical shell rather than a solid sphere, I'd assume the surface temperature would drop.
Assuming the amount of light reflected remains the same, does this imply that the maximum achievable temperature on earth with a magnifying glass drops as well? I don't see any physical reason that it should, but your logic would seem to imply that it must. Can you explain?
It's an absolute limit on the amount of incoming irradiance you can create to your object. The actual equilibrium temperature it reaches will depend on additional factors like how well your object loses heat (eg. by conduction) compared to a moon rock.
In this case, the temperature of moon rocks is probably a reasonable upper bound of the achievable temperature of an object on the earth:
- Moon rocks are in vacuum, while something on earth is in contact with air and dissipating heat by convection.
- Moon rocks are in contact with the surface of the moon (~100°C), whereas an earth object is in contact with the ground, or your hand, or whatever (~37°C, assuming your hand). So heat loss by conduction will be greater on the ground.
If you rigged up something to suspend your object in vacuum without touching anything so that conductive heat losses ~0, maybe you could get something slightly hotter than the average surface temperature of the moon. But not hotter than a well placed moon rock that already happens to be making near 0 contact with the moon's surface (due to standing on a point or something).
> If I were to change the moon to be an ultra-thin and highly heat conductive hemispherical shell rather than a solid sphere, I'd assume the surface temperature would drop.
In that case much more heat would escape around to the unlit side, and the moon's surface temperature would reach somewhere between the "day" (~100°C) and "night" (~-200°C) temperatures. Say around -50°C. In that case the surface temperature will be less representative of that achievable for an object on earth. A moon rock touching the ground would be in contact with -50°C, which is colder than the 37°C for an object held in your hand.
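The two limiting cases (slowly rotating solid surface vs. a perfectly conductive sphere) can be estimated from the Stefan-Boltzmann law; under these assumed round numbers the conductive case comes out nearer -4 °C than -50 °C, but the direction is the same:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0    # solar constant, W/m^2
ALBEDO = 0.12
absorbed = SOLAR * (1 - ALBEDO)

# Slowly rotating solid surface: the subsolar point re-radiates locally.
subsolar = (absorbed / SIGMA) ** 0.25          # ~381 K (~108 C)

# Perfectly conductive sphere: power absorbed over the cross-section
# (pi*R^2) is emitted over the whole surface (4*pi*R^2), diluting by 4.
isothermal = (absorbed / (4 * SIGMA)) ** 0.25  # ~270 K (~-4 C)
print(subsolar, isothermal)
```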
[The surface temperature of the moon is] an absolute limit on the amount of incoming irradiance you can create to your object.
This is true for a black body, but why are you convinced this is true for the actual moon? I think we agree that a more reflective moon could have a lower surface temperature while increasing incoming irradiance on the earth. And we both agree that the moon is partially reflective. Doesn't this mean that the surface temperature is not an absolute limit?
I think the correct statement is that the intensity of light from the sun to the moon gives a limit on both the surface temperature of the moon (highest if we assume the moon is a blackbody) and a limit on the amount of sunlight reflected toward the earth (highest if we assume the moon is a perfect reflector).
Since the moon absorbs about 90% of the light incident on it, we can assume that the surface temperature is lower than it would be if it was a perfect black body, presumably reaching a temperature corresponding to a sun that was about 10% less strong. The 10% of light that is reflected, although diffused in all directions, is much more intense when viewed from earth than low energy blackbody radiation that is also emitted. We know this intuitively because the sunlit moon is much brighter at night than the non-sunlit portion, and because the visible light is more energetic than the infrared, but could integrate across the energy spectrum to find an exact answer.
As such, unless we are willing to make some additional assumptions, I don't think we can make any firm claim about the maximum temperature achievable on the earth using lunar-reflected sunlight based only on knowledge of the surface temperature of the moon. In practice, the scattered sunlight doesn't provide a lot of energy, so heating with it will be difficult. But it's the energy incident on the earth that matters, not the temperature of the lunar surface.
Would you agree with this summary? Are there additional assumptions that you think should be added that would provide the tighter limit you want? Alternatively, is there something other than "[The surface temperature of the moon is]" that you think I should have substituted for "It's"?
It's the radiance (brightness) of the moon (as perceived at the moon) that is an absolute limit on the incoming irradiance you can create, because of conservation of etendue.
Separately, the fact that moon rocks (which experience that exact amount of irradiance) reach some given temperature X°C while losing very little heat to conduction (certainly less than an object on earth would), shows that this level of irradiance is insufficient to heat up your kindling above X°C.
This argument has nothing to do with the moon being a black body, or an approximate black body, or any such thing. Just the fact that moon rocks, which are exposed to this light (with very little heat conductive losses), function as a kind of a thermometer which tells you how hot that light can make something; and the answer is "a bit over 100°C"
> Since the moon absorbs about 90% of the light incident on it, we can assume that the surface temperature is lower than it would be if it was a perfect black body, presumably reaching a temperature corresponding to a sun that was about 10% less strong.
No... An object with 90% absorbance absorbs 10% less light energy, yes. But it also emits 10% less light energy, so the two effects cancel out and it reaches the same equilibrium temperature as a black body.
Well yes, this is what it should say! Is this our disagreement? Because for me (and I think for most other dissenters in this thread) the whole problem we have with Munroe's argument is that he keeps coming back to the surface temperature of the moon as the limiting factor. If he were simply to say that the moon is not bright enough, then we'd probably all agree.
the fact that moon rocks ... reach some given temperature X°C ... shows that this level of irradiance is insufficient to heat up your kindling above X°C.
I think this is the real point of dispute. We agree that this is true if we are only considering pure blackbody radiation. What's not clear (at least to me) is that this equivalence is still true when you include the directly reflected light. That is, no one thinks that you can start a fire using only the thermal infrared light from a dark moon. The question is whether it's hypothetically possible with a sufficiently bright sun and sufficiently reflective moon, without raising the surface temperature. Can you point to something that makes this argument more directly?
the two effects cancel out and it reaches the same equilibrium temperature as a black body
I need to learn more about this. I have trouble thinking it applies correctly here, because it's assuming the moon is a perfect gray body. I think this assumption falls apart if it's actually reflecting light, which in fact we know it is. Or am I wrong? Does a silver mirror in space actually end up at the same equilibrium temperature as a lump of coal? I guess it could. This wouldn't harm my argument much (the argument just requires that the temperature not increase), but would indicate that I'm not viewing things correctly.
Summarizing, I think the point of dispute is whether the surface temperature of an object in space can always be reasonably estimated from its brightness (and vice versa). We agree that it can be if it's a perfect black body. We agree that it's mathematically true if it's a "gray body". We disagree (I think) as to whether it's appropriate to make the simplification of assuming that all stellar objects are sufficiently close to "gray bodies" for the math to hold.
 Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body, noting that the surface temperature does not increase. I'd argue that when you increase the reflectivity, the moon gets brighter, and thus you can heat your object to a higher temperature. You seem to be arguing that because the surface temperature remains the same, the attainable heat stays the same, even though you can collect more reflected energy.
Yes. It takes longer to reach the equilibrium, but it does reach the same equilibrium temperature.
>  Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body, noting that the surface temperature does not increase.
Right, so, let's say the total solar power received by the blackbody moon is Ps. The moon increases in temperature until it reaches temperature Tb, at which the power emitted equals the power absorbed Pb = Ps. That is the equilibrium.
Now let's let the moon reflect (scatter) some light, by giving it a realistic albedo of 10%. Then of the solar power received at the moon, 0.9Ps is absorbed and 0.1Ps is scattered. But this albedo also causes the moon to emit 10% less light, so it now emits 0.9Pb at the same temperature Tb, hence it reaches equilibrium 0.9Ps = 0.9Pb at the same temperature.
Note that the total light energy leaving the moon is now: 0.9Pb (emitted thermal) + 0.1Ps (scattered) = 0.9Ps + 0.1Ps = Ps. Just the same as the amount that was emitted as a blackbody. So in fact it's not any brighter (in terms of power), just has a different spectrum.
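The cancellation can be checked in a few lines of Python (a sketch; the 1361 W/m² solar constant and 10% albedo are the figures assumed above, and the gray-body assumption means emissivity equals absorptivity):

```python
# Gray-body equilibrium: emissivity = absorptivity, so albedo drops out.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(flux_in, albedo):
    # Balance: (1 - albedo) * flux_in = (1 - albedo) * SIGMA * T**4
    absorbed = (1.0 - albedo) * flux_in
    emissivity = 1.0 - albedo  # Kirchhoff's law for a gray body
    return (absorbed / (emissivity * SIGMA)) ** 0.25

flux = 1361.0  # W/m^2, roughly the solar constant at the Moon's distance
t_black = equilibrium_temp(flux, 0.0)  # black body
t_gray = equilibrium_temp(flux, 0.1)   # 10% albedo
print(t_black, t_gray)  # identical: about 394 K, i.e. "a bit over 100 C"
```

The two temperatures come out exactly equal, and close to the measured subsolar temperature of the lunar surface.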
The first is that the applicable "mirror" question here is whether the surface temperature of the sunlit side of a slowly rotating astronomical body is independent of reflectivity, not whether the eventual core temperature of a uniformly lit body is independent. Are you confident that the same reasoning applies? I'm not yet.
Next, if we are phrasing our question as to whether one can start a fire with a magnifying glass, we clearly do care about the difference between visible sunlight and low temp infrared. For Munroe's argument to really work, the surface temperature needs to be proxy for the collectible visible light that would be used by a magnifying glass.
There's also the directionality: none of the visible light is going to be reflected toward the "dark" side. The thermal radiation is also directional, but not to the same extent. As we move toward greater reflectivity, presuming a bright full moon, we do get more of the total energy available on earth. How much more? I don't know.
Lastly, which we haven't discussed, there are options for insulating the heated object on earth that are optically transparent (to allow the concentrated light in) but infrared reflective (to prevent thermal radiation from escaping). I think this "privileges" the reflected sunlight so that we might indeed be able to achieve a higher temperature than the reflective moon surface in vacuum.
I feel like you recognize these factors also, by your caveats that you might be able to get "a little bit" higher than the surface temperature. Without committing to a number or methodology, your implication is that this "little" must be small relative to the surface temperature of the moon, rather than small relative to the optical temperature of the sun. While I agree that the surface temperature is related to the achievable temperature for collected reflected light, I still don't think that the exact temperature is a hard limit.
I'll try to bow out here and not take up more of your time. You've definitely helped me to think through the issues here. If you happen to be interested in a somewhat parallel situation, you might enjoy this article that describes a cyclical water collection system based on collecting solar energy through an aerogel during the day, then using a condenser optically coupled to the dark sky at night: https://www.nature.com/articles/s41467-018-03162-7. Only loosely related to Munroe's hypothetical, but shows a real application of some of the same concepts.
Edit: I just noticed a link on the second page of this thread that had some useful discussion that might interest you: https://physics.stackexchange.com/questions/370446/is-randal.... If you expand out all the comments on the answer, Shor seems to be making the same argument you are, and Lalinský is making a better version of mine.
The mirror is a part of your hybrid reflective/refractive light concentration system. Of course it stays cool, otherwise your system is inefficient.
Doesn't the mirror stay cool? Isn't the moon just a (poor) mirror for the sun's light?
Precisely. So the maximum temperature that system can achieve is much less than your mirror/magnifying glass system. A perfect black body radiator is just about the poorest mirror there is. The moon's light isn't what you'd get from a perfect black body. However, it's a long, long way from being a perfect mirror and probably far closer to being a black body.
Why? If the moon were a perfect black body, but with a few small perfect mirrors on the surface to bounce a bit of sunlight towards Earth, couldn't we (in theory) focus that tiny bit of sunlight to heat something up to the temperature of the Sun? There may not be enough solar radiation to make it feasible in practice, but it seems like you need to actually do the math instead of just saying "the moon's surface temperature isn't high enough to start a fire, Q.E.D."
To get past the diffraction limits, those "few small perfect mirrors" on the Moon would have to be gargantuan. (If I'm remembering correctly, they'd be the size of the alien space ships in Star Trek that make the Enterprise-D look teeny.) Otherwise, viewed from the surface of the Earth, those mirrors would just constitute a minuscule brightening of the sky in the direction of the Moon.
Are you saying that mirrors do something other than absorbing and re-radiating light? In most explanations I've seen, this is exactly what they do:
How does the mirror reflect light? The silver atoms behind the glass absorb the photons of incoming light energy and become excited. But that makes them unstable, so they try to become stable again by getting rid of the extra energy—and they do that by giving off some more photons. (You can read about how atoms take in and give out photons in our article about light.) The back of a mirror is usually covered with some sort of darkly colored, protective material to stop the silver coating from getting scratched, and also to reduce the risk of any light seeping through from behind. Silver reflects light better than almost anything else and that's because it gives off almost as many photons of light as fall on it in the first place. The photons that come out of the mirror are pretty much the same as the ones that go into it.
Quick searching on how the astronauts survived this: it turns out they seem to have timed landings to the lunar dawn for an in-between temperature, that the lunar surface doesn't conduct heat well (all dust?), that of course there's no atmosphere to conduct heat, and that their boots were extremely well insulated.
There's a pretty good song about how the moon's day/night cycle would affect a lunar mining colony: https://www.youtube.com/watch?v=GDPUdUGJpjc
This is one of the reasons why lunar bases will actually be more difficult than people think. Dealing with extreme environmental conditions is much easier when they are stable, then you can design around them and deal with them. Dealing with conditions that cycle from one extreme to another continuously over extended periods of time is much more challenging. One point of proof of that is the longevity of the Martian and lunar rovers and landers. Many landers and rovers on Mars have lasted for years, some have lasted over a decade. No lunar rover has been able to maintain roving operations longer than a few months. This despite the fact that lunar rovers are in near real-time contact with Earth continuously. The heat/cold cycles and hyper-abrasive clingy dust make the lunar environment particularly harsh on equipment.
The rest of the argument seems to be that light cannot be optically condensed to a single point, as there will always be some dispersion due to diffraction, and the size and shape of the dispersion is dictated by that of the source. That is, you can't make the target denser than the source, hence the temperature must be less.
EDIT: Several commenters submitted that lenses are reversible while solar panels are not, and this makes all the difference. My retort is that I can make non-reversible lenses by covering them in a thin layer of dust. Since the lens system is now non-reversible, can I use these sub-par lenses to create higher temperature than I could with clear lenses?
There's a difference. By going through solar panels you would increase entropy because the conversion of sunlight to electricity can't be 100% efficient. And if you allow the solar panels to heat up too much because of that waste heat, then that efficiency will drop further.
If a lens could heat a point to a temperature higher than the sun, then there'd be no such loss and you'd be magically decreasing entropy.
Where is this reasoning wrong?
This boils down to a moderately heated black body receiving a large stream of moderately powered photons and either rejecting them or first absorbing and then radiating them away at the same pace without changing its own temperature, regardless of how many photons are coming in.
I think that this issue is addressed by footnote 2
> And, more specifically, everything [lenses and mirrors] do is fully reversible—which means you can add them in without increasing the entropy of the system.
Presumably we can't say the same of the electrical devices that we use to collect, store, and transmit sunlight, or to create high temperatures from this stored energy. For example, if you use a "lunar panel" to store energy in a battery and then heat something on an electric stove, many of the components in this process will not be reversible, differently from mirrors and lenses. (Putting a hot object on top of the stove won't cause the lunar panel to emit light back in the direction of the moon!)
So I think footnote 2 is actually very important, because it's not that we can never use any energy source to create something hotter than that source, it's that we can never do so using only reversible processes, including purely passive optics.
I guarantee you this can be done. Arc welding goes to many thousands of degrees, up to 20,000 C. A square mile of solar panels will most certainly give me enough energy to power an arc welder. In fact I will probably get away with just 100 kW of power, so 500 panels give or take.
The solar panel system is different - it's a heat engine that takes Q units of heat energy from one body, transfers a portion of it Q₁ to a higher temperature body and transfers the remainder of it (Q - Q₁) to a lower temperature body (the immediate surroundings). That's OK with thermodynamics, subject to upper bounds on the ratio Q₁/Q that depend on the absolute temperatures of the bodies involved.
This suffices to show that the perfect lens system can't work. The dusty lens system can't work because adding dust to a perfect lens doesn't turn it into a heat engine.
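The entropy bookkeeping for that three-body process can be sketched directly (a toy calculation; the temperatures are illustrative, not from the thread):

```python
def max_q1_over_q(t_source, t_target, t_env):
    """Second-law bound on Q1/Q for the process described above:
    Q leaves the source at t_source, Q1 lands on the hotter target at
    t_target, and Q - Q1 is dumped to the surroundings at t_env.
    Total entropy change must be non-negative:
        -Q/t_source + Q1/t_target + (Q - Q1)/t_env >= 0
    Solving for Q1/Q gives the bound below (temperatures in kelvin)."""
    return (1.0 / t_env - 1.0 / t_source) / (1.0 / t_env - 1.0 / t_target)

# Illustrative: sun-temperature source, 3000 K target, 300 K surroundings.
bound = max_q1_over_q(5772.0, 3000.0, 300.0)
print(round(bound, 3))  # slightly above 1: with a cold reservoir to dump
                        # entropy into, the target may even receive a bit
                        # more heat than left the source
```

The key point is that the bound depends on all three absolute temperatures, and it collapses to zero net transfer when there is no colder reservoir, which is exactly the passive-lens situation.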
Except it's not "heat energy", it's electromagnetic energy that's getting transferred. If the target object is smaller than the source object, you can have an amount of energy that will increase the target's heat above that of the source.
Thermal radiation == electromagnetic radiation.
It's okay because solar panels are not perfectly efficient while lenses are, in theory, perfectly efficient. That efficiency loss is in essence the "cost" of moving heat from a colder to a hotter place. I'm guessing, but don't quote me on it, that solar panel efficiency is related to the temperature of the sun and the local environment just like any other heat engine.
You are correct:
(1) can't heat an object to a higher temperature than a source object using lenses - you can't transfer heat from a cooler object to a warmer object without work and lenses don't provide a mechanism for work. This still leaves room for reflected sunlight from moon albedo getting something hot enough to burn, at least from a thermodynamic perspective.
(2) you can get something arbitrarily hot using lunar panels, this is just a bad heat pump (max COP 1) where you feed work generated by the solar panels into a perfectly thermally isolated system with a resistive heater, and we could do this a bit more efficiently with a true heat pump.
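For point (2), the "lunar panel plus resistive heater" scheme really can reach arbitrary temperatures given a small enough emitting area. A toy energy balance (all numbers illustrative, assuming a perfectly insulated blackbody heater):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def steady_state_temp(electrical_watts, emitting_area_m2):
    # A perfectly insulated, resistively heated blackbody loses energy
    # only by radiating through the given area; at steady state the
    # radiated power equals the electrical power in:
    #   electrical_watts = emitting_area_m2 * SIGMA * T**4
    return (electrical_watts / (emitting_area_m2 * SIGMA)) ** 0.25

# Even a feeble 10 W of panel output gets very hot through 1 mm^2.
print(steady_state_temp(10.0, 1e-6))  # thousands of kelvin
```

Shrinking the radiating area plays the same role here as the "work" in a heat pump: the panel-plus-heater chain is irreversible, so no etendue limit applies.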
I am dumping some amount of low-energy photons onto a target, the target is heated up and radiating out the same amount of energy it receives. As I increase the number of photons hitting the target its temperature raises and it radiates more heat outwards, shedding excess heat.
Then as I keep increasing the number of photons, the temperature stops increasing at some point having reached the temperature of the photon source. I increase the number of photons a million times of the previous level, yet the object remains the same temperature.
Where does the excess energy go? Is it not absorbed by the target? Is it reflected? Is it re-radiated?
Initially it's in a steady state where the target is colder.
You increase the number of photons hitting the target. The target gets hotter. All good so far.
But we have to look at how you increased that number. You need to either move the source and target closer together, or you have to use lenses to simulate the same thing.
You can keep hitting the target with photons from more and more directions, but no system of mirrors or lenses can increase the apparent brightness of the source.
As you approach the point where the source fills the entire sky, the target will approach the same temperature.
And then you can't increase the number of photons any more. Not without using a hotter source.
There is a limit to how small you can focus a lens. Once the object you are focusing is in sharp focus, it will start to go out of focus again if you go past that point. So you can’t just arbitrarily increase the photon density further.
If you make your lens bigger to gather more photons, then the focused image just grows by a corresponding amount.
Optics (and thermodynamics) tell us that we can not generate a higher flux of photons into a target area, than left the equivalent area at the source. Assuming that both the source and target are black body radiators, that says the target can’t get hotter than the source.
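A back-of-envelope version of that limit, assuming the sun's angular radius as seen from Earth (~0.267°) and the solar constant: conservation of etendue caps concentration at 1/sin²θ, and at that maximum the focused flux essentially matches the flux leaving the sun's own surface.

```python
import math

SIGMA = 5.670e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR_CONSTANT = 1361.0      # W/m^2 arriving at Earth
T_SUN = 5772.0               # K, effective surface temperature of the sun
theta = math.radians(0.267)  # angular radius of the sun seen from Earth

# Conservation of etendue: an ideal concentrator in air can multiply
# the incoming flux by at most 1/sin^2(theta).
max_concentration = 1.0 / math.sin(theta) ** 2
max_flux = SOLAR_CONSTANT * max_concentration
sun_surface_flux = SIGMA * T_SUN ** 4

print(f"{max_concentration:.0f}x -> {max_flux:.2e} W/m^2")
print(f"sun surface emits     {sun_surface_flux:.2e} W/m^2")  # nearly equal
```

The two fluxes agree to within a percent, which is the optical statement of "the focus can't get hotter than the source".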
Say a light source has a T temperature resulting in X photons emitted. I redirect all the photons to a single point. I see arguments mentioning that that single point cannot be hotter than the source because there's no more photons to make it hotter.
I now add a second light source of the same T temperature, that emits the same amount of photons and also focus all of them on that same point. I now have more photons, but temperature source of all photons is the same. How does adding more photons not make my point hotter?
For that to be possible with a blackbody light source, it has to itself be a single point. Which means a temperature of approximately infinity.
For a real light source, one that has an area, you can at best focus it down to the same area. To get maximum light to a target, you either have to make the target almost touch the source, or you have to have it so no matter what direction you go from the target, you hit a lens that's bringing light from the source. Once you set either one of those up, there's nowhere to fit a new light source.
I'm not an optics expert, but this can't be true. You can clearly focus light emitting from an area into a smaller area (although probably not to an infinitesimally small area).
You can't increase the density of the light. You can't focus all of it onto a smaller area.
Is that clearer?
If that is the case, then there is no extra room to fit the optics to focus another source onto the target.
If you shrink the optics enough to make room for another source, then you also aren’t delivering all of the photons from the original source (you can’t be, since you aren’t covering all incoming angles), and therefore it’s not the same temperature as the source.
What about two single photon sources, can’t they be pointed at exactly the same spot? Maybe the explanation here is that the target electron cannot interact with 2 photons at the same time, so you can’t ‘double heat’ a single particle. Or maybe that you can’t precisely target a single particle without decreasing the entropy of a closed system, which is impossible.
Here he says very clearly that if you "bundled" all the light from the sun and aim it at the earth it would heat the atmosphere to millions of degrees (the surface of the sun is much less than that). It's not at all clear to me what he means by "bundled" and why it's not contradictory to what he says in the article here. Presumably some kind of lens / mirror system could be used?
It seems to me that in this article he has in mind some highly abstract system that's fully reversible. Of course, in that case, once the target object gets hot enough it will start emitting light and result in equilibrium. But it's not clear to me that this describes what would actually happen with a real optical system! E.g. (a) much of the light the target receives is going to be absorbed and reemitted away from the lens, (b) what if you removed the lens targeting system at the precise moment the light impacted the target, so that the system couldn't be reversed, etc.
Edit: one more thing. The surface of the lit side of the moon can reach 260 degrees F, and dry wood can potentially catch fire as low as 300 degrees F. And the moon has some reflectivity as well. So even taking Randall's claims on their face, I'm skeptical that you could not start a fire (in some materials at least) using moonlight.
The reason I (and probably others) find Randall's explanation unhelpful is that obviously there's "enough" energy being reflected by the moon to start a fire (that's why people keep bringing up solar panels). The issue is that there's no way to optically redirect that energy into a small area without heating up your source to the same degree. Which is theoretically possible I suppose, but it's not the situation the What-If is talking about. Along with the issue that the light we see from the moon is mostly reflected rather than emitted (which changed the situation entirely), this makes the What-If explanation a little misleading.
Yes, yes I can. 1000 solar panels (200 watts each) will power an electric arc welder up to 20,000 C continuously for as long as the sun shines in the sky.
There is nothing here that collects energy across time, all energy is immediately imparted onto the electric arc.
The solar panel itself is already non-reversible, and so is the arc welder, right? That seems to be the important difference from the lens.
In that sense, we don't run either of them "for free" in the way that the xkcd piece describes.
Are you sure about that? First off, take an off-the-shelf solar panel, point it at the moon in the middle of the night. You get a grand total of nothing. Okay, it was a cheap panel, that might not generalize to anything.
But more importantly, by the argument laid out in the article, your solar panels cannot work in moon light. (Or maybe they work at horrible efficiency, because the moon is a bit warmer than the earth.) I'm not sure I buy that argument; maybe you should run the experiment.
Yes, this is what would happen.
Solar panels will work in moonlight if and only if you can make fire from moonlight with a magnifying glass.
The answer depends on how good a mirror the moon is. It calls for a real experiment, not a thought experiment. I don't really know which way it will go.
I put in place 1000 solar panels and aim them at the sun, thus procuring 200 kW of power. Next I use it to power an arc welder, producing 10,000 C of heat. Therefore I have used a 6,000 C sun to produce 10,000 C on earth.
Would a similar setup work with the moon? I don't know. That was not my point, my point was that it certainly is possible to produce higher temperature at the target than it was at source.
The area argument is more important here because the lens cannot increase the number of photons per second, but it can decrease the area. If F = N / (t * A) where N is the number of photons, t is time, and A is area, the lens can change the area, but not to 0. And if you need a sufficiently high F to get to the right temperature, the only way to get there with limited N is to bring A sufficiently close to 0.
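That F = N / (t * A) relation in a couple of lines (illustrative numbers; the point is that a lens changes only A, and diffraction plus the source's finite image size keep A from reaching zero):

```python
def flux(n_photons, seconds, area_m2):
    # F = N / (t * A): a lens cannot add photons (N is fixed by the
    # source), it can only shrink the area A they land on.
    return n_photons / (seconds * area_m2)

n = 1e18  # photons per second gathered by the lens (illustrative)
print(flux(n, 1.0, 1.0))   # spread over 1 m^2
print(flux(n, 1.0, 1e-4))  # focused onto 1 cm^2: 10,000x the flux
```

The etendue argument elsewhere in the thread is exactly the statement of how small A can get: no smaller than the geometric image of the source.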
If you have multiple magnifying glasses and mirrors I am fairly certain that you can. That is the equivalent of using a set of solar panels that power a laser. But that was not what was postulated in the original thought experiment, so it does not apply.
I am still fuzzy on the thermodynamic argument, but I was never good at intuiting thermodynamics. The argument presented is that if you have one body at 100 degrees C, and you put another body next to it, you cannot make the second body hotter than the first. That makes sense. But if the first body is constantly generating and transferring heat to the second with at most 100 degrees C temperature, and the second body has some way to store heat energy, then it is possible to heat a local area of the second body to higher than 100 degrees C. The storage of energy here is what I think counts.
Of course you could put a solar panel connected to a battery in moonlight for a few months and build up enough stored energy to power a laser for long enough to fry something. But that’s not “burning something with moonlight using a magnifying glass.”
Reading this is equivalent to reading a thread on a physics forum with people arguing about an article saying that O(n*lg n) is really the best possible runtime complexity for a comparison-based sorting algorithm, and trying to disprove it.
It's also worth pointing out that Randall Munroe is a physicist (to the extent an undergrad degree counts anyway).
I feel much more frustration when arguing about impractical engineering proposals (e.g. solar roadways, waterseer, hyperloop, or the ocean cleanup project), since people seem to have a much more biased drive to believe in their feasibility, and can't really be reasoned with.
It's interesting what bothers different people. While many of the statements in this thread are probably wrong, not many of them bother me. But I find the lack-of-self-doubt and appeal-to-authority in your message to be genuinely offensive. Where does your certainty come from?
With that out of the way, could you give some links to well known arguments that you refer to? Specifically, I feel certain that one can start a fire with sunlight reflected from a room temperature mirror, and don't understand the difference between a mirror and the moon within Munroe's argument.
His conclusion might be correct (in practice, you may not be able to concentrate moonlight enough to start a fire) but I don't think the details of his argument can be. I currently don't believe that the temperature of the reflecting surface can be the limiting factor, and I think this is central to his argument.
A mirror does specular reflection and thus conserves the etendue of the sunlight. You're concentrating the image of the sun in the mirror, not light from the mirror itself.
The moon in contrast is mostly a diffuse reflector - it scatters most of the light that falls on it (and absorbs and re-emits most of the rest), so it is effectively a new light source.
This is true for a black body, but is it always true for an object being illuminated by another? I don't know that we can consider a diffuse reflector with a surface temperature of 100C as being equivalent to a black body with a temperature of 100C. I think his conclusion is likely true (the moon is too dim to start a fire even with a really big magnifying glass) but I don't think he's right to point to the surface temperature of the moon as being the evidence of this conclusion.
Assume the sun was much brighter, so that ignition on earth is possible with a sufficiently large magnifier. Presumably if the moon was the same, this would mean the surface temperature of the moon was much higher. Now change the moon to be more heat conductive (causing the surface temperature to drop due to more heat loss on the dark side), and more reflective (causing the surface temperature to drop further due to less absorption). I'd guess that if you tweak the parameters sufficiently, you could end up with a surface temperature low enough that Munroe's argument would say that ignition is impossible, even though we've increased the intensity of the moonlight over our baseline.
How does Munroe know that we aren't in that second regime? I don't think there's enough information in his argument to distinguish. Alternatively stated, we know that there is some current temperature to which we can heat an object using concentrated moonlight. We also know that if we can change the shape and composition of the moon, we can reduce the surface temperature without reducing the intensity of moonlight. Unless there is some limit to the effectiveness of the heatsink that we can put on the moon, I think this means there is some possible arrangement that violates the assumption that the surface temperature must always exceed the temperature achievable with a magnifying glass.
(Thanks for helping me to puzzle this out)
>Now change the moon to be more heat conductive (causing the surface temperature to drop due to more heat loss on the dark side),
Yeah, although this isn't too big in the case of the moon (unlike the Earth, it doesn't have an atmosphere and doesn't rotate rapidly, so there isn't too much redistribution of heat across its surface), it is definitely something that would confound the calculations. We'd still be able to just look at the effective temperature of light falling onto the moon and that would limit the temperature that we could light the object up to. But we wouldn't be able to use a direct measurement of the temperature of the surface of the moon.
> and more reflective (causing the surface temperature to drop further due to less absorption).
To the extent that it's a gray body (and most objects are approximately gray bodies), this wouldn't actually lower the temperature. Absorptivity < 1 causes it to absorb less energy from the light, but for a gray body emissivity equals absorptivity, so it also radiates out less light, and you actually end up reaching the same equilibrium temperature as a fully absorptive black body.
I respond late to offer a link (that's hidden on the second page of this thread) that I think presents that argument I was trying to make better than I managed to: https://physics.stackexchange.com/questions/370446/is-randal.... I thought the comments on the answer were a helpful reframing of the problem.
The nice part about physics though (like Munroe, I was also an undergraduate physics major) is that in simple cases like this, a suitable expert can usually defend their position with an argument comprehensible to a nonspecialist outsider. My doubt in this case is not that the experts are wrong, but that the experts haven't actually looked at the details of Munroe's argument and stamped it as "approved".
I think the part I find "offensive" is that CydeWeys is not claiming to be an expert himself, but is claiming to have certainty in what the experts believe. I don't know exactly why I find this offensive, but I do. And yes, this may be a problem with me, and not with CydeWeys' argument. I would not be offended in the same way by someone claiming "I am an expert and I approve this argument". Still, my question to him is genuine: what gives him this certainty?
My version of the argument involves turning it into a perpetual motion machine.
On the other hand, this rule is incredibly useful in modeling energy transfer through optical systems, seeing what things are possible and what things aren't without having to make detailed calculations. It's a real world rule with useful applications.
Instead, I'll just spout nonsense about programming. ;-)
So you should really be asking for a communications expert, not a physicist.
He's trying to explain everything in terms of a simple blackbody in thermal equilibrium, peacefully radiating its energy away only via thermal photons. That's not the reality of the radiation from the sun or moon. Solar physics is an entire branch of physics, and such simple toy models are not even wrong.
The sun doesn't just radiate away its existing energy via thermal photons.
First, it keeps burning its fuel via a series of nuclear reactions, which by the way keeps pumping energy into the system, essentially acting like a battery (so there's no perpetual motion here).
Second, the sun emits photons that are much more energetic than the thermal photons from the surface. Some of the radiation is not thermal, and comes directly from different types of nuclear reactions (which provides signatures regarding the kind of reactions happening in the sun) and various other processes.
It was just an option on my MSc in LASERs but I thought it was cool (with the potential to be very warm).
Although my frequency has been halved and I've been working in software for decades
But the moon is not a blackbody, and I think the whole argument falls apart. Here’s a thought experiment: go stand on the moon, and assume the moon is made of rock that diffusely reflects, say, half of the incident 500nm light. Stand somewhere that’s in shadow, so you can’t see the sun. Wrap a piece of paper and some air in perfectly insulating, perfectly reflecting material, except that the material lets 100% of 499-501nm light through, but only on the moon side. The target will be in a bath of 499-501nm light at 1/2 the intensity (energy density per unit volume) of the sun, which corresponds to far more than half the temperature of the sun. It’ll catch fire after a while.
Now do the same experiment on the Earth, at night, with lenses to bathe it in moonlight from all sides. Fire! So I claim that lenses+mirrors+filters can start a fire with moonlight.
Another interesting question: can you use a luminescent solar concentrator or other fluorescent material to pull this off without taking such egregious advantage of the spectrum of moonlight? These types of materials can violate conservation of étendue.
Unconcentrated sunlight is slightly under 1400 watts per square meter. It's equivalent to a temperature of 122C.
You can concentrate sunlight coming from the sun, because the sun only fills five millionths of the sky. With a simple lens you can focus hundreds of megawatts per square meter onto a surface.
But once you bounce that light off a diffuse surface, whatever concentration you had becomes the new maximum.
In your experiment, bathing something in moonlight would max out at 700 watts per square meter.
700 watts per square meter doesn't set things on fire. It can only heat a blackbody to 60 degrees C.
Even the full brunt of unaltered sunlight can only bring a blackbody up to 122C.
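A quick back-of-the-envelope check of those figures, using the Stefan-Boltzmann law (a sketch; the constant and the flux values are rounded):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_celsius(flux_w_per_m2):
    """Temperature at which a blackbody re-radiates the given incident flux."""
    return (flux_w_per_m2 / SIGMA) ** 0.25 - 273.15

print(round(equilibrium_celsius(1400)))  # full sunlight: ~123, close to the 122C above
print(round(equilibrium_celsius(700)))   # the moonlight-bath ceiling: ~60
```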
Treating the moon as a blackbody or not doesn't actually change the equations. The important property is that it diffuses light. It resets your maximum concentration of light, because light that comes evenly from every direction can't be concentrated.
(I'm ignoring the part about wavelength filtering because it's confusing and would only make your piece of paper heat up less.)
> (I'm ignoring the part about wavelength filtering because it's confusing and would only make your piece of paper heat up less.)
I’m afraid you’re ignoring the interesting bit. You say that 700 W/m^2 doesn’t set things on fire. This is not true. Sure, 700 W/m^2 applied to some target that is allowed to radiate its own blackbody light out to the sky won’t get it very hot, but that’s not what I’m suggesting. I’m suggesting that you insulate the target very well so that its blackbody emissions don’t escape, but you let in the short-wavelength moonlight. Thermodynamics requires that you also let out the short wavelength blackbody emissions, but those are negligible until the target gets very, very hot.
This effect isn’t science fiction — it’s just the greenhouse effect, amplified. Greenhouses (the glass ones and the atmospheric ones) exploit the fact that sunlight doesn’t match the Earth’s blackbody spectrum, so a filter (glass or gaseous) can allow incoming radiation in but trap most outgoing radiation.
In effect, I’m suggesting that a very good greenhouse plus some lenses could get hot enough to start a fire.
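The "negligible until the target gets very, very hot" claim can be sanity-checked with Planck's law. A sketch comparing spectral radiance in the 500nm window for a ~4000K source (roughly moonlight's color temperature) against a target already at paper-burning temperatures; the two temperatures are rough assumptions:

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) from Planck's law, W sr^-1 m^-3."""
    x = H * C / (wavelength_m * K * temp_k)
    return 2 * H * C**2 / wavelength_m**5 / math.expm1(x)

wl = 500e-9  # the 499-501nm window from the thought experiment
incoming = planck_radiance(wl, 4000)  # roughly moonlight in that band
outgoing = planck_radiance(wl, 1000)  # a target already hot enough to char paper
print(f"emission ratio at 500 nm: {incoming / outgoing:.1e}")  # around a billion to one
```

So until the target approaches the source's own temperature, what leaks back out through the 500nm window really is negligible compared to what comes in.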
But the conclusion of the article is wrong. The surface temperature of the Moon has very little to do with the wavelength of the reflected photons.
Consider a surface covered with ideal tiny mirrors, each pointing in a random direction. Only a tiny proportion of these mirrors will reflect light from the Sun towards the observer. Now consider that 90% of those mirrors are painted black, reducing the reflected energy flux by a further factor of ten. The Moon is like that.
Still the reflected light has original wavelength of the light of the Sun. Collect enough of it and that triggers fire.
> Still the reflected light has original wavelength of the light of the Sun. Collect enough of it and that triggers fire.
You can't collect enough of it into one place with a passive optical system, because it's been irreversibly scattered by the moon's surface (Read: irreversible increase of https://en.wikipedia.org/wiki/Etendue).
Essentially, an optical system will bring the Moon's surface closer, but even if it brings the surface within 1 cm of the wood, the diffuse Sun light scattered from that surface is not enough to ignite the fire.
Sun's temperature is 6000 K. Moon's surface is pretty black: it reflects only 12% of the light.
So, the effective temperature of the Sun reflected by the Moon, considering that thermal radiation is proportional to T^4, is 6000 * .12 ^ (1/4) ~= 3500 K. That's quite enough to light up some fire! Of course, the spectral composition of the light will not be thermal, etc., but the estimate should be close enough.
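That estimate in code (same rough assumptions: 12% albedo, 6000 K photosphere):

```python
ALBEDO = 0.12   # fraction of sunlight the Moon reflects
T_SUN = 6000.0  # K, rough photosphere temperature

# Thermal flux scales as T^4, so dimming by the albedo is equivalent,
# for this rough estimate, to lowering the temperature by albedo^(1/4).
t_effective = T_SUN * ALBEDO ** 0.25
print(round(t_effective))  # -> 3531, i.e. the ~3500 K above
```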
Why doesn't the Moon itself heat up like that? Well, the rocks on Earth don't heat up to 6000 K either... I think it's partly that they are "not surrounded by sun", partly that the Moon is a giant cold heatsink.
For example, the article states "In other words, all a lens system can do is make every line of sight end on the surface of a light source, which is equivalent to making the light source surround the target."
If you forget about optics for a second, imagine that the outer surface of the sun were wrapped around a point (think of the image shown in the article). If you consider conservation of energy for the energy flux from the surface of the sun being entirely directed to a single body of matter that absorbs this heat (assume it's a penny), the steady-state blackbody emission of the penny would have to equal the energy flux from the entire surface of the sun. I think this situation would end up making the 'temperature' of the penny much higher than the surface of the sun for the same reason that the center of the sun is hotter than the surface: There is energy expended, and it comes from the fusion of light elements inside the sun, so there is no violation of entropy as stated in the article: "you'd be making heat flow from a colder place to a hotter place without expending energy."
The sun is a blackbody, it radiates light because it's hot. Once the penny is the same temperature it will radiate back at the sun until the two are in equilibrium.
If you add more sun, you add gaps, and the amount of light hitting the penny stays constant.
It never has more than one penny-surface-area of light hitting it, so it never has to get hotter than the sun.
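The "gaps" point can be made quantitative with a radiative-balance sketch (assuming an ideal blackbody penny; T_SUN here is a rough solar surface value):

```python
T_SUN = 5778.0  # K, approximate solar surface temperature

def penny_equilibrium(sky_fraction):
    """Equilibrium temperature when a fraction f of the penny's sky is filled
    with sun-surface: f * sigma * T_sun^4 absorbed = sigma * T_eq^4 emitted."""
    return sky_fraction ** 0.25 * T_SUN

print(round(penny_equilibrium(1.0)))  # fully surrounded: caps at the sun's own 5778 K
print(round(penny_equilibrium(0.5)))  # half the sky is gaps: ~4859 K
```

Adding "more sun" beyond full coverage can only change the gaps, not push the equilibrium past T_SUN.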
This is a tricky thermodynamics question, because it really seems like you should be able to use a really big magnifying glass to make an object hotter than the sun. I remember working through it in a physics class.
Ultimately the outside of this inverted sun system needs to vent the whole of the energy, the energy can't just fall inwards.
What you're describing is a reversal though; it requires the penny to be much hotter than the sun, only as a result of the sun's energy flowing into it.
How are you violating conservation of energy if you're taking all of the light that would hit a square mile of the earth and concentrating it down to the size of a penny?
If you can't concentrate light that way then how do focusing lenses on cutting lasers function? Makes no sense.
When you focus a cutting laser, you are imaging the shape of the laser cavity. The emission is coming from a very narrow spatial region, so you can focus it back down to a small spot. However, a laser that is out of alignment will often not be able to be focused to a small spot.
I don't think this changes the ultimate answer to the question.
So yes, you can have a bunch of lenses each focusing from a different direction.
And if you do a good job of aligning them, each lens will look as bright as the sun/moon from the target.
But that's your limit.
Imagine you had a large cloud of planar mirrors, each of which can be aimed at any given point -- even points that overlap.
While I agree that you cannot focus a whole image to a smaller area than the diffraction limit allows for a continuous lens surface, if you omit diffraction, mirrors could certainly do it.
Wait a minute. Does that mean that I could get a tiny solar panel and light it up with a lens, instead of getting big panels? Energy output of a panel is proportional to the amount of light that hits it, right?
I guess that lenses are ‘free’ only if they have no impurities, but even then, assuming solar panels are costlier than plastic lenses, I could save money.
Come to think of it, how is it that I haven't seen or heard about parabolic reflectors with solar panels in the focal point? Right now I've found an article about parabolic troughs that are apparently used to heat old-school fluids instead: https://en.wikipedia.org/wiki/Parabolic_trough
So you're saying I should cool them with water and use the steam to move turbines, then I'm golden!
\( ∇ )/
It doesn't appear that many substances have autoignition points as low as 100C.
If you're hunting for easy to ignite stuff, it might be better to go for low flash point stuff, and strike a spark?
E.g. gasoline will work in pretty cold environments, with a flash point of -43C.
It struck me as very similar to the setup in this video.
At about 4 minutes in, a nearly identical setup is shown with a beam of light emerging from a block of opaque material using holographic techniques.
It seems plausible to me that a specially designed anti-moon hologram could allow reconstruction of the incident light from the sun, thus allowing a fire to be started without violating any law of thermodynamics.
They take the situation to its absurd conclusion, then quit with a full finality that people take as truth. Further absurd conclusions are ignored, and their word is law.
Some more absurd arguments for why you will be able to light a fire with a magnifier and moonlight are as follows:
1: use a pre-magnifier to create a spot on the moon with the same temperature as the sun, then magnify the light from that spot to start your fire.
2: wait a long time... eventually a meteor will hit the moon creating a spot bright enough to focus.
3: wait even longer...
Eventually the random molecular collisions between the wood and the (presumably oxygen-rich) air around it will convert it into carbon dioxide and water, while serendipitous individual high-energy blackbody photons help break it down.
4: insert an absurd^4 answer that brings in tunneling effects, or moving mirrors, or some other "impossible" reason that is only impossible because they didn't think of it for you.
Suppose we build a 100 foot mirror which reflects 10% of sunlight, such that the spectrum remains the same. We could still make a fire with the reflected light, if we just gather 10 times more of it with a larger lens. We could do this in the Arctic, with the mirror's temperature at below zero; the mirror's temperature is irrelevant.
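The lens-area arithmetic behind "gather 10 times more of it", as a sketch (the 10% reflectivity is the hypothetical figure above):

```python
import math

REFLECTIVITY = 0.10  # the hypothetical 10%-reflective mirror

# To collect the same total power through the dimmed channel, the lens
# area must grow by 1/reflectivity, so the diameter grows by its square root.
area_factor = 1 / REFLECTIVITY
diameter_factor = math.sqrt(area_factor)
print(f"area x{area_factor:.0f}, diameter x{diameter_factor:.2f}")  # x10, x3.16
```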
Note that the article claims that no matter how much moonlight we are able to gather (i.e. we are allowed to overcome the inverse square law however much we want) we cannot create a temperature that will ignite paper.
"The scientific principles of the convergence and refraction of light are very confusing, and quite frankly I can't make head or tail of them, even when my friend Dr. Lorenz explains them to me. But they made perfect sense to Violet."
Violet Baudelaire goes on to use the scientific principles of the convergence and refraction of light to set fire to a piece of sail cloth using only moonlight and the lens from a spying glass in "The Wide Window" by Lemony Snicket.
It's possible that this book is a work of fiction.
Now focus the light back onto just a small region of the brick pile, heating up that region, which in turn heats the rest of the bricks by conduction.
This in turn increases the amount of heat collected from the pile, ad infinitum or until the brick pile melts.
Short of letting the brick pile melt itself, imagine tapping into the excess heat and using it to power an electric motor for a useful purpose.
The use case would be materials that don't burn at the Earth's surface temperature, but do burn at the Moon's peak surface temperature. But you could probably get those hot enough just by rubbing them or something.
Combined with the lens, this might work.
I think one thing which would help me develop an intuition for it would be to see the calculation of the lens size for heating a one square-centimeter area on the earth to as high a temperature as possible by the light of the moon, and what that optimal temperature is.
Anyone reading for whom this is straightforward? Even a description of how to do the calculation would go a long way.
For a single lens you just want something really big. Make it take up 90+% of the sky from your target.
For temperature, the no-calculation way is to measure a rock on the moon (article says 100C) and use that number for how hot you should be able to get.
The calculation way goes as follows: Near earth you get 1400 watts per square meter of sunlight, so if that bounces perfectly off the center of a full moon and gets through the atmosphere with no losses, your target will get 1400 watts per square meter. That's equal to a black body at 122C. After taking into account the spherical shape and atmospheric losses you might get less than half of that, so ambient heat might drown out your results.
You can make an enormous lens where all the energy coming off a particular acre of moon goes through it. But you can't focus all of it onto a spot smaller than an acre, no matter what you do.
The current top comment here seems to agree with my original intuition, though.
This is the optical argument for maximum concentration given conservation of étendue (the same page has an optical argument for the conservation):
The angle subtended by the moon is approximately 0.54 degrees, or about 0.01 radians, so the maximum concentration factor is about 10,000. The moon provides about 0.1 lux of illumination, so the maximum illumination you can achieve by optics is about 1000 lux. The sun subtends about the same angle, and we receive 30,000-100,000 lux from it, and the maximum illumination you can achieve from concentrating it optically is about 1B lux.
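Reproducing that estimate (a sketch using the commenter's small-angle bound; a more careful bound uses the sine of the half-angle and comes out a few times larger, without changing the conclusion):

```python
import math

moon_angle_rad = math.radians(0.54)        # apparent diameter of the Moon
concentration = (1 / moon_angle_rad) ** 2  # rough bound, ~10^4

moonlight_lux = 0.1                        # full-moon illuminance at the ground
max_lux = moonlight_lux * concentration
print(f"concentration ~{concentration:.0f}, max illumination ~{max_lux:.0f} lux")
```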
I'm willing to believe that you can't light a fire from the moon, if the intensity's a million times lower.
You could reproduce any fixed black-body spectrum (to arbitrary accuracy) from a set of thermal sources and filters (or a set of lasers, LEDs, etc. with random phases) at arbitrarily high flux, just as a laser can provide, and use this light to heat objects to arbitrary temperatures. But if the original emission is of black-body type, you cannot -- the flux is fixed by the quantum mechanical emission process and is a function of local temperature only. From there it follows from étendue conservation that you cannot achieve higher temperatures.
it's like there's a brain bug where engineers think they can outsmart 200+ years of scientific progress with a clever arrangement of mirrors.
What exactly is the "this" that you refer to? I don't think the issue is that the "engineers" disagree with the physical principles; rather, they tend to disagree that the physical principles apply in quite the way that the author claims. Many of the engineers probably believe that Munroe is correct in claiming that one cannot start a fire with a low-temperature blackbody radiation source regardless of the size of one's magnifying glass, but disagree that it is reasonable to consider sunlight reflected by the moon as fitting this model.
Presumably, you agree that it's possible to start a fire using a magnifying glass and sunlight on earth. I'd guess you also believe that it's possible to reflect the light from a small handheld mirror into the magnifying glass and still light a match, even though the mirror's temperature is much lower than the sun's? While the specular reflection from the mirror is different than the diffuse reflection from the moon, one might note that the words "specular" and "diffuse" don't appear in Munroe's exposition. Would you agree that Munroe's argument would appear to prohibit this behavior?
Now assume that the moon was replaced by an equally sized parabolic mirror aimed to be focused on the earth. Would it be possible to light a match using this light if one's magnifying glass was large enough? Which thermodynamical principle am I violating in thinking that it might be possible? And which part of Munroe's argument do I invalidate by making these modifications? Again, my point isn't that Munroe's conclusion is wrong, just that there might be something flawed about the argument he uses to reach that conclusion. This might be a "brain bug", but from the inside it just feels like an attempt to understand truth.
and I agree it's counterintuitive and it's OK for people to come up with ideas, but when you hit the point of "hey, the thermo people have a nice collection of proofs demonstrating this, and it fits very well with the underlying theory, oh, and if you do manage to violate etendue, you could probably build a perpetual motion machine", if you willingly continue to argue and get shot down, it's time to go back and re-read the books.
BTW, what's your obsession with Munroe? What he's referring to is a scientific phenomenon; Munroe is just a science popularizer, and if he got the details wrong -- well, the point of xkcds like that is more to inspire people with ideas than to get the exact details right.
Great principle. I agree with it, and think that most of the people offering objections here do as well. Our question is whether it's being applied correctly in this case. For me, the sticking point is whether surface temperature can be used as a proxy for the brightness we care about.
BTW, what's your obsession with Munroe?
I have none. He's a great explainer, probably occasionally gets details wrong, and (so far as I can tell) is a positive force for spreading scientific understanding. My "obsession" seems to be that I have poor tolerance for overly broad smackdowns of critics. If you are going to tell someone else that they are wrong (as opposed to Munroe who is trying to explain what he believes is true) I think you have a higher obligation to get all the details right. I presume I'm sensitive to it because I'm often on the receiving end.
How about you? Why does it bother you so much that some people say that Munroe's argument is logically flawed?
If not, at what percent totality does it become impossible?
Am I to accept that the additional magnifying glasses would cease increasing the temperature once the temperature matched that of the moon's surface?
For example, 1/2, 3/4, 7/8, 15/16, ...
With the addition of some well placed mirrors though, that's possible.