It's the radiance (brightness) of the moon (as perceived at the moon) that sets an absolute limit on the incoming irradiance you can create, because of conservation of etendue.
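To make that concrete, here's a minimal sketch of the etendue bound (my own illustration, not from Munroe): a blackbody source at temperature T_src has radiance L = σT⁴/π, and lossless optics can at best fill the target's entire hemisphere with that same radiance, so the concentrated irradiance is capped at πL = σT⁴.

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def max_target_temp(T_src):
    """Upper bound from conservation of etendue: ideal optics can at
    best fill the target's full hemisphere with the source's own
    radiance L = sigma*T^4/pi, giving irradiance E = pi*L = sigma*T^4."""
    L = SIGMA * T_src**4 / math.pi   # radiance of a blackbody source
    E_max = math.pi * L              # best attainable irradiance
    return (E_max / SIGMA) ** 0.25   # radiative equilibrium of target

# The bound collapses to T_src itself: optics can't beat the source.
print(max_target_temp(394.0))
```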
Separately, the fact that moon rocks (which experience that exact amount of irradiance) reach some given temperature X°C while losing very little heat to conduction (certainly less than an object on earth would), shows that this level of irradiance is insufficient to heat up your kindling above X°C.
This argument has nothing to do with the moon being a black body, or an approximate black body, or any such thing. It rests just on the fact that moon rocks, which are exposed to this light (with very little conductive heat loss), function as a kind of thermometer that tells you how hot that light can make something; and the answer is "a bit over 100°C".
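As a sanity check on that figure (a sketch under textbook assumptions: solar constant of roughly 1361 W/m², sun at zenith, no conduction):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # approximate solar constant at 1 AU, W/m^2

# Subsolar point of a slowly rotating blackbody: absorbed power per
# unit area equals emitted power, S = SIGMA * T**4, no conduction.
T = (S / SIGMA) ** 0.25
print(T - 273.15)  # roughly 120 degrees C -- "a bit over 100°C"
```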
> Since the moon absorbs about 90% of the light incident on it, we can assume that the surface temperature is lower than it would be if it was a perfect black body, presumably reaching a temperature corresponding to a sun that was about 10% less strong.
No... An object with 90% absorbance absorbs 10% less light energy, yes. But it also emits 10% less light energy, so the two effects cancel out and it reaches the same equilibrium temperature as a black body.
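A quick sketch of that cancellation, under the gray-body assumption (absorptivity equals emissivity by Kirchhoff's law); the solar-constant value is illustrative:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # approximate solar constant, W/m^2

def equilibrium_temp(absorptivity):
    """Kirchhoff's law: emissivity == absorptivity (gray body).
    Balance a*S = a*SIGMA*T**4, so 'a' cancels out of the result."""
    absorbed = absorptivity * S
    emissivity = absorptivity
    return (absorbed / (emissivity * SIGMA)) ** 0.25

# A perfect black body and a 90%-absorbing gray body come out the same.
print(equilibrium_temp(1.0), equilibrium_temp(0.9))
```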
Well yes, this is what it should say! Is this our disagreement? Because for me (and I think for most other dissenters in this thread) the whole problem we have with Munroe's argument is that he keeps coming back to the surface temperature of the moon as the limiting factor. If he were simply to say that the moon is not bright enough, then we'd probably all agree.
the fact that moon rocks ... reach some given temperature X°C ... shows that this level of irradiance is insufficient to heat up your kindling above X°C.
I think this is the real point of dispute. We agree that this is true if we are only considering pure blackbody radiation. What's not clear (at least to me) is that this equivalence is still true when you include the directly reflected light. That is, no one thinks that you can start a fire using only the thermal infrared light from a dark moon. The question is whether it's hypothetically possible with a sufficiently bright sun and sufficiently reflective moon, without raising the surface temperature. Can you point to something that makes this argument more directly?
the two effects cancel out and it reaches the same equilibrium temperature as a black body
I need to learn more about this. I have trouble thinking it applies correctly here, because it's assuming the moon is a perfect gray body. I think this assumption falls apart if it's actually reflecting light, which in fact we know it is. Or am I wrong? Does a silver mirror in space actually end up at the same equilibrium temperature as a lump of coal? I guess it could. This wouldn't harm the argument much (the argument just requires that the temperature not increase), but would indicate that I'm not viewing things correctly.
Summarizing, I think the point of dispute is whether the surface temperature of an object in space can always be reasonably estimated from its brightness (and vice versa). We agree that it can be if it's a perfect black body. We agree that it's mathematically true if it's a "gray body". We disagree (I think) as to whether it's appropriate to make the simplification of assuming that all stellar objects are sufficiently close to "gray bodies" for the math to hold.
Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body moon, noting that the surface temperature does not increase. I'd argue that when you increase the reflectivity, the moon gets brighter, and thus you can heat your object to a higher temperature. You seem to be arguing that because the surface temperature remains the same, the attainable temperature stays the same, even though you can collect more reflected energy.
Yes. It takes longer to reach the equilibrium, but it does reach the same equilibrium temperature.
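Here's a toy Euler simulation of that claim (my own sketch; the areal heat capacity C, starting temperature, and time step are made-up values, and absorptivity scales both absorption and emission per Kirchhoff's law):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # approximate solar constant, W/m^2

def simulate(a, hours, C=1.0e5, T0=100.0, dt=60.0):
    """Euler-step a gray slab with assumed areal heat capacity C
    (J m^-2 K^-1): it absorbs a*S and emits a*SIGMA*T**4. Because
    'a' scales both terms, it sets the pace but not the destination."""
    T = T0
    for _ in range(int(hours * 3600 / dt)):
        T += a * (S - SIGMA * T**4) / C * dt
    return T

# En route the a=0.9 slab lags the a=1.0 one, but given enough time
# both settle at the same equilibrium (about 394 K for these numbers).
print(simulate(1.0, 5), simulate(0.9, 5))    # mid-flight: different
print(simulate(1.0, 80), simulate(0.9, 80))  # equilibrium: the same
```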
>  Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body, noting that the surface temperature does not increase.
Right, so, let's say the total solar power received by the blackbody moon is Ps. The moon increases in temperature until it reaches temperature Tb, at which point the power emitted equals the power absorbed: Pb = Ps. That is the equilibrium.
Now let's let the moon reflect (scatter) some light, by giving it a realistic albedo of 10%. Then of the solar power received at the moon, 0.9Ps is absorbed and 0.1Ps is scattered. But this albedo also causes the moon to emit 10% less light (Kirchhoff's law again), so it now emits 0.9Pb at the same temperature Tb, hence it reaches equilibrium 0.9Ps = 0.9Pb at the same temperature.
Note that the total light energy leaving the moon is now: 0.9Pb (emitted thermal) + 0.1Ps (scattered) = 0.9Ps + 0.1Ps = Ps. Just the same as the amount that was emitted as a blackbody. So in fact it's not any brighter (in terms of power), just has a different spectrum.
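The bookkeeping above, spelled out with Ps normalized to 1:

```python
# Power balance for the albedo-0.1 moon described above (Ps = 1).
Ps = 1.0
albedo = 0.1
absorbed = (1 - albedo) * Ps   # 0.9 Ps heats the surface
scattered = albedo * Ps        # 0.1 Ps leaves as reflected sunlight
emitted = absorbed             # at equilibrium, thermal out = in
total_out = emitted + scattered
print(total_out)  # equals Ps: same total power, different spectrum
```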
The first is that the applicable "mirror" question here is whether the surface temperature of the sunlit side of a slowly rotating astronomical body is independent of reflectivity, not whether the eventual core temperature of a uniformly lit body is. Are you confident that the same reasoning applies? I'm not yet.
Next, if we are phrasing our question as whether one can start a fire with a magnifying glass, we clearly do care about the difference between visible sunlight and low-temperature infrared. For Munroe's argument to really work, the surface temperature needs to be a proxy for the collectible visible light that would be used by a magnifying glass.
There's also the directionality: none of the visible light is going to be reflected toward the "dark" side. The thermal radiation is also directional, but not to the same extent. As we move toward greater reflectivity, presuming a bright full moon, we do get more of the total energy available on earth. How much more? I don't know.
Lastly, which we haven't discussed, there are options for insulating the heated object on earth that are optically transparent (to allow the concentrated light in) but infrared reflective (to prevent thermal radiation from escaping). I think this "privileges" the reflected sunlight so that we might indeed be able to achieve a higher temperature than the reflective moon surface in vacuum.
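A toy version of that selective-cover idea (my own sketch; the single-parameter cover model and the numbers are illustrative, not a real material):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def covered_equilibrium(E_in, ir_transmittance):
    """Toy model: the object absorbs concentrated visible flux E_in
    (W/m^2); a cover passes visible light but lets only a fraction
    'ir_transmittance' of the thermal IR escape.
    Balance: E_in = ir_transmittance * SIGMA * T**4."""
    return (E_in / (ir_transmittance * SIGMA)) ** 0.25

# Same input flux; if only half the IR escapes, the object runs about
# 19% hotter (a factor of 2**0.25) than with no cover at all.
print(covered_equilibrium(1361.0, 1.0), covered_equilibrium(1361.0, 0.5))
```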
I feel like you recognize these factors also, by your caveats that you might be able to get "a little bit" higher than the surface temperature. Without committing to a number or methodology, your implication is that this "little" must be small relative to the surface temperature of the moon, rather than small relative to the optical temperature of the sun. While I agree that the surface temperature is related to the achievable temperature for collected reflected light, I still don't think that the exact temperature is a hard limit.
I'll try to bow out here and not take up more of your time. You've definitely helped me to think through the issues here. If you happen to be interested in a somewhat parallel situation, you might enjoy this article that describes a cyclical water collection system based on collecting solar energy through an aerogel during the day, then using a condenser optically coupled to the dark sky at night: https://www.nature.com/articles/s41467-018-03162-7. Only loosely related to Munroe's hypothetical, but shows a real application of some of the same concepts.
Edit: I just noticed a link on the second page of this thread that had some useful discussion that might interest you: https://physics.stackexchange.com/questions/370446/is-randal.... If you expand out all the comments on the answer, Shor seems to be making the same argument you are, and Lalinský is making a better version of mine.