Can you use a magnifying glass and moonlight to light a fire? (2016) (xkcd.com)
340 points by btrask on Dec 21, 2018 | 269 comments



Interesting article. Right in many places. Wrong (possibly) in main conclusion.

Entropy argument - correct in the sense that using radiation from a black body we cannot use lenses to heat another body to a temperature higher than the original's. Easy to understand why - the first body has a temperature, the radiation has the same temperature, and if we apply the radiation to another object it will not heat up beyond the radiation's temperature.

Also the argument about the impossibility of concentrating light into a dot is correct (although even if it were possible we still would not be able to get a higher temperature - the light would not be energetic enough for that). The important part is - we could concentrate light into a dot only if it consisted of parallel rays - i.e. only for an object that is infinitely far away.

The Moon surface temperature argument is incorrect. A body at 100 degrees Celsius does not radiate in the visible spectrum, so the light we see is not produced by the Moon's temperature. It is reflected sunlight. So the Moon's temperature doesn't matter. The Moon's surface does absorb some light, changing the spectral composition from about 5.7 kK (the Sun's surface temperature) to about 4 kK. So we should consider the Moon to be part of the optics, not an emitter.

Hence the question is now - can we concentrate moonlight enough so that the intensity at the concentration point is higher than the thermal loss into the environment (only then will we be able to raise the temperature in the concentration area enough for combustion - remember that the light is "hot" enough for this)? I don't have an answer for that - need to do calculations. What can be a deal breaker? Remember that the Moon is much closer than the Sun, so its rays come to us even less parallel, so the area into which we can concentrate light reflected from the Moon is even larger than the Sun's; together with the lower intensity of light from the Moon, we might have trouble achieving the necessary intensity for combustion. However, a big enough lens probably will work.
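Here's a rough back-of-the-envelope in Python (the full-moon irradiance and ignition temperature are assumed round numbers, not figures from the article):

```python
import math

# Assumed round numbers (not from the article):
E_moon = 3e-3                        # W/m^2, rough full-moon irradiance at Earth's surface
half_angle = math.radians(0.52 / 2)  # Moon's angular radius as seen from Earth

# Etendue limit: a passive optic cannot concentrate by more than 1/sin^2(half_angle)
C_max = 1.0 / math.sin(half_angle) ** 2
flux_max = E_moon * C_max            # best-case flux at the focus, W/m^2

# Radiative loss alone at a rough wood-ignition temperature (~300 C = 573 K)
sigma = 5.67e-8                      # Stefan-Boltzmann constant
loss = sigma * 573 ** 4

print(f"max concentration ~ {C_max:,.0f}x")
print(f"best-case focused flux ~ {flux_max:,.0f} W/m^2")
print(f"radiative loss at 573 K ~ {loss:,.0f} W/m^2")
```

Under these assumptions the best-case focused flux (~150 W/m²) is far below the radiative loss at ignition temperature (~6,000 W/m²), so a bigger lens alone cannot close the gap past the etendue limit.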

And yes - I'm a physicist by training.


The conclusion is correct. It doesn't rely on the moon being a black body radiator, but on etendue, which shows that the most you can do with a system of lenses and mirrors is to create an environment (on earth) as bright as the environment on the moon. The fact that rocks on the moon's surface only reach 100°C shows that an environment like that is not bright enough to light a fire.


Trying to follow your argument. Suppose we put a perfectly reflective mirror in orbit. Since it's perfectly reflective, its temperature will be 0 Kelvin (in theory; very low in practice - there is a reason sun-bound spacecraft are covered with reflective surfaces). So we would not be able to use the light it reflects to heat anything?


A specular reflector preserves the etendue of the light rays that fall on it. A diffuse reflector (such as the Moon) does not; it becomes a new light source instead, resulting in higher etendue.

In short - perfect reflector preserves etendue, but imperfect does not.


But the light coming from the moon is a combination of diffuse and specular reflection... and again (just like for the black body argument) the fraction of light which is specular is most important. The fact that the moon subtends 0.5 deg out of 180 while reflecting more than 10% of light (12% Bond albedo) suggests to me that what we see is not only Lambertian (or cos²) diffuse reflection, but it depends on absorption, so I don't know enough to calculate.


Whoa. I don't think most of the rest is specular reflection, since we never see any specular highlights on the moon, but if it is non-Lambertian, it might be possible to heat things up better than if the moon were actually a black body or perfect Lambertian scatterer.

Thanks for pointing this out; I seem to have learnt of a new phenomenon I wasn't previously aware of: https://en.wikipedia.org/wiki/Opposition_surge


Actually, I was reading a very old article reporting that some astronomers (lunonomers?) had seen intermittent specular reflections off small patches on the moon, with some calculations estimating their size. https://doi.org/10.1016/0019-1035(68)90077-8

At this point I almost think an experimental measurement of temperature equilibrium would be publishable from one of the old re/frac/flectors you can put your head into.


Thanks, this is what I was missing and what I think is most obscure in the article!


The question is to what extent the moon's surface is an imperfect reflector. For example, if a mirror has a hole, it will be an imperfect reflector, but you can compensate for it using a bigger lens.


For "imperfect" it's sufficient to be diffuse. You can verify the Moon is diffuse to a good approximation from the approximately uniform appearance of the full Moon.

The idea is that you can "organize" or "revert" any ray bundle coming out of a system of non-absorbing lenses and specular reflectors, but if your reflector has billions of tiny irregularities it's not viable to build such a system (it's equivalent to an ideal diffuser, in which light is isotropically reflected). The ideal diffusion process is clearly not reversible by itself: if you shine a beam onto a diffuser it spreads the light; if you expose it to the same light with reversed directions, it again diffuses it instead of reverting to the original beam.

In theory the physical laws of electromagnetism are time reversible, but in practice the effort to revert some systems is too demanding (you can even do better -- see Maxwell's demon); manipulating the physical apparatus and acquiring/manipulating the information itself has a cost that surpasses any gains.


The moon is very imperfect; it's darker than asphalt. http://wtamu.edu/~cbaird/sq/2015/08/06/why-is-the-moon-so-br...


A perfect reflector exists only in theory. But we still put (imperfect) reflective coatings on spacecraft.


Just a minor correction—I don’t see any reason why a perfect reflector would happen to be 0K, but since it has zero emissivity, you wouldn’t be able to measure the temperature. Which is not quite the same thing as 0K.

In practice, any reflector is imperfect, and therefore has nonzero emissivity, and therefore goes towards thermal equilibrium with its environment.


Except the moon is acting as a (partial) reflector, and so is part of the optical system. So, really, you're limited to the temperature of the Sun, not to that of the moon. So the conclusion is still potentially incorrect.


That’s really stretching “optical system” if you include the moon.

Sure, if we reshaped the moon into a giant mirror, we could use a lens to light a fire using the “moon”.


First, some light is reflected from the moon without increasing the temperature of moon rocks. Second, the moon landings avoided peak moon temperatures, making the fact that astronauts survived irrelevant. Just look at how long their shadows were; it was a long way from noon and 127°C surface temperatures.

Finally, paper burns at ~233°C, but plenty of things ignite at lower temperatures. Hay's ignition temperature is ~130°C, which, considering the moon reaches 127°C and you will concentrate extra reflected sunlight, seems very viable.

However, the earth’s atmosphere blocks a lot of moonlight, so it’s more likely to work in a pressurized cabin at high altitude or spacecraft. Even your lenses are going to be an issue.

PS: A lunar day is almost a month long. They are casting long shadows which means they are a long way from noon. https://en.m.wikipedia.org/wiki/Apollo_11#/media/File%3AAldr...


Yeah, this was my immediate thought as well. Yes, you can't concentrate solar energy to heat something hotter than the surface of the sun. But the emission of a 100°C blackbody is not visible either, so it's clearly an incorrect hypothesis on its face that this temperature causes moonlight.

It's a bad mirror, not a radiator.


And it's a mirror that subtends a tiny fraction of the sky, and a tiny fraction of the sun's blurred virtual image.


For sure, the light is super dim. I'm not sure it's possible to focus the energy enough to light a fire. I'm just sure the reason is not because moonlight is entirely composed of radiation from a 100C blackbody. This is just immediately obvious because we can see it, so this whole article is a little weird.


>Moon surface temperature argument is incorrect

To the extent that the moon acts as a greybody under sunlight, it is correct. And like most things, the moon will be close enough to a greybody that you could use the surface temperature as a first order approximation. It isn't necessary of course, since you can easily just directly estimate the amount of scattered / re-radiated energy from the amount of sunlight falling on it, without needing to use the temperature.

>Hence the question is now - can we concentrate moon light enough so that intensity at the concentration point is higher than thermal loss into environment (only then we will be able to raise temperature in the concentration area enough for combustion - remember that light is "hot" enough for this)?

No, because the body you're trying to heat up will act as an approximate greybody, so (neglecting conduction and convection which could further lower the temperature) its temperature has to max out at the Stefan-Boltzmann temperature represented by the total radiation.
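A quick sketch of that Stefan-Boltzmann ceiling, for a flat graybody facing the Sun at the Moon's distance (solar constant is an assumed standard value); note it lands near the ~120°C subsolar temperature discussed in this thread:

```python
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
S = 1361.0       # solar constant at ~1 AU, W/m^2 (the Moon is ~1 AU from the Sun)

# Flat graybody facing the Sun, radiating from the lit side only.
# Energy balance: eps * S = eps * sigma * T^4 -> emissivity cancels.
T = (S / sigma) ** 0.25
print(f"subsolar equilibrium temperature ~ {T:.0f} K ({T - 273.15:.0f} C)")
```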


Leaving this comment to make sure that others don't get confused by your response.

1. As you mentioned, the moon acting as greybody absorbs part of the radiation and scatters the rest. In the second sentence you mention that the moon's temperature is enough to describe this radiation. In the third sentence you mention that you don't need moon's temperature to do that. So which is which?

Correct answer - to describe re-radiated energy we need moon's temperature, but to describe scattered we don't. We can ignore re-radiated, since it is not visible light and not hot enough. We can use scattered, since it is visible and hot enough.

2. Not sure why it is "No" if you are agreeing with me. Also not sure what you mean by temperature represented by total radiation (as you mentioned in part 1 there are two parts - re-radiated - at temperature of the moon, and scattered - at temperature of Sun).

To see that there is a problem with your argument - first note that the spectrum of black-body radiation dominates the spectrum of grey-body radiation [1] (i.e. if the black-body radiation is not in the visible spectrum, the grey body's will not be in the visible either). Then note that at 100 degrees C there would be no visible radiation. So the light that we see from the moon cannot be this cold. It's coming from the sun and it is hot enough (since it is in the visible spectrum).

[1] https://claesjohnsonmathscience.wordpress.com/2012/04/21/gre...


>Correct answer - to describe re-radiated energy we need moon's temperature, but to describe scattered we don't. We can ignore re-radiated, since it is not visible light and not hot enough. We can use scattered, since it is visible and hot enough.

When dealing with a gray-body, the equilibrium temperature of the body will be equal to the effective temperature of the incoming light, which will be equal to the effective temperature of the re-radiated + scattered light, since at equilibrium energy out equals energy in. So, assuming the moon is a graybody (and most objects tend to be roughly graybodies), we can use the surface temperature of the moon in our calculations instead of the effective temperature of the light that falls on it.


Suppose you are right. The effective temperature of the moon is 100 degrees Celsius. It is a grey-body, so its spectrum is dominated by a black-body's - e.g. it would be similar to the spectrum of a black body at a temperature lower than 100 degrees Celsius. So your argument is that the light we see from the Moon is the same as the light we would see from a black body heated to less than 100 degrees C? Try putting charcoal in boiling water and please report to us if you see it light up as white as the moon.


I think you're mixing up spectral temperature with effective temperature.

Effective temperature is about total power (per the Stefan Boltzmann law), not about the color of the light (per Wien's law).

Compare https://en.wikipedia.org/wiki/Effective_temperature with https://en.wikipedia.org/wiki/Color_temperature


The moon surface is not in equilibrium as there is thermal flow towards moon interior.


> And like most things, the moon will be close enough to a greybody that you could use the surface temperature as a first order approximation.

No, it won't, and no you can't. The OP already pointed out that the Moon is too cold for its blackbody radiation to reach the visible. All the visible light from the Moon is reflected from the Sun. The Sun's radiation's blackbody temperature is the ultimate limit, here, not the Moon's.


As I thought more of why one might get confused with this question is because one might miss that there are two effects here, not one.

To have combustion we need two things - light of high enough temperature and light of high enough energy concentration. The two are not the same. All visible light has high enough temperature. However, the concentration is the problem.

Why do we need high concentration for combustion? If we don't supply enough energy to compensate for heat loss, the area will never get hot enough.

By the way - an interesting corollary of this is that even with sunlight - if we have a mechanism that takes away heat fast enough (say by blowing cold air at the area) we would not be able to reach combustion with a large mirror in direct sunlight.


"if we have a mechanism that takes away heat fast enough"

An interesting demonstration of this idea is putting a paper cup full of water into a fire. It won't burn where the water is touching it and eventually the water will start to boil.


Correct! Concentration is technically called energy flux, and is measured in W/m².


What you say about blackbody being a red herring makes sense to me.

What do you think of the argument at this link, regarding maximum concentration achievable through optics?

https://en.wikipedia.org/wiki/Etendue#Maximum_concentration

I get that that implies a maximum concentration factor of about 10,000, for a light source subtending an angle of 0.54 degrees (the moon's angular size from Earth, and also approximately the sun's.)
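For what it's worth, the etendue formula from that link is easy to check; the ~10,000 factor corresponds to plugging the full 0.54° angle into 1/sin², while the conventional half-angle gives roughly 45,000 (either way, far less concentration than direct sunlight's advantage over moonlight):

```python
import math

theta_full = math.radians(0.54)  # full angular diameter of the source

# Max 2D concentration from etendue conservation (n = 1): C = 1 / sin^2(alpha),
# where alpha is the source's HALF-angle as seen from the collector.
C_half = 1 / math.sin(theta_full / 2) ** 2
C_full = 1 / math.sin(theta_full) ** 2  # what you get using the full angle instead

print(f"~{C_half:,.0f}x (half-angle convention), ~{C_full:,.0f}x (full angle)")
```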

Dylan16807 helped to prod me in this direction. https://news.ycombinator.com/item?id=18739190


Right. We can't concentrate light brighter than it is on the surface of the Moon HOWEVER, because the light's effective temperature is much higher than the temperature of the Moon itself, we CAN use frequency-selective filters to heat up a sample, in principle to the point of combustion. Thermodynamics allows this. Basically, the greenhouse effect.


Looking through the comments, there is still lots of misunderstanding. So let's get deeper into the topic - in particular, how stuff gets heated by light and what light "temperature" is.

Thermodynamic arguments are always good, but you really need to understand that most of them describe closed systems at equilibrium. The real world is messier, and if your understanding of thermodynamics is shaky it's easy to come to wrong conclusions. The black- and grey-body discussion is in the same realm, so we will avoid it here.

Let's consider a piece of wood and try to see what it takes to combust it. We need two things - oxygen and a high enough temperature. We are keeping this piece of wood in air, so we have oxygen. The temperature is how fast the particles that compose the piece of wood are moving. The wood consists of molecules, which in turn consist of atoms. You can imagine a molecule as a bunch of atoms connected to each other by springs (the springs are created by electromagnetic forces when atoms lend and borrow electrons). The higher the temperature, the larger the oscillations of these springs. When the temperature is high enough, some of these springs can break and combine with oxygen to release energy - combustion.

Suppose light strikes the piece of wood. What happens? Photons hit molecules and can interact with an electron or proton - they exchange momentum (and energy), which means that one of the atoms in a molecule gets a bump - the springs start oscillating harder.

Note, low energy photons cannot swing the springs a lot. We need photons of high energy to swing a spring to high temperature (ok - this part is very oversimplified; I can expand on this if there are questions). The energy of a photon depends on its wavelength - the shorter the wavelength, the more energetic the photon. So we need photons with energy that could break molecular bonds (springs) to be able to heat up to combustion temperature. Visible light definitely has photons of such energy.

So we start sending light at our piece of wood, and it starts heating up. But it sits in air and is probably held by some supporting stand. The piece of wood starts exchanging heat with the surrounding materials. To overcome this we need to send lots of photons at the piece of wood. How do we do this? We use a lens to collect the photons from a broader area and send them down to the wood. Note that it doesn't matter where the photons came from, whether they were scattered or produced by the sun, what the temperature of the scattering surface was, or even what the temperature of the sun was. All that matters is - can we collect enough high energy photons. So we would need to calculate the energy flux of visible-light photons at the Earth's surface and see if we could come up with a lens to focus these photons on the piece of wood to create sufficient intensity.

We know that we can see the Moon in visible light, so it does send a bunch of energetic photons to us. All these photons are coming from the moon's direction (so we don't care that the moon scatters them in other directions as well). So the task is to find the energy flux of these photons at the earth's surface and what size of lens we would need to get sufficient intensity.


Your argument about low energy photons being capped at a black body level for heating is nonsense; a system can absorb all photons it can emit, so typical matter can more or less absorb everything. Once energy from a photon has been absorbed it will diffuse in the material, increasing its temperature; energy quanta don't come into consideration here.


I am not an expert, but on your oversimplification of 'high energy = higher wavelength':

Whilst this is true, it is also a bit irrelevant. Gamma rays are higher energy than visible light, but you don't need to focus gamma radiation (should that be easily possible) to start a fire; you can do it with visible light, or infrared light (which is lower wavelength again) at high enough quanta


I don't have much to add, but I believe the poster you are replying to said that 'High energy = shorter wavelength'.

Equivalently, higher energy photons are higher frequency.


Higher energy = shorter wavelength is not an oversimplification. The formula is: photon energy is Planck's constant times the speed of light divided by the wavelength. You're right - you don't need gamma rays, but you cannot do it with low frequency radio waves. You need photons that are energetic enough to break chemical bonds.
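A quick numerical check of that formula (the ~3-5 eV bond-energy range is a typical textbook figure, not from this thread):

```python
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

def photon_energy_eV(wavelength_nm):
    """E = h*c / lambda, converted to electron-volts."""
    return h * c / (wavelength_nm * 1e-9) / eV

# Visible light spans roughly 400-700 nm; a green photon carries ~2.3 eV,
# comparable to typical single-bond energies of ~3-5 eV.
print(photon_energy_eV(550))
# A 1 m radio photon carries only ~1e-6 eV, far below any bond energy.
print(photon_energy_eV(1e9))
```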


Depends on what you consider low frequency. Radio waves can be used to excite vibrational modes. The resultant thermal energy can be used to break chemical bonds.


Trying to think about this some.

If the Moon was actually a giant mirror-lens thing that was the size of the Moon, but perfectly shaped to reflect and concentrate sunlight shining on it into as small an area as possible on the Earth's surface, you'd start a fire with that very easily. Probably turn a county-sized area of the crust into lava or something actually.

On the other hand, if the Moon was a perfect reflector, but a big sphere/hemisphere pointed at the sun, then it would receive a lot of high-energy radiation, but reflect it back out in every direction all over the universe. I'm not sure of the math for that offhand, but it sounds difficult to concentrate it back to something even as concentrated as direct sunlight. I suppose it would involve considering the fraction of the total solid angle the reflecting area of the Moon would shine at. Even the entire Earth's surface would be a really tiny fraction of that area, so sounds impossible to concentrate things back to fire-starting intensity.

Assuming these scenarios are basically right, then the actual Moon is a lot closer to the second than the first. So starting a fire isn't practical with any lens system you could build on Earth. Maybe you could build some massive mega-lens thing in space near the moon that concentrated enough of it back on the Earth to be near-sunlight brightness. Or just set up a more manageable sized reflector on the Moon itself, or in space, to reflect sunlight in a more concentrated way to a small area on the Earth.


I think there may be an astronomer or two (at Lick or McDonald?) who have 30"+ telescopes with actual human-usable eyepieces (most only have CCD mounts), who could actually run this experiment.

I'm sure they don't want smoke in the observatory, but you could do it in a box. Then you could fill the box with variable oxygen levels (to make up for altitude) and substitute N2 with Ar/Kr/Xe to change the heat dissipation.


Spectral composition is completely irrelevant for typical materials and temperatures; it is just watts per square meter that matters. You can't concentrate watts per square meter to a higher level than the light had at its quadratic-falloff source. You can show that this is true using the blackbody example: concentrating its light further would break the laws of thermodynamics. Moonlight has quadratic falloff from the moon, not the sun, and is therefore impossible to concentrate further; at best you can use lenses and mirrors to get the same light intensity the moonlight has on the Moon, which of course isn't enough to heat anything above the temperature of the Moon.


> You can't use lenses and mirrors to make something hotter than the surface of the light source itself.

This is an interesting argument. Can I not reflect some sunlight off a mirror, then do the magnifying-glass-to-start-a-fire trick in daytime? Doesn't the mirror stay cool? Isn't the moon just a (poor) mirror for the sun's light?


You've found the right question to ask.

Your mirror in sunlight works because the reflectivity or albedo of the mirror is very high relative to whatever target you're lighting on fire.

In a magical closed system where radiative heat transfer was the only factor, objects of differing reflectivity would eventually reach temperature equilibrium through black body radiation.

We aren't interested in closed systems though. The moon's temperature is set by the equilibrium between incident solar radiation and black body emission, most of which flies off into deep space making the system very open. Just like your mirror's temperature is the equilibrium of incident radiation, black body cooling, and convective cooling in Earth's atmosphere.

If the moon had a high reflectivity and/or a powerful cooling mechanism like convection, its equilibrium temperature would be far lower than the temperature of its emitted + reflected light. Unfortunately the moon's albedo is just 0.12 and black body radiation is all it's got, so the modest difference between its temperature and that of its light isn't enough to start a fire.


The equilibrium temperature of a grey-body is actually independent of its emissivity. A black-blackbody absorbs more radiation than a white-blackbody, but it also emits more. The emissivity factors in both on the absorption and emission side, so the equilibrium temperature is independent of it.

If this seems to fly in the face of all common experience, it's because things that look white aren't actually white in the infrared. Their emissivity is lower in the optical spectrum (where they absorb sunlight) than in the infrared (where they emit), so they are cooler than things that look black to us. In short, they aren't grey-bodies.

The situation is also more complicated because, as you say, that things we are used to also cool by other mechanisms.
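A minimal sketch of why the emissivity cancels for a true graybody (radiation-only balance, emissivity assumed equal at all wavelengths):

```python
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
S = 1361.0       # incident flux, W/m^2 (assumed round value)

def equilibrium_T(emissivity):
    # Absorbed: eps * S ; emitted: eps * sigma * T^4.
    # The emissivity appears on both sides, so it cancels out of T.
    absorbed = emissivity * S
    return (absorbed / (emissivity * sigma)) ** 0.25

# Identical for any emissivity > 0:
print(equilibrium_T(1.0), equilibrium_T(0.1))
```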


Can you describe what black-blackbody and white-blackbody are?


Really?

According to the Stefan-Boltzmann Law, the energy put out in blackbody radiation is proportional to temperature to the 4th power. Therefore light from something at 5000 degrees reflected off an object with albedo 0.12 is putting out 0.12 times the energy it originally did, while something at 2500 degrees only puts out (1/2)^4 = 0.0625 times as much. So the "temperature of the Moon's light" should be more than hot enough to light something on fire if it is focused right.

As a sanity check, compare how much light the moon puts out as a black body in shadow with what it reflects from the Sun. As another sanity check, compare how bright the Moon is versus a fire.

What am I missing here?
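The arithmetic above, for reference:

```python
# Stefan-Boltzmann: radiated power per unit area scales as T^4.
# Light from a ~5000-degree source reflected at albedo 0.12 still carries
# 12% of its original power, whereas a source at half the temperature
# emits (1/2)^4 = 6.25% as much per unit area.
reflected_fraction = 0.12
halved_T_fraction = (2500 / 5000) ** 4
print(reflected_fraction, halved_T_fraction)
```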


The angular concentration of the light.

The moon is diffuse, so an incoming ray of sunshine is spread by the optically rough surface of the moon from an incident solid angle of 6 * 10^-5 steradians out into 2 pi steradians of the night sky, or a reduction in angular concentration by a factor of about 100,000 (totaling ~1 million after the albedo is accounted for).

It is like you are looking at the sun through a mirror so rough that the image of the sun is blurred over literally half the sky. Because this process does not create new photons, the blurred image must be far, far dimmer. This circumstance corresponds to the "most we could do" with lenses and mirrors focusing the moon, which is to fill the sky with an image of the moon/"blurred sun".

Unconcentrated moonlight corresponds to the same picture, except we only see a "cutout" disk of this blurred sun-image which is the size of the moon in the sky. Our crappy moon-mirror does not fill our vision, it is a porthole letting through only a tiny fraction of the blurry sun-image.

And of course if you imagine yourself as the ant under the magnifying glass, with your entire sky filled with moon, there is no way you could spontaneously become hotter than your moon-y surroundings.
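The ~100,000x dilution factor above checks out roughly from the solid angles (the Sun's angular radius from the Moon is an assumed value here):

```python
import math

alpha = math.radians(0.267)  # Sun's angular radius as seen from the Moon (assumed)

# Solid angle of the incident sunlight vs. the hemisphere a diffuser scatters into
omega_sun = 2 * math.pi * (1 - math.cos(alpha))  # steradians, ~6e-5 sr
omega_hemisphere = 2 * math.pi                    # steradians

dilution = omega_hemisphere / omega_sun
print(f"incident solid angle ~ {omega_sun:.1e} sr")
print(f"angular dilution factor ~ {dilution:,.0f}")
```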


This is the primary reason. You can no more start a fire with sunlight reflected off the moon at night than with sunlight reflected off a sheet of paper during the day. (In fact you can do better with the sheet of paper, because you could in theory surround it with lenses to recapture the diffuse light.)


As a comparison the moon and sun are roughly the same size in the sky from earth. The sun puts out about 1000 watts per square meter on a clear day at noon in summer, the moon puts out about 0.0025 watts per square meter during full moon on a clear day, 1/400000 as much.
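Taking that 1/400,000 ratio at face value, here's a sketch of the radiative-equilibrium temperature at the focus of the best lens the etendue limit allows (the Moon's angular radius is an assumed value):

```python
import math

sigma = 5.67e-8                  # Stefan-Boltzmann constant, W/m^2/K^4
E_moon = 1361 / 400_000          # moonlight irradiance from the 1/400,000 ratio, W/m^2
half_angle = math.radians(0.26)  # Moon's angular radius from Earth (assumed)

C_max = 1 / math.sin(half_angle) ** 2  # etendue limit on concentration
flux = E_moon * C_max                  # best-case flux at the focus
T = (flux / sigma) ** 0.25             # blackbody equilibrium at that flux

print(f"focused flux ~ {flux:.0f} W/m^2, equilibrium ~ {T:.0f} K")
```

Under these assumptions the focus equilibrates well below room temperature, nowhere near ignition.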


Your figure for the energy from the Moon seems overstated. https://education.seattlepi.com/moonlight-strong-enough-powe... quotes a ratio of 2.3 million to one.

However the key point is this. The Moon is in reality 400 times smaller than the Sun. Which means that an optimally placed lens can actually make the image of the Moon 400 times smaller, i.e. 1/160,000 the area. If the lens is big enough that this target area is mostly losing heat through black body radiation, again it should wind up at over half the temperature of the surface of the Sun. Which is more than hot enough to start a fire.


> something that is 5000 degrees reflected off of an object with albedo 0.12 is putting out 0.12 times the energy it originally did, while something that is 2500 degrees only puts out (1/2)^4 = .0625 times as much

I don't understand this bit. I don't think there's any power law involved in reflection, if something has an albedo of 0.12 then it just reflects 0.12 times the incident radiation, doesn't it?

I think I agree with your overall point, though, which is that the moon isn't a black body radiator (well it is but only at a couple of hundred degrees C at most) but is just reflecting the sun's light (and those photons are hot enough to start fires).


All photons can start fires. Individual photons aren't hotter or colder, just photons at different energy levels/wavelengths/relative velocity. Heat happens once photons collide with stuff. Heat is a group effort. Get enough photons to hit something and it will warm. Photon colour, and the reflectivity of the struck object, alters the needed number, but with infinite photons fire (300*?) is always possible.

Fiber optics could probably collect and point enough moonlight to light a match. If we call fiber a sort of flexible lens, then lenses can start fires.


Sorry, this is wrong. Please don't "explain" your misunderstanding as if you know the answer; ask questions instead.

If we imagine a full sphere of inward-pointing fibers, each one "looking" at the moon, then we see moon-surface in all directions from within this contraption. We are in a thermal bath of moon-temperature. We will not get hotter than the moon.

And fibers are not needed to create this circumstance. The same situation ("moon visible in all directions") could be created with a few lenses and mirrors.


Just collect all the photons from the moon using solar panels (moon panels?) Store the energy in a battery. When you have a big enough charge use it to light whatever you want. Sure, it isn't practical and it is not what you guys are really talking about, but you can definitely make something hotter than the moon, as long as you are willing to cheat.


The point is interesting. A CO2 laser at 10 microns can melt steel, which emits a whole lot of 0.5 micron photons.

The problem with sunlight reflected off the moon is that it's like having a very low transmittance ND filter in the optical path. There aren't a whole lot of photons entering the aperture, and as the material begins to heat, there are more photons and overall probably more energy going out. (I'm not giving an answer here, just some thoughts.)


What's ND?



Not the system I was talking about. None of the fibers would "look at the moon". Fiber isn't the same as lenses.

Note too that I said match. Getting a tiny bit of a matchhead hot enough to decompose isn't the same as burning an ant. A matchhead on the surface of the moon would probably ignite just fine (150*+o2).


Fiber is very much the same as lenses.

(Well, the fiber you’re thinking of is. Fluorescent fiber is not, but that’s another story.)


Again you are "explaining" when you should be asking.

> None of the fibers would "look at the moon". Fiber isnt the same as lenses.

I think you meant to ask "how is a fiber different from a lens?" instead of asserting "fiber isn't the same as lenses," which turns out to be completely wrong.

To a physicist, a lens is a shaped piece of refractive medium-- that is exactly what a fiber is. There is no magic inside the fiber; it does not add any photons. You are looking through a shaped piece of glass, through which you will see a (possibly very distorted) image of what's on the other side.

In this case, it's the moon that is on the other side. The most you could do is surround your match/ant/whatever with fibers/lenses which are showing the moon on the other side of them. And from this fact it is unavoidable that you cannot make your surrounded subject hotter than the surface of the moon. If you are trying to say anything to the contrary, I'm sorry, but you are Flat Wrong.

Any other configuration of fiber would give you less moonlight than completely surrounding the subject, so it's not going to improve the odds of starting a fire.

> Note too that i said match. Getting tiny bit of a matchhead hot enough to decompose isnt the same as burning an ant.

I think you meant to ask "does the material or shape change anything?" instead of asserting "it isn't the same," which again, in the context of this question, turns out to be completely wrong.

The maximum temperature that can be imparted by a lens/mirror system doesn't depend on what it's focused on at all. In the long run, the ant/match/whatever will reach that maximum temperature and not get hotter. From a thermodynamics perspective, it is like putting an object in an oven of a particular temperature-- it doesn't matter if you put in a brick, or a cake, or a match, or an ant; eventually they will all be 250 degrees if the oven is set to 250 degrees. In this case, the "oven" is the moon, and it's set to about 120C. A match combusts at about 600C. There will be no combusting of the match by putting it in the moon-oven. Sorry. End of story.
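The oven analogy can be sketched numerically (a minimal illustration with made-up constants, not anything from this thread): under Newton's law of cooling, every object relaxes to the oven's temperature; the material only changes how fast.

```python
# Minimal sketch of the oven analogy: under Newton's law of cooling,
# dT/dt = -k * (T - T_env), every object ends up at the environment
# temperature regardless of its material (the rate constant k only sets
# how quickly). All names and constants here are illustrative.

def equilibrate(t_start, t_env, k, dt=0.01, steps=200_000):
    """Forward-Euler integration of Newton's law of cooling."""
    t = t_start
    for _ in range(steps):
        t += -k * (t - t_env) * dt
    return t

# A brick, a cake, and a match (different k) in a 250-degree oven all
# converge to 250 degrees:
for k in (0.5, 1.0, 5.0):
    assert abs(equilibrate(20.0, 250.0, k) - 250.0) < 1e-6
```

The same convergence happens from any starting temperature, which is the point of the analogy: the oven (here, the moon's brightness) sets the ceiling, not the object placed in it.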

And if I can give some advice: If what you're hearing doesn't make sense to you, it is always safer to ask a question than to blindly assert your gut-guess of how you think it is. Someone who thinks they're an expert but spouts nonsense looks like a fool. Someone who asks a question, though, looks curious, which is smart. These two people have the same degree of knowledge, but one comes off looking much worse than the other. That's why asking questions is better, especially when you don't know how much the people listening already know.


If your fiber network could heat the match hotter than the moon, why wouldn't the match heat the moon instead? How do you cram all the ends of those fibers (with total surface area equal to the moon) into a target smaller than the moon? Fiber optics aren't lasers.


Because reversing the system isn't straightforward. Fiber tends to bend light back towards the middle of the fiber. Shining light back down the middle doesn't mean it will reappear at the same point, which is important for many types of lasers.

Archers see this. Many archery sights use fiber to make nice illuminated dots, without batteries or LEDs. The light coming out the end of the fiber is brighter than the fiber's skin (its side surface), a rare practical use of "naked" optical fibers.

https://www.nanoptics.com/service/replacement-bowsight-fiber...

The above are junk fiber (little internal lensing) but you can see the effect.


> black body radiation is all it's got

No, reflection and black-body emission are different. The moon primarily produces the former.

Moonlight is not the result of the moon glowing incandescent.

The surface is very cold, its emissions as a black-body radiator are in the far infrared.

Moonlight is white light with a color temperature of several thousand Kelvin. This is reflected sunlight and has nothing to do with black-body radiators.


If the light is being radiated from a black body surface, then you cannot make something hotter than that surface. But you're right, the light from the moon is reflected sun light, plenty hot to start a fire. The moon also radiates like a black body, but virtually all that light has longer wavelengths than the reflected visible light.

You could start a fire from the light of a single star if you had a big enough lens. You could also start a fire with the light from a hand mirror at the distance of the moon if it reflected the sun's light at you. But you'd need a very big lens.
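A rough sizing of "very big" (every number below is an assumption for illustration, not something from the comment: ~1 kW/m^2 for ground-level sunlight, ~1e-7 W/m^2 for the bolometric flux of a bright star like Sirius, and a 10 cm hand magnifier as the sunlight baseline):

```python
import math

# Sketch: how large a lens would need to be to collect as much starlight
# power as a 10 cm hand lens collects sunlight. All inputs are assumed
# round numbers for illustration.

SUN_IRRADIANCE = 1000.0    # W/m^2 at ground level, assumed
STAR_IRRADIANCE = 1e-7     # W/m^2, rough flux of a very bright star
HAND_LENS_DIAMETER = 0.10  # m

power_needed = SUN_IRRADIANCE * math.pi * (HAND_LENS_DIAMETER / 2) ** 2
area_needed = power_needed / STAR_IRRADIANCE
diameter_needed = 2 * math.sqrt(area_needed / math.pi)
print(f"collected power target: {power_needed:.1f} W")
print(f"starlight lens diameter: {diameter_needed / 1000:.0f} km")  # ~10 km
```

So "very big" here means an aperture on the order of kilometers, before even worrying about focusing it.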


> But you're right, the light from the moon is reflected sun light, plenty hot to start a fire.

That's my impression as well. A blackbody has albedo 0. The moon has an albedo of around 0.12. While I suspect you can't start a fire from moonlight in practice, I don't think the arguments in this article are correct.


While I suspect you can't start a fire from moonlight in practice, I don't think the arguments in this article are correct.

I guess if you're reading that as some kind of absolutely logical argument. I read stuff like that as an abstraction which just kinda works in the messy real world. Pretty much like how the typical explanation for how a wing works turns out to be an over-simplification. It only partly works that way. Actual wings are complicated, but in the aggregate, they just get enough air molecules to go downward to net out the forces to keep the plane from going downward. It turns out that there are a lot of mechanisms contributing to this all at once. (Which is something else he discusses in that series.)


It’s presented as an absolute logical argument of the form “you can’t use a lens to make anything hotter than the average surface of the moon.”


And it "kinda" works in the same way the simplified Bernoulli explanation of the wing works.


>While I suspect you can't start a fire from moonlight in practice

Ivanpah may be able to, I wonder if we can borrow it at night for an experiment - https://en.wikipedia.org/wiki/Ivanpah_Solar_Power_Facility


Now this is an idea I like.


I wonder what the protocol is for 'borrowing' Ivanpah to point it at the moon at night and at what point in the exchange the phrase, 'well, you won't be using it', will finally occur.

edit - And by whom. Should probably get it in early, just in case.


>But you're right, the light from the moon is reflected sun light, plenty hot to start a fire.

It's mostly not reflected. It's mostly scattered. If it were specularly reflected, there would be a bright spot on the moon (where you could see the reflection of the sun), and you'd be able to concentrate light from that to light a fire.


With a sufficiently isolated vacuum flask containing the tinder and with an optical opening only large enough to view the moon, you wouldn't even need to concentrate the scattered light from the moon. The temperature of the inside of the flask would eventually reach the approximate blackbody temperature of the reflected light from the Sun (corrected for whatever frequencies are absorbed by the Moon).

The only reason the moon's surface doesn't reach such high temperatures is that the moon's surface is not thermally isolated from the rest of the night sky or from the body of the moon itself (which is also not isolated from the night sky).


Thunderf00t should totally make a lens to start a fire with starlight.


I think you'd need a different argument to show that's impossible, since the surfaces of stars are very hot, and their light is unmediated.


The thermodynamic argument applies to black bodies, and says that you can't make an object hotter than the surface of the emitter. That's pretty uncontroversial.

The more general argument based on etendue, which applies to the moon, is: you can't make the incoming light any brighter than it is on the surface of the source (which doesn't have to be the original emitter, but can be any point along the path of the light). As a corollary this happens to mean you can't really make something hotter than a rock on the moon.

Suppose the moon was actually a flat mirror. Standing on the mirror-moon, you look at the ground. In most directions you would see the darkness of space (with a few reflected stars), but in one spot you would see about ~(0.5°)^2 solid angle of extreme brightness - the sun's reflection in the mirror. Standing on the mirror-moon, you could certainly use a magnifying glass to heat something to ignition temperature (ignoring the lack of oxygen) by making its environment that bright using the reflected light.

Similarly, if the moon was a mirror, you could certainly use the reflected light to start a fire standing on earth (provided you're lucky enough for the moon/earth/sun to line up just right so that the reflection of the sun is visible through the small angle subtended by the moon from earth). It would basically appear as another sun in the sky when lined up properly.

In reality, instead of reflecting light like a mirror, the moon scatters light in all directions. Standing on the actual moon, looking at the ground, what you see is a lot of moon dirt, all of which is about equally bright (i.e. not very). The scattering smooshes the sun's light out in all directions, ensuring there's no visible "reflection" of the sun in any direction.

The most you can do with this scattered moon light is make the environment of an object as bright as the (mediocre) brightness of the moon rocks. But moon rocks already experience that environment of mediocre brightness, and reach only 100°C, so you won't be able to make your object much hotter than that.
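The etendue argument can be put in numbers (a sketch under assumed inputs: the standard nonimaging-optics result that a diffuse source subtending half-angle theta can be concentrated at most ~1/sin(theta)^2, and ~1 mW/m^2 as a round figure for full-moon irradiance):

```python
import math

# Sketch of the brightness/etendue limit: for a diffuse source subtending
# half-angle theta, conservation of etendue caps any passive optic's
# concentration at about 1/sin(theta)^2. Moonlight irradiance is an
# assumed round number, not a measured value from this thread.

MOON_HALF_ANGLE = math.radians(0.25)  # moon subtends ~0.5 degrees total
MOONLIGHT = 1e-3                      # W/m^2, assumed full-moon irradiance
SUNLIGHT = 1000.0                     # W/m^2, for comparison

c_max = 1.0 / math.sin(MOON_HALF_ANGLE) ** 2
best_irradiance = MOONLIGHT * c_max
print(f"max concentration: {c_max:.0f}x")
print(f"best achievable irradiance: {best_irradiance:.0f} W/m^2")
```

Even the ideal optic leaves concentrated moonlight around the level of plain, unconcentrated sunlight or below it, which is consistent with "not much hotter than a moon rock".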


you won't be able to make your object much hotter than that.

I'm bothered by the modifier "much". If you are indeed talking about a physical principle, shouldn't this be an absolute limit rather than a suggestion? How much hotter does physics allow you to go? Are you sure it's not enough to allow ignition?

Along those lines, I'd assume that the surface temperature of the moon depends on its shape and thermal conductivity. If I were to change the moon to be an ultra-thin and highly heat-conductive hemispherical shell rather than a solid sphere, I'd assume the surface temperature would drop.

Assuming the amount of light reflected remains the same, does this imply that the maximum achievable temperature on earth with a magnifying glass drops as well? I don't see any physical reason that it should, but your logic would seem to imply that it must. Can you explain?


> I'm bothered by the modifer "much". If you are indeed talking about a physical principle, shouldn't this be an absolute limit rather than a suggestion?

It's an absolute limit on the amount of incoming irradiance you can create to your object. The actual equilibrium temperature it reaches will depend on additional factors like how well your object loses heat (eg. by conduction) compared to a moon rock.

In this case, the temperature of moon rocks is probably a reasonable upper bound of the achievable temperature of an object on the earth:

- Moon rocks are in vacuum, while something on earth is in contact with air and dissipating heat by convection.

- Moon rocks are in contact with the surface of the moon (~100°C), whereas an earth object is in contact with the ground, or your hand, or whatever (~37°C, assuming your hand). So heat loss by conduction will be greater on the ground.

If you rigged up something to suspend your object in vacuum without touching anything so that conductive heat losses ~0, maybe you could get something slightly hotter than the average surface temperature of the moon. But not hotter than a well placed moon rock that already happens to be making near 0 contact with the moon's surface (due to standing on a point or something).

> If I were to change the moon to be an ultra-thin and highly heat conductive hemispherical shell rather than a solid sphere, I'd assume the surface temperature would drop.

In that case much more heat would escape around to the unlit side, and the moon's surface temperature would reach somewhere between the "day" (~100°C) and "night" (~-200°C) temperatures. Say around -50°C. In that case the surface temperature will be less representative of that achievable for an object on earth. A moon rock touching the ground would be in contact with -50°C, which is colder than the 37°C for an object held in your hand.


Thanks for the reply, and I think I agree with all the physical processes you describe, but I'm not convinced that your approximations are correct. I'm going to keep pushing a bit to see if we can resolve this as well.

[The surface temperature of the moon is] an absolute limit on the amount of incoming irradiance you can create to your object.

This is true for a black body, but why are you convinced this is true for the actual moon? I think we agree that a more reflective moon could have a lower surface temperature while increasing incoming irradiance on the earth. And we both agree that the moon is partially reflective. Doesn't this mean that the surface temperature is not an absolute limit?

I think the correct statement is that the intensity of light from the sun to the moon gives a limit on both the surface temperature of the moon (highest if we assume the moon is a blackbody) and a limit on the amount of sunlight reflected toward the earth (highest if we assume the moon is a perfect reflector).

Since the moon absorbs about 90% of the light incident on it, we can assume that the surface temperature is lower than it would be if it was a perfect black body, presumably reaching a temperature corresponding to a sun that was about 10% less strong. The 10% of light that is reflected, although diffused in all directions, is much more intense when viewed from earth than low energy blackbody radiation that is also emitted. We know this intuitively because the sunlit moon is much brighter at night than the non-sunlit portion, and because the visible light is more energetic than the infrared, but could integrate across the energy spectrum to find an exact answer.

As such, unless we are willing to make some additional assumptions, I don't think we can make any firm claim about the maximum temperature achievable on the earth using lunar reflected sunlight based only on knowledge of the surface temperature of the moon. In practice, the scattered sunlight doesn't provide a lot of energy, so heating with it will be difficult. But it's the energy incident on the earth that matters, not the temperature of the lunar surface.

Would you agree with this summary? Are there additional assumptions that you think should be added that would provide the tighter limit you want? Alternatively, is there something other than "[The surface temperature of the moon is]" that you think I should have substituted for "It's"?


> [The surface temperature of the moon is] an absolute limit on the amount of incoming irradiance you can create to your object.

It's the radiance (brightness) of the moon (as perceived at the moon) that is an absolute limit on the incoming irradiance you can create, because of conservation of etendue.

Separately, the fact that moon rocks (which experience that exact amount of irradiance) reach some given temperature X°C while losing very little heat to conduction (certainly less than an object on earth would), shows that this level of irradiance is insufficient to heat up your kindling above X°C.

This argument has nothing to do with the moon being a black body, or an approximate black body, or any such thing. Just the fact that moon rocks, which are exposed to this light (with very little conductive heat loss), function as a kind of thermometer which tells you how hot that light can make something[1]; and the answer is "a bit over 100°C"

> Since the moon absorbs about 90% of the light incident on it, we can assume that the surface temperature is lower than it would be if it was a perfect black body, presumably reaching a temperature corresponding to a sun that was about 10% less strong.

No... An object with 90% absorbance absorbs 10% less light energy, yes. But it also emits 10% less light energy, so the two effects cancel out and it reaches the same equilibrium temperature as a black body.

[1] https://en.wikipedia.org/wiki/Effective_temperature
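As a sanity check on the "thermometer" number (standard round constants, assumed here): a rock at the subsolar point, absorbing sunlight and re-radiating from its hot face, reaches roughly T_sun * sqrt(R_sun / d), independent of albedo for a gray body.

```python
# Hedged check of the subsolar "moon rock thermometer" figure, using
# standard round-number constants.

T_SUN = 5772.0   # K, solar effective temperature
R_SUN = 6.957e8  # m, solar radius
D = 1.496e11     # m, sun-moon distance ~ 1 AU

t_rock = T_SUN * (R_SUN / D) ** 0.5
print(f"subsolar rock temperature: {t_rock:.0f} K "
      f"({t_rock - 273.15:.0f} C)")  # ~394 K, ~120 C
```

That lands right at the "a bit over 100°C" / ~120°C figure used elsewhere in the thread.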


It's the radiance (brightness) of the moon (as perceived at the moon) that is an absolute limit on the incoming irradiance you can create, because of conservation of etendue.

Well yes, this is what it should say! Is this our disagreement? Because for me (and I think for most other dissenters in this thread) the whole problem we have with Munroe's argument is that he keeps coming back to the surface temperature of the moon as the limiting factor. If he was simply to say that the moon is not bright enough, then we'd probably all agree.

the fact that moon rocks ... reach some given temperature X°C ... shows that this level of irradiance is insufficient to heat up your kindling above X°C.

I think this is the real point of dispute. We agree that this is true if we are only considering pure blackbody radiation. What's not clear (at least to me) is that this equivalence is still true when you include the directly reflected light. That is, no one thinks that you can start a fire using only the thermal infrared light from a dark moon. The question is whether it's hypothetically possible with a sufficiently bright sun and sufficiently reflective moon, without raising the surface temperature. Can you point to something that makes this argument more directly?[1]

the two effects cancel out and it reaches the same equilibrium temperature as a black body

I need to learn more about this. I have trouble thinking it applies correctly here, because it's assuming the moon is a perfect gray body. I think this assumption falls apart if it's actually reflecting light, which in fact we know it is. Or am I wrong? Does a silver mirror in space actually end up at the same equilibrium temperature as a lump of coal? I guess it could. This wouldn't harm your argument much (the argument just requires that the temperature not increase), but would indicate that I'm not viewing things correctly.

Summarizing, I think the point of dispute is whether the surface temperature of an object in space can always be reasonably estimated from its brightness (and vice versa). We agree that it can be if it's a perfect black body. We agree that it's mathematically true if it's a "gray body". We disagree (I think) as to whether it's appropriate to make the simplification of assuming that all stellar objects are sufficiently close to "gray bodies" for the math to hold.

[1] Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body, noting that the surface temperature does not increase. I'd argue that when you increase the reflectivity, the moon gets brighter, and thus you can heat your object to a higher temperature. You seem to be arguing that because the surface temperature remains the same, the attainable heat stays the same, even though you can collect more reflected energy.


> Does a silver mirror in space actually end up at the same equilibrium temperature as a lump of coal?

Yes. It takes longer to reach the equilibrium, but it does reach the same equilibrium temperature.

> [1] Here's the outline of the counterargument. Start with a blackbody moon. Estimate that with perfect optics you can heat an object to X. Now increase the reflectivity of the no-longer-black-body, noting that the surface temperature does not increase.

Right, so, let's say the total solar power received by the blackbody moon is Ps. The moon increases in temperature until it reaches temperature Tb, at which the power emitted equals the power absorbed Pb = Ps. That is the equilibrium.

Now let's let the moon reflect (scattering) some light, by giving it a realistic albedo of 10%. Then of the solar power received at the moon, 0.9Ps is absorbed and 0.1Ps is scattered. But this albedo also causes the moon to emit 10% less light, so it now emits 0.9Pb at the same temperature Tb, hence it reaches equilibrium 0.9Ps = 0.9Pb at the same temperature.

Note that the total light energy leaving the moon is now: 0.9Pb (emitted thermal) + 0.1Ps (scattered) = 0.9Ps + 0.1Ps = Ps. Just the same as the amount that was emitted as a blackbody. So in fact it's not any brighter (in terms of power), just has a different spectrum.
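The bookkeeping above can be written out explicitly (a sketch; the gray-body model and the numbers are as assumed in the comment):

```python
# Gray-body energy balance: with albedo a, absorbed power is (1-a)*Ps and
# emitted thermal power at temperature T scales as (1-a)*sigma*T^4, so the
# equilibrium temperature is independent of albedo, while scattered plus
# thermal output always sums to Ps.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium(ps, albedo):
    """Return (equilibrium temperature, total outgoing power) per unit area."""
    absorbed = (1 - albedo) * ps
    # (1 - albedo) * SIGMA * T^4 = absorbed  =>  T independent of albedo
    t_eq = (absorbed / ((1 - albedo) * SIGMA)) ** 0.25
    total_out = absorbed + albedo * ps  # thermal + scattered
    return t_eq, total_out

t_black, out_black = equilibrium(ps=1000.0, albedo=0.0)
t_gray, out_gray = equilibrium(ps=1000.0, albedo=0.12)
assert abs(t_black - t_gray) < 1e-9    # same temperature
assert abs(out_black - out_gray) < 1e-9  # same total outgoing power
```

The two assertions are exactly the two claims above: same equilibrium temperature, and total outgoing power equal to the incoming Ps, just with a different spectral split.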


Thanks for sticking with me on this. Your "conservation of energy" argument is clearly correct. If we presume a constant temperature for the moon, the constant incident energy from the sun has to be going somewhere. That which is reflected is reflected, and that which is absorbed is eventually emitted as infrared. But the details!

The first is that the applicable "mirror" question here is whether the surface temperature of the sunlit side of a slowly rotating astronomical body is independent of reflectivity, not whether the eventual core temperature of a uniformly lit body is. Are you confident that the same reasoning applies? I'm not yet.

Next, if we are phrasing our question as to whether one can start a fire with a magnifying glass, we clearly do care about the difference between visible sunlight and low-temperature infrared. For Munroe's argument to really work, the surface temperature needs to be a proxy for the collectible visible light that would be used by a magnifying glass.

There's also the directionality: none of the visible light is going to be reflected toward the "dark" side. The thermal radiation is also directional, but not to the same extent. As we move toward greater reflectivity, presuming a bright full moon, we do get more of the total energy available on earth. How much more? I don't know.

Lastly, which we haven't discussed, there are options for insulating the heated object on earth that are optically transparent (to allow the concentrated light in) but infrared reflective (to prevent thermal radiation from escaping). I think this "privileges" the reflected sunlight so that we might indeed be able to achieve a higher temperature than the reflective moon surface in vacuum.

I feel like you recognize these factors also, by your caveats that you might be able to get "a little bit" higher than the surface temperature. Without committing to a number or methodology, your implication is that this "little" must be small relative to the surface temperature of the moon, rather than small relative to the optical temperature of the sun. While I agree that the surface temperature is related to the achievable temperature for collected reflected light, I still don't think that the exact temperature is a hard limit.

I'll try to bow out here and not take up more of your time. You've definitely helped me to think through the issues here. If you happen to be interested in a somewhat parallel situation, you might enjoy this article that describes a cyclical water collection system based on collecting solar energy through an aerogel during the day, then using a condenser optically coupled to the dark sky at night: https://www.nature.com/articles/s41467-018-03162-7. Only loosely related to Munroe's hypothetical, but shows a real application of some of the same concepts.

Edit: I just noticed a link on the second page of this thread that had some useful discussion that might interest you: https://physics.stackexchange.com/questions/370446/is-randal.... If you expand out all the comments on the answer, Shor seems to be making the same argument you are, and Lalinský is making a better version of mine.


Can I not reflect some sunlight off a mirror, then do the magnifying-glass-to-start-a-fire trick in daytime? Doesn't the mirror stay cool?

The mirror is a part of your hybrid reflective/refractive light concentration system. Of course it stays cool, otherwise your system is inefficient.

Doesn't the mirror stay cool? Isn't the moon just a (poor) mirror for the sun's light?

Precisely. So the maximum temperature that system can achieve is much less than your mirror/magnifying glass system's. A perfect black body radiator is just about the poorest mirror there is. The moon's light isn't what you'd get from a perfect black body. However, it's a long, long way from being a perfect mirror and probably far closer to being a black body.


> So the maximum temperature that system can achieve is much less than your mirror/magnifying glass system.

Why? If the moon were a perfect black body, but with a few small perfect mirrors on the surface to bounce a bit of sunlight towards Earth, couldn't we (in theory) focus that tiny bit of sunlight to heat something up to the temperature of the Sun? There may not be enough solar radiation to make it feasible in practice, but it seems like you need to actually do the math instead of just saying "the moon's surface temperature isn't high enough to start a fire, Q.E.D."


Why? If the moon were a perfect black body, but with a few small perfect mirrors on the surface to bounce a bit of sunlight towards Earth, couldn't we (in theory) focus that tiny bit of sunlight to heat something up to the temperature of the Sun?

To get past the diffraction limits, those "few small perfect mirrors" on the Moon would have to be gargantuan. (If I'm remembering correctly, they'd be the size of the alien space ships in Star Trek that make the Enterprise-D look teeny.) Otherwise, viewed from the surface of the Earth, those mirrors would just constitute a minuscule brightening of the sky in the direction of the Moon.
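The "gargantuan" claim is easy to ballpark (assumed round numbers: green light, the standard 1.22*lambda/D Rayleigh criterion, and a very generous 1 m lens on Earth):

```python
# Rough check of the diffraction argument: a lens of aperture D resolves
# angular features of about 1.22 * lambda / D. Anything on the moon smaller
# than the corresponding size blurs into the background instead of forming
# a distinct bright image. All numbers are assumed round values.

WAVELENGTH = 550e-9      # m, green light
MOON_DISTANCE = 3.844e8  # m
APERTURE = 1.0           # m, a very generous lens

min_angle = 1.22 * WAVELENGTH / APERTURE
min_mirror = min_angle * MOON_DISTANCE
print(f"smallest resolvable feature on the moon: {min_mirror:.0f} m")
```

So even a 1 m lens can't resolve anything on the moon smaller than a couple hundred meters across; smaller mirrors just raise the average brightness slightly, as the comment says.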


But you could instead cover the whole moon with some kind of dust that reflects 12% of incoming irradiation on average. You're arguing technical problems with a toy example illustrating an unrelated point.


Each of those specks of dust modeled as a tiny mirror would be very much subject to diffraction limits. This would basically make what you propose equivalent to an approximate black body that's like asphalt with 12% albedo.

Dandy.


The moon is about as reflective as asphalt.


So it's approximately a black body. Even asphalt has tiny bits of specular reflection coming off of it.


The other obvious question is: could we mold one of the moon craters into a gigantic parabola and coat it to be (much more) reflective? And have the moon shine a concentrated death ray down to earth? We'd obviously have trouble aiming - but it should make for a spectacular experiment....


This works because you're just concentrating the image of the sun. It's the surface temperature of the sun that matters. Now if, instead, the mirror were scattering or absorbing and re-radiating the light (as the moon does), then its temperature would be the limiting factor.


Now if, instead, the mirror were scattering or absorbing and re-radiating the light (as the moon does)

Are you saying that mirrors do something other than absorbing and re-radiating light? In most explanations I've seen, this is exactly what they do:

How does the mirror reflect light? The silver atoms behind the glass absorb the photons of incoming light energy and become excited. But that makes them unstable, so they try to become stable again by getting rid of the extra energy—and they do that by giving off some more photons. (You can read about how atoms take in and give out photons in our article about light.) The back of a mirror is usually covered with some sort of darkly colored, protective material to stop the silver coating from getting scratched, and also to reduce the risk of any light seeping through from behind. Silver reflects light better than almost anything else and that's because it gives off almost as many photons of light as fall on it in the first place. The photons that come out of the mirror are pretty much the same as the ones that go into it.

https://www.explainthatstuff.com/howmirrorswork.html


I haven't done material physics in depth, so can't be 100% certain of this, but I don't see how that can be true. Electromagnetic reflection is what results in either specular reflection or diffuse reflection (i.e. scattering), not absorbing and re-radiating. Which is why the color of reflected light is dependent on the color of incoming light, and not on the spectral frequencies of the material (unlike re-radiated light).


Wow... separately I had no idea the surface of the moon reaches (and goes above) 100°C... that's hot! Literally boiling hot. Turns out at night it goes down to almost –200°C. That's insane.

Quick searching on how the astronauts survived this: it turns out they timed landings to the lunar dawn for an in-between temperature, the lunar surface doesn't conduct heat well (all dust?), there's of course no atmosphere to conduct heat, and their boots were extremely well-insulated.


Also, since the moon is tidally locked to the earth, it rotates at the same rate it revolves, which means that sunrise and sunset only happen once a month, which makes dawn a pretty long time.

There's a pretty good song about how the moon's day/night cycle would affect a lunar mining colony: https://www.youtube.com/watch?v=GDPUdUGJpjc


Ah, that makes so much more sense then... two weeks of straight unfiltered sunlight is gonna heat things up... and two weeks of total darkness gives it time to freeze, freeze, and freeze some more.


Yup, lunar landings didn't last long enough to experience the full range of extreme lunar conditions.

This is one of the reasons why lunar bases will actually be more difficult than people think. Dealing with extreme environmental conditions is much easier when they are stable, then you can design around them and deal with them. Dealing with conditions that cycle from one extreme to another continuously over extended periods of time is much more challenging. One point of proof of that is the longevity of the Martian and lunar rovers and landers. Many landers and rovers on Mars have lasted for years, some have lasted over a decade. No lunar rover has been able to maintain roving operations longer than a few months. This despite the fact that lunar rovers are in near real-time contact with Earth continuously. The heat/cold cycles and hyper-abrasive clingy dust make the lunar environment particularly harsh on equipment.


Earth would have the same temperature extremes without the atmosphere to moderate them. Artificial satellites also have to deal with this; there are hundreds of degrees variation when the satellite is in sunlight vs the Earth's shadow. And even between the shadow side of the satellite itself vs the sunlit side.


Thermodynamics argument seems iffy to me. I can cover the entire surface of the earth with solar panels to harvest the moonlight and use the combined electricity to melt iron. If this is ok with thermodynamics then so is using lenses.

The rest of the argument seems to be that light cannot be optically condensed to a single point, as there will always be some dispersion due to diffraction, and the size and shape of the dispersion is dictated by that of the source. That is, you can't make the target denser than the source, hence the temperature must be less.

EDIT: Several commenters submitted that lenses are reversible while solar panels are not, and this makes all the difference. My retort is that I can make non-reversible lenses by covering them in a thin layer of dust. Since the lens system is now non-reversible, can I use these sub-par lenses to create a higher temperature than I could with clear lenses?


> I can cover the entire surface of the earth with solar panels to harvest the moonlight and use the combined electricity to melt iron. If this is ok with thermodynamics then so is using lenses.

There's a difference. By going through solar panels you would increase entropy because the conversion of sunlight to electricity can't be 100% efficient[1]. And if you allow the solar panels to heat up too much because of that waste heat, then that efficiency will drop further.

If a lens could heat a point to a temperature higher than the sun, then there'd be no such loss and you'd be magically decreasing entropy.

[1] https://en.wikipedia.org/wiki/Thermodynamic_efficiency_limit
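One way to put a number on "can't be 100% efficient" (a sketch; this uses the simple Carnot bound with assumed temperatures, which is looser than the photovoltaic limit in the link above):

```python
# Illustration of why the solar-panel route necessarily costs entropy:
# even an ideal heat engine driven by sunlight is bounded by the Carnot
# limit between the sun's surface temperature and ambient temperature.
# Both temperatures are assumed round values.

T_SUN = 5772.0     # K, solar surface temperature
T_AMBIENT = 300.0  # K, ambient temperature on Earth

carnot_limit = 1.0 - T_AMBIENT / T_SUN
print(f"Carnot bound on sunlight-to-work conversion: {carnot_limit:.1%}")
```

Real photovoltaics sit far below this bound (single-junction cells are limited to roughly a third), so the entropy increase that licenses "hotter than the source" is substantial.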


The region of higher temperature is much smaller. Entropy is not a pure measure of energy or randomness, but a measure of those within a volume. You'd have smaller entropy in a small volume, but higher entropy around it, because you would have diverted the rays that would have hit outside the concentrated region.

Where is this reasoning wrong?


Yeah, this is all rather confusing...

This boils down to a moderately heated black body receiving a large stream of moderately powered photons and either rejecting them or first absorbing and then radiating them away at the same pace without changing its own temperature, regardless of how many photons are coming in.


Yeah, I don't see how this argument holds water at all. If you have 10 kW worth of photons focused onto a square inch that absorbs 90% of those photons, then that's 9 kW of power that isn't just going to magically disappear, and it's not going to reach equilibrium until it's emitting 9 kW of power itself, which certainly means thousands of degrees. While the entropy argument is certainly an interesting thought experiment, all that means is that the global entropy must somehow still be increasing.
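The arithmetic here does check out, taking the comment's hypothetical 10 kW / 90% figures as given; by Stefan-Boltzmann, 9 kW radiated from one square inch requires a temperature in the thousands of kelvin:

```python
# Checking the comment's numbers (the 10 kW and 90% figures are the
# commenter's hypothetical): a surface in equilibrium must radiate away
# what it absorbs, and Stefan-Boltzmann gives the required temperature.

SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W/m^2/K^4
absorbed = 0.9 * 10_000.0  # W
area = 0.0254 ** 2         # one square inch in m^2

t_eq = (absorbed / (SIGMA * area)) ** 0.25
print(f"equilibrium temperature: {t_eq:.0f} K")  # ~3960 K
```

The catch, per the etendue argument elsewhere in the thread, is the premise: passive optics can't actually focus moonlight to anywhere near 10 kW onto a square inch in the first place.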


I was wondering about this too. After all, human beings have produced temperatures that were hotter than the sun's surface and almost all of the energy that we used to do that ultimately came from the sun (except for very small components that came from nuclear reactions on Earth and geothermal energy, which also comes from nuclear reactions inside Earth).

I think that this issue is addressed by footnote 2

> And, more specifically, everything [lenses and mirrors] do is fully reversible—which means you can add them in without increasing the entropy of the system.

Presumably we can't say the same of the electrical devices that we use to collect, store, and transmit sunlight, or to create high temperatures from this stored energy. For example, if you use a "lunar panel" to store energy in a battery and then heat something on an electric stove, many of the components in this process will not be reversible, differently from mirrors and lenses. (Putting a hot object on top of the stove won't cause the lunar panel to emit light back in the direction of the moon!)

So I think footnote 2 is actually very important, because it's not that we can never use any energy source to create something hotter than that source, it's that we can never do so using only reversible processes, including purely passive optics.


I can convert the optics to non-reversible by spreading a thin layer of dust on top. Now both systems are non-reversible and both allow higher temperatures? Or are we looking at a particular kind of non-reversibility?


I think the thermodynamics argument is that only non-reversible systems cause energy to flow from a lower-temperature source to a higher-temperature destination, not that all non-reversible systems do so. For example, if you set up a solar panel that directly feeds a resistor, you have a dissipative system that's unlikely to get hotter than the sun. However, a solar panel array and/or battery could conceivably power a fusion reaction that does get hotter than the sun.


>if you set up a solar panel that directly feeds a resistor, you have a dissipative system that's unlikely to get hotter than the sun.

I guarantee you this can be done. Arc welding [1] goes to many thousands of degrees, up to 20,000 C. A square mile of solar panels will most certainly give me enough energy to power an arc welder. In fact, I will probably get away with just 100 kW of power, so 500 panels, give or take.

[1] https://hypertextbook.com/facts/2003/EstherDorzin.shtml


Okay, sure, as long as you're not trying to say this contradicts the post. "A resistor" was just an example of how you could waste the power and not get much heat.


Covering the lens with dust would let you light a fire. The dust will warm slightly above ambient temperature in the lens' shadow, so you can use it to power a Carnot engine that winds up a spring that is then released to rub a twig against a log.


This is genius.


Wouldn't this layer of dust also stop the lens from working normally? If I understand the other commenters here, it's not just about reversibility, but "useful energy" loss - with lenses you are not paying the needed "efficiency tax" to achieve higher temperatures than the source.


What the thermodynamic principle says you can't do is to take Q units of heat energy from one body and transfer it all to a higher temperature body.

The solar panel system is different - it's a heat engine that takes Q units of heat energy from one body, transfers a portion of it Q₁ to a higher temperature body and transfers the remainder of it (Q - Q₁) to a lower temperature body (the immediate surroundings). That's OK with thermodynamics, subject to upper bounds on the ratio Q₁/Q that depend on the absolute temperatures of the bodies involved.

This suffices to show that the perfect lens system can't work. The dusty lens system can't work because adding dust to a perfect lens doesn't turn it into a heat engine.
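The upper bound on Q₁/Q mentioned above can be sketched numerically. This assumes the ideal reversible case: a Carnot engine between the source and the surroundings driving a Carnot heat pump up to the target (the 5800 K / 300 K / 10,000 K temperatures are illustrative, not from the thread):

```python
# Idealized bound: a Carnot engine between the source (T_s) and the
# surroundings (T_a) produces work, which drives a Carnot heat pump
# lifting heat up to the hotter target (T_t). These are reversible
# limits, not real device numbers.

def max_fraction_delivered(T_s, T_a, T_t):
    """Upper bound on Q1/Q for heat moved from T_s to a hotter T_t."""
    work_per_Q = 1.0 - T_a / T_s      # Carnot engine efficiency
    cop_heating = T_t / (T_t - T_a)   # Carnot heat-pump COP (heating mode)
    return work_per_Q * cop_heating

# e.g. a solar-temperature source and a target hotter than the sun:
print(max_fraction_delivered(T_s=5800, T_a=300, T_t=10_000))  # ~0.98
```

Note the bound is below 1: some heat must be dumped to the surroundings, which is exactly the difference from the (impossible) pure-lens case.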


> you can't do is to take Q units of heat energy from one body and transfer it all to a higher temperature body

Except it's not "heat energy", it's electromagnetic energy that's getting transferred. If the target object is smaller than the source object, you can have an amount of energy that will increase the target's heat above that of the source.


> Except it's not "heat energy", it's electromagnetic energy that's getting transferred.

Thermal radiation == electromagnetic radiation.


I don't think it could remain above the source temperature in equilibrium, because it will radiate back toward the source!


It's impossible to aim the source at a smaller target because the source dissipates in all directions, unless you make something like a laser.


>If this is ok with thermodynamics then so is using lenses.

It's okay because solar panels are not perfectly efficient while lenses are, in theory, perfectly efficient. That efficiency loss is in essence the "cost" of moving heat from a colder to a hotter place. I'm guessing, but don't quote me on it, that solar panel efficiency is related to the temperature of the sun and the local environment just like any other heat engine.


> solar panel efficiency is related to the temperature of the sun and the local environment just like any other heat engine

You are correct, https://en.wikipedia.org/wiki/Solar_cell_efficiency#Thermody...


You are correct about the solar panels. Solar panels are more efficient when they are colder, i.e. the temperature delta between the sun and the panel is larger.
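A rough numeric illustration of that temperature dependence, using the simplest Carnot-style ceiling 1 - T_cell/T_sun (the limits in the linked article are tighter; the cell temperatures below are illustrative):

```python
# The crudest thermodynamic ceiling on converting sunlight to work
# scales as 1 - T_cell / T_sun: a hotter panel has a lower ceiling.

T_sun = 5778.0  # K, effective solar surface temperature

for T_cell in (300.0, 400.0, 500.0):
    limit = 1.0 - T_cell / T_sun
    print(f"cell at {T_cell:.0f} K -> efficiency ceiling {limit:.1%}")
# output shows the ceiling dropping as the cell warms up
```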


The difference is that a lens is a reversible system, whereas using energy from solar panels to melt iron is not. If you don't have a reversible system it is perfectly possible to take energy from a cold object and make a hot object hotter, but you cannot do this reversibly.


Both these thermodynamic arguments look good, I don't see any contradiction though

(1) can't heat an object to a higher temperature than a source object using lenses - you can't transfer heat from a cooler object to a warmer object without work and lenses don't provide a mechanism for work. This still leaves room for reflected sunlight from moon albedo getting something hot enough to burn, at least from a thermodynamic perspective.

(2) you can get something arbitrarily hot using lunar panels, this is just a bad heat pump (max COP 1) where you feed work generated by the solar panels into a perfectly thermally isolated system with a resistive heater, and we could do this a bit more efficiently with a true heat pump.


The entropy argument is a red herring for your system (and wrong). The difference is that you would have to store your energy from the solar panel for a time t1 and then expend it in a time t2 much less than t1 to achieve your desired effect. The system in question doesn't allow storing the energy for later, quicker dispersal.


No, I do not store any energy. My solar panels' outputs are wired directly to the input of an arc welder. Powered by sunlight (surface temperature 6,000 C), this system can easily give me an arc of 10,000 C.


Reversible means you don't increase the entropy of the system. If you relax this restriction, you can do useful work, like powering an arc welder or growing carrots to feed yourself and rubbing sticks together to start a fire. Merely making the lens non-reversible doesn't mean it will suddenly set things on fire.


How does it work in practice?

I am dumping some amount of low-energy photons onto a target; the target is heated up and radiates out the same amount of energy it receives. As I increase the number of photons hitting the target, its temperature rises and it radiates more heat outwards, shedding excess heat.

Then as I keep increasing the number of photons, the temperature stops increasing at some point, having reached the temperature of the photon source. I increase the number of photons to a million times the previous level, yet the object remains the same temperature.

Where does the excess energy go? Is it not absorbed by the target? Is it reflected? Is it re-radiated?


So you have a blackbody source, and a target.

Initially it's in a steady state where the target is colder.

You increase the number of photons hitting the target. The target gets hotter. All good so far.

But we have to look at how you increased that number. You need to either move the source and target closer together, or you have to use lenses to simulate the same thing.

You can keep hitting the target with photons from more and more directions, but no system of mirrors or lenses can increase the apparent brightness of the source.

As you approach the point where the source fills the entire sky, the target will approach the same temperature.

And then you can't increase the number of photons any more. Not without using a hotter source.


The point is, you can never get to the point where you are dumping more photons in to make it hotter, because there are not more photons.

There is a limit to how small you can focus a lens. Once the object you are focusing is in sharp focus, it will start to go out of focus again if you go past that point. So you can’t just arbitrarily increase the photon density further.

If you make your lens bigger to gather more photons, then the focused image just grows by a corresponding amount.

Optics (and thermodynamics) tell us that we can not generate a higher flux of photons into a target area, than left the equivalent area at the source. Assuming that both the source and target are black body radiators, that says the target can’t get hotter than the source.
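This limit can be put into numbers for moonlight specifically. A sketch, with assumed rough figures: the moon's angular radius is about 0.26 degrees, full-moon irradiance is about 0.003 W/m², and the ideal concentration bound for an extended source viewed under half-angle θ (concentrator in air) is 1/sin²θ:

```python
import math

# Etendue conservation caps concentration at C_max = 1 / sin^2(theta)
# for a source subtending half-angle theta (ideal concentrator in air).
# At that limit, the target "sees" moon-brightness over its whole sky.

SIGMA = 5.670e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)
theta = math.radians(0.26)   # assumed half-angle subtended by the moon
E_moon = 0.003               # W/m^2, assumed full-moon irradiance

C_max = 1.0 / math.sin(theta) ** 2   # maximum concentration factor
flux_max = E_moon * C_max            # best achievable flux at the focus
T_eq = (flux_max / SIGMA) ** 0.25    # blackbody equilibrium temperature

# T_eq comes out well below room temperature - nowhere near ignition.
print(f"C_max ~ {C_max:.0f}, flux ~ {flux_max:.0f} W/m^2, T ~ {T_eq:.0f} K")
```

With these inputs even a perfect concentrator only sustains an equilibrium temperature of a couple hundred kelvin, which is the quantitative version of "the target can't get hotter than the source".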


Consider this:

Say a light source has a T temperature resulting in X photons emitted. I redirect all the photons to a single point. I see arguments mentioning that that single point cannot be hotter than the source because there's no more photons to make it hotter.

I now add a second light source of the same T temperature that emits the same amount of photons, and also focus all of them on that same point. I now have more photons, but the source temperature of all the photons is the same. How does adding more photons not make my point hotter?


> I redirect all the photons to a single point.

For that to be possible with a blackbody light source, it has to itself be a single point. Which means a temperature of approximately infinity.

For a real light source, one that has an area, you can at best focus it down to the same area. To get maximum light to a target, you either have to make the target almost touch the source, or you have to have it so no matter what direction you go from the target, you hit a lens that's bringing light from the source. Once you set either one of those up, there's nowhere to fit a new light source.


> For a real light source, one that has an area, you can at best focus it down to the same area.

I'm not an optics expert, but this can't be true. You can clearly focus light emitting from an area into a smaller area (although probably not to an infinitesimally small area).


You can focus 1/100 of the light onto 1/100 the area. Or 1/1000 of the light onto 1/100 the area.

You can't increase the density of the light. You can't focus all of it onto a smaller area.

Is that clearer?


In order to truly move all of the photons from the emitter to the target, you need to effectively surround the target with lenses, such that it looks from the surface of the target as if the emitter is filling the sky.

If that is the case, then there is no extra room to fit the optics to focus another source onto the target.

If you shrink the optics enough to make room for another source, then you also aren’t delivering all of the photons from the original source (you can’t be, since you aren’t covering all incoming angles), and therefore it’s not the same temperature as the source.


Well I think the thermodynamic analysis is useful in understanding things, that was my point. I’m not quite understanding the scenario you describe - are you asking how it is possible that you can’t heat up the target more than the origin? If so then take an example where the sun is a single point at 5000 degrees. It is easy to see that the most you could do with this point sun is heat up one point on your object to the same temperature, no more. Now what if there were two point suns next to each other? The discussion on xkcd about optics is saying you can’t superimpose the points on each other on the target under any circumstances, even with two separate lenses at the right angle. This doesn’t seem like it should be impossible though...

What about two single photon sources, can’t they be pointed at exactly the same spot? Maybe the explanation here is that the target electron cannot interact with 2 photons at the same time, so you can’t ‘double heat’ a single particle. Or maybe that you can’t precisely target a single particle without decreasing the entropy of a closed system, which is impossible.


The interesting thing to me is that Randall describes a very similar system just a few What-Ifs earlier: https://what-if.xkcd.com/141/

Here he says very clearly that if you "bundled" all the light from the sun and aimed it at the earth, it would heat the atmosphere to millions of degrees (the surface of the sun is much cooler than that). It's not at all clear to me what he means by "bundled" and why it's not contradictory to what he says in the article here. Presumably some kind of lens / mirror system could be used?

It seems to me that in this article he has in mind some highly abstract system that's fully reversible. Of course, in that case, once the target object gets hot enough it will start emitting light and result in equilibrium. But it's not clear to me that this describes what would actually happen with a real optical system! E.g. (a) much of the light the target receives is going to be absorbed and reemitted away from the lens, (b) what if you removed the lens targeting system at the precise moment the light impacted the target, so that the system couldn't be reversed, etc.

Edit: one more thing. The surface of the lit side of the moon can reach 260 degrees F, and dry wood can potentially catch fire as low as 300 degrees F. And the moon has some reflectivity as well. So even taking Randall's claims on their face, I'm skeptical that you could not start a fire (in some materials at least) using moonlight.


Yeah, it does contradict that. You can't really "bundle" all the light from the sun like that and aim it at the Earth without violating thermodynamics. The way to do it would be something like wrapping the sun and the Earth together in a giant chamber made out of some perfect mirror (actually maybe the mirror isn't necessary), with the only exit being the dark side of the Earth. And that would heat up both the Earth and the Sun to millions of degrees.


Aha. I think you've helped me see the central issue with your point that it would also heat the sun to millions of degrees. Suppose there was some system that allowed you to collect the light of some source and dump it somewhere else: the drawing in the What-If seems to suggest a system of perfect mirrors with an "output tube" pointed at the earth, so that any photons leaving the tube hit the earth with a high degree of accuracy. What your point shows us is that even if we imagine an indestructible system of mirrors and lenses to do this, the result is that bottling up the energy that the source normally radiates away massively increases the temperature of the source (turning your mirrors into plasma and returning the system to normal, but we're ignoring that again). So entropy law isn't violated.

The reason I (and probably others) find Randall's explanation unhelpful is that obviously there's "enough" energy being reflected by the moon to start a fire (that's why people keep bringing up solar panels). The issue is that there's no way to optically redirect that energy into a small area without heating up your source to the same degree. Which is theoretically possible I suppose, but it's not the situation the What-If is talking about. Along with the issue that the light we see from the moon is mostly reflected rather than emitted (which changed the situation entirely), this makes the What-If explanation a little misleading.


Solar panels lose energy to heat in this case, so entropy does not decrease.


I was wrong about what I wrote here and am removing it.


>The analogous problem for you would be that you couldn't maintain your electric heat ray (or whatever) for a time equal to or greater than the time you spent collecting.

Yes, yes I can. 1000 solar panels (200 watts each) will power an electric arc welder up to 20,000 C continuously for as long as the sun shines in the sky.

There is nothing here that collects energy across time, all energy is immediately imparted onto the electric arc.


Yeah, maybe it was a red herring (a black-body herring?) to bring up batteries here.

The solar panel itself is already non-reversible, and so is the arc welder, right? That seems to be the important difference from the lens.

In that sense, we don't run either of them "for free" in the way that the xkcd piece describes.


> I can cover the entire surface of the earth with solar panels to harvest the moon light

Are you sure about that? First off, take an off-the-shelf solar panel, point it at the moon in the middle of the night. You get a grand total of nothing. Okay, it was a cheap panel, that might not generalize to anything.

But more importantly, by the argument laid out in the article, your solar panels cannot work in moon light. (Or maybe they work at horrible efficiency, because the moon is a bit warmer than the earth.) I'm not sure I buy that argument; maybe you should run the experiment.


There is no need for an experiment, the moonlight does produce energy in a solar panel. Obviously the power produced will be much much lower than with the sun. I remember that as a child I was able to use a solar calculator indoor with just the power gathered from an incandescent light.


>(Or maybe they work at horrible efficiency, because the moon is a bit warmer than the earth.)

Yes, this is what would happen.


I'll repeat for the reflexive downvoters:

Solar panels will work in moonlight if and only if you can make fire from moonlight with a magnifying glass.

The answer depends on how good a mirror the moon is. It calls for a real experiment, not a thought experiment. I don't really know which way it will go.


You're not addressing the essential part of my comment. Let me clarify:

I put in place 1000 solar panels and aim them at the sun, thus procuring 200 kW of power. Next I use it to power an arc welder, producing 10,000 C of heat. Therefore I have used a 6,000 C sun to produce 10,000 C on earth.

Would a similar setup work with the moon? I don't know. That was not my point, my point was that it certainly is possible to produce higher temperature at the target than it was at source.
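Spelling out the arithmetic in the setup above (the 200 W panel rating is the commenter's assumption):

```python
# Total array power from the assumed panel count and rating.
panels = 1000
watts_each = 200          # assumed per-panel rating
total_kw = panels * watts_each / 1000
print(total_kw)           # 200.0, i.e. 200 kW - arc-welder territory
```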


Munroe's explanation only applies to optical systems, i.e. reflecting and focusing light, which is a thermodynamically reversible process (yes, even with dust. Dust doesn't make the lens thermodynamically irreversible, it just scatters and absorbs some of the light so it doesn't reach the lens. The reversibility is a property of the lens itself). Solar panels are converting light energy into electrical energy, which is an irreversible process - it creates entropy, and therefore the rules are different.


Sure it is, just not in real time. The lens fails because there is no storage mechanism.


So thinking about this some more, the argument here is that if you use “just” a magnifying glass of arbitrary size and shape you can’t do this. I can buy that based on the arguments presented. Basically it says that the moon emits (really reflects but that’s immaterial) F photons per second per square meter, and while you can concentrate that into a very small area, all of those photons will not be enough to raise the temperature (that is input enough energy into the system) to a sufficiently high level. This is partially because you can’t make a small enough point with a single lens, and partially because there just aren’t enough photons. The sun doesn’t care because it has such a high flux that the concentration ends up being high enough for the area.

The area argument is more important here because the lens cannot increase the number of photons per second, but it can decrease the area. If F = N / (t * A) where N is the number of photons, t is time, and A is area, the lens can change the area, but not to 0. And if you need a sufficiently high F to get to the right temperature, the only way to get there with limited N is to bring A sufficiently close to 0.

If you have multiple magnifying glasses and mirrors I am fairly certain that you can. That is the equivalent of using a set of solar panels that power a laser. But that was not what was postulated in the original thought experiment, so it does not apply.

I am still fuzzy on the thermodynamic argument, but I was never good at intuiting thermodynamics. The argument presented is that if you have one body at 100 degrees C, and you put another body next to it, you cannot make the second body hotter than the first. That makes sense. But if the first body is constantly generating and transferring heat to the second with at most 100 degrees C temperature, and the second body has some way to store heat energy, then it is possible to heat a local area of the second body to higher than 100 degrees C. The storage of energy here is what I think counts.


The thing is, radiative processes (like light) can’t store energy. To do that, you need some kind of engine, either electrical or mechanical.

Of course you could put a solar panel connected to a battery in moonlight for a few months and build up enough stored energy to power a laser for long enough to fry something. But that’s not “burning something with moonlight using a magnifying glass.”


Right. That’s what I am saying. Using the original setup of the problem, you cannot use a device to store energy.


Oh god, can we please get a real physicist in here? This entire thread is a mess of computer programmers “well actually”ing other computer programmers and everyone being wrong.


My thoughts exactly. Know the limits of your knowledge, folks. You’re doing the physics equivalent of arguing that you can write a lossless compression algorithm that always reduces the size of the input no matter what it is, or that you can write a comparison-based sort algorithm that’s better than nlogn if you’re sufficiently clever.


The linked article is correct and the arguments are well known and accepted in the physics community and have been for a long time (much longer than Randall Munroe has been alive). A lot of the counter-arguments/speculation here in the comments is wrong.

Reading this is equivalent to reading a thread on a physics forum with people arguing about an article saying that O(n log n) is really the best possible runtime complexity for a comparison-based sorting algorithm, and trying to disprove it.

It's also worth pointing out that Randall Munroe is a physicist (to the extent an undergrad degree counts anyway).


The discussion here doesn't really bother me. I feel like people are making a good faith effort to understand and are asking questions that help develop better intuition. And this is a good example that helps develop an understanding of thermodynamics. (Another favorite of mine is "why can't you have stealth in space?")

I feel much more frustration when arguing about impractical engineering proposals (e.g. solar roadways, waterseer, hyperloop, or the ocean cleanup project), since people seem to have a much more biased drive to believe in their feasibility, and can't really be reasoned with.


The linked article is correct and the arguments are well known and accepted in the physics community and have been for a long time (much longer than Randall Munroe has been alive).

It's interesting what bothers different people. While many of the statements in this thread are probably wrong, not many of them bother me. But I find the lack-of-self-doubt and appeal-to-authority in your message to be genuinely offensive. Where does your certainty come from?

With that out of the way, could you give some links to well known arguments that you refer to? Specifically, I feel certain that one can start a fire with sunlight reflected from a room temperature mirror, and don't understand the difference between a mirror and the moon within Munroe's argument.

His conclusion might be correct (in practice, you may not be able to concentrate moonlight enough to start a fire) but I don't think the details of his argument can be. I currently don't believe that the temperature of the reflecting surface can be the limiting factor, and I think this is central to his argument.



>Specifically, I feel certain that one can start a fire with sunlight reflected from a room temperature mirror, and don't understand the difference between a mirror and the moon within Munroe's argument.

A mirror does specular reflection and thus conserves the etendue of the sunlight. You're concentrating the image of the sun in the mirror, not light from the mirror itself.

The moon in contrast is mostly a diffuse reflector - it scatters most of the light that falls on it (and absorbs and re-emits most of the rest), so it is effectively a new light source.


Yes, I can agree with that. The problem for me is fitting it into Munroe's argument. He says (in boldface): "You can't use lenses and mirrors to make something hotter than the surface of the light source itself".

This is true for a black body, but is it always true for an object being illuminated by another? I don't know that we can consider a diffuse reflector with a surface temperature of 100 C as being equivalent to a black body with a temperature of 100 C. I think his conclusion is likely true (the moon is too dim to start a fire even with a really big magnifying glass) but I don't think he's right to point to the surface temperature of the moon as evidence of this conclusion.

Assume the sun was much brighter, so that ignition on earth is possible with a sufficiently large magnifier. Presumably if the moon was the same, this would mean the surface temperature of the moon was much higher. Now change the moon to be more heat conductive (causing the surface temperature to drop due to more heat loss on the dark side), and more reflective (causing the surface temperature to drop further due to less absorption). I'd guess that if you tweak the parameters sufficiently, you could end up with a surface temperature low enough that Munroe's argument would say that ignition is impossible, even though we've increased the intensity of the moonlight over our baseline.

How does Munroe know that we aren't in that second regime? I don't think there's enough information in his argument to distinguish. Alternatively stated, we know that there is some current temperature to which we can heat an object using concentrated moonlight. We also know that if we can change the shape and composition of the moon, we can reduce the surface temperature without reducing the intensity of moonlight. Unless there is some limit to the effectiveness of the heatsink that we can put on the moon, I think this means there is some possible arrangement that violates the assumption that the surface temperature must always exceed the temperature achievable with a magnifying glass.

(Thanks for helping me to puzzle this out)


An excellent point!

>Now change the moon to be more heat conductive (causing the surface temperature to drop due to more heat loss on the dark side),

Yeah, although this isn't too big in the case of the moon (unlike the Earth, it doesn't have an atmosphere and doesn't rotate rapidly, so there isn't too much redistribution of heat across its surface), it is definitely something that would confound the calculations. We'd still be able to just look at the effective temperature of light falling onto the moon and that would limit the temperature that we could light the object up to. But we wouldn't be able to use a direct measurement of the temperature of the surface of the moon.

> and more reflective (causing the surface temperature to drop further due to less absorption).

To the extent that it's a gray body (and most objects are approximately gray bodies), this wouldn't actually lower the temperature. Absorptivity < 1 causes it to absorb less energy from the light, but for a gray body emissivity equals absorptivity, so it also radiates out less light, and you actually end up reaching the same equilibrium temperature as a fully absorptive black body.
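A quick numeric check of that cancellation (the incident flux and the absorptivity values below are illustrative, not measurements of the moon):

```python
# Gray-body energy balance: alpha * S = eps * sigma * T^4.
# With the gray-body assumption eps == alpha, the common factor cancels
# and the equilibrium temperature is independent of its value.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0         # W/m^2, illustrative incident flux

for alpha in (1.0, 0.5, 0.1):
    eps = alpha                                  # gray-body assumption
    T_eq = (alpha * S / (eps * SIGMA)) ** 0.25   # alpha cancels here
    print(f"alpha = eps = {alpha}: T_eq = {T_eq:.1f} K")
# all three lines print the same temperature
```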


Thanks for the kind response. I now see that the final average core temperature should remain the same, but I'm less sure about the surface temperature. There are a lot of complex processes involved, and I'm not sure that one can conclude that everything cancels out to keep the answer constant.

I respond late to offer a link (that's hidden on the second page of this thread) that I think presents that argument I was trying to make better than I managed to: https://physics.stackexchange.com/questions/370446/is-randal.... I thought the comments on the answer were a helpful reframing of the problem.


"Appeal to authority" is a fallacious rhetorical tactic only in an extraordinarily rigorous epistemological context, in which all truths need to be established from first principles. In practice, especially given the information asymmetry between the people that have devoted rigorous study to a thing and the people that think they know about it because making computers do things means they're smart (which is an intellectual pathology that I think ought to bother anyone who cares about the process of modeling physical phenomena accurately and honestly, even if it doesn't bother you), an argument like "physicists uniformly agree on this physical property" is a perfectly adequate assertion in favor of the claims made in the article. It would for sure be bolstered by the inclusion of sources, but you have to admit that you're throwing away a lot of relevant information towards the end of supporting your intuitive sense about how the optics here work. It's not impossible, in the absence of an argument from first principles to counter, that you're correct and that people who have formally studied physics and/or are familiar with the literature are wrong or misguided, but it sure isn't the most likely case by a wide margin. In that context, your certainty ("I don't think the details of his argument can be [correct]") seems much less well-founded.


I agree on most of your points. There is nothing inherently wrong with an appeal-to-authority in this case. If any substantial number of "reliable" physicists have looked at this problem, and if they unanimously agree that there is no way to concentrate light reflected from an object to produce a temperature greater than that object, then my intuition is almost definitely wrong and their conclusion is almost definitely right.

The nice part about physics though (like Munroe, I was also an undergraduate physics major) is that in simple cases like this, a suitable expert can usually defend their position with an argument comprehensible to a nonspecialist outsider. My doubt in this case is not that the experts are wrong, but that the experts haven't actually looked at the details of Munroe's argument and stamped it as "approved".

I think the part I find "offensive" is that CydeWeys is not claiming to be an expert himself, but is claiming to have certainty in what the experts believe. I don't know exactly why I find this offensive, but I do. And yes, this may be a problem with me, and not with CydeWeys' argument. I would not be offended in the same way by someone claiming "I am an expert and I approve this argument". Still, my question to him is genuine: what gives him this certainty?


I'm a real physicist. But I've learned to avoid these debates because I'm not a good enough debater to avoid getting tied up in logical knots.

My version of the argument involves turning it into a perpetual motion machine.

On the other hand, this rule is incredibly useful in modeling energy transfer through optical systems, seeing what things are possible and what things aren't without having to make detailed calculations. It's a real world rule with useful applications.

Instead, I'll just spout nonsense about programming. ;-)


The article is basically correct but doesn't fully explain all the concepts it touches.

So you should really be asking for a communications expert, not a physicist.


It's just like the old USENET days. It's also how Slashdot used to work. Enough of us programmers would spout enough nonsense to make a real expert angry enough to inform the heck out of us.


Real physicist here. The linked article is correct.


Physicist here.

He's trying to explain everything in terms of a simple blackbody in thermal equilibrium, peacefully radiating its energy away only via thermal photons. That's not the reality of the radiation from the sun or moon. Solar physics is an entire branch of physics, and such simple toy models are not even wrong.

The sun doesn't just radiate away its existing energy via thermal photons.

First, it keeps burning its fuel via a series of nuclear reactions, which by the way keeps pumping energy into the system, essentially acting like a battery (so there's no perpetual motion here).

Second, the sun emits photons that are much more energetic than the thermal photons from the surface. Some of the radiation is not thermal, and comes directly from different types of nuclear reactions (which provide signatures of the kinds of reactions happening in the sun) and various other processes.



So what about non-linear optics and optical frequency doubling?

https://en.wikipedia.org/wiki/Second-harmonic_generation

It was just an option on my MSc in LASERs but I thought it was cool (with the potential to be very warm).

Although my frequency has been halved and I've been working in software for decades


I thought of that as well. Once you have a different frequency you can also do more bizarre fun things. http://optics.org/news/7/1/11


I don’t buy the thermodynamic argument. Here’s a version I would believe: if you have a gadget that, exposed only to the sun and to empty space, heats some target hotter than the sun, then that gadget must not work if you take away the empty space part. This is because your gadget could be used to drive a heat engine, which is impossible without a temperature difference, and the sun is more or less a blackbody emitter. Lenses and mirrors aren’t magically taking advantage of the cold parts of the sky, so there you go.

But the moon is not a blackbody, and I think the whole argument falls apart. Here’s a thought experiment: go stand on the moon, and assume the moon is made of rock that diffusely reflects, say, half of the incident 500nm light. Stand somewhere that’s in shadow, so you can’t see the sun. Wrap a piece of paper and some air in perfectly insulating, perfectly reflecting material, except that the material lets 100% of 499-501nm light through, but only on the moon side. The target will be in a bath of 499-501nm light at 1/2 the intensity (energy density per unit volume) of the sun, which is far more than half the temperature of the sun. It’ll catch fire after a while.

Now do the same experiment on the Earth, at night, with lenses to bathe it in moonlight from all sides. Fire! So I claim that lenses+mirrors+filters can start a fire with moonlight.

Another interesting question: can you use a luminescent solar concentrator or other fluorescent material to pull this off without taking such egregious advantage of the spectrum of moonlight? These types of materials can violate conservation of étendue.


> The target will be in a bath of 499-501nm light at 1/2 the intensity (energy density per unit volume) of the sun, which is far more than half the temperature of the sun. It’ll catch fire after a while.

Unconcentrated sunlight is slightly under 1400 watts per square meter. It's equivalent to a temperature of 122C.

You can concentrate sunlight coming from the sun, because the sun only fills five millionths of the sky. With a simple lens you can focus hundreds of megawatts per square meter onto a surface.

But once you bounce that light off a diffuse surface, whatever concentration you had becomes the new maximum.

In your experiment, bathing something in moonlight would max out at 700 watts per square meter.

700 watts per square meter doesn't set things on fire. It can only heat a blackbody to 60 degrees C.

Even the full brunt of unaltered sunlight can only bring a blackbody up to 122C.
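The temperatures quoted here follow from the Stefan-Boltzmann law: a blackbody in equilibrium radiates as much as it absorbs. A quick sanity check in Python (a sketch; the flux figures are the round numbers from this comment, and rounding makes the 1400 W/m^2 case come out at ~123C rather than 122C):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrium_temp_c(flux):
    """Temperature (Celsius) at which a blackbody radiates `flux` W/m^2."""
    kelvin = (flux / SIGMA) ** 0.25
    return kelvin - 273.15

print(round(equilibrium_temp_c(1400)))  # ~123, unconcentrated sunlight
print(round(equilibrium_temp_c(700)))   # ~60, the moonlight-bath scenario
```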

-

Treating the moon as a blackbody or not doesn't actually change the equations. The important property is that it diffuses light. It resets your maximum concentration of light, because light that comes evenly from every direction can't be concentrated.

(I'm ignoring the part about wavelength filtering because it's confusing and would only make your piece of paper heat up less.)


> 700 watts per square meter doesn't set things on fire. It can only heat a blackbody to 60 degrees C.

> (I'm ignoring the part about wavelength filtering because it's confusing and would only make your piece of paper heat up less.)

I’m afraid you’re ignoring the interesting bit. You say that 700 W/m^2 doesn’t set things on fire. This is not true. Sure, 700 W/m^2 applied to some target that is allowed to radiate its own blackbody light out to the sky won’t get it very hot, but that’s not what I’m suggesting. I’m suggesting that you insulate the target very well so that its blackbody emissions don’t escape, but you let in the short-wavelength moonlight. Thermodynamics requires that you also let out the short wavelength blackbody emissions, but those are negligible until the target gets very, very hot.

This effect isn’t science fiction — it’s just the greenhouse effect, amplified. Greenhouses (the glass ones and the atmospheric ones) exploit the fact that sunlight doesn’t match the Earth’s blackbody spectrum, so a filter (glass or gaseous) can allow incoming radiation in but trap most outgoing radiation.

In effect, I’m suggesting that a very good greenhouse plus some lenses could get hot enough to start a fire.


I see. I couldn't quite follow which parts you were saying to insulate, and your mention of "half the temperature of the sun" led me to misinterpret. That's fine then, I think. Not an expert on wavelength filters.


The article's reasoning can be shortened to the observation that a passive optical system does not change the wavelength of photons, and that to trigger a fire the wavelength has to be short enough.

But the conclusion of the article is wrong. The surface temperature of the Moon has very little to do with the wavelength of the reflected photons.

Consider a surface covered with ideal tiny mirrors, each pointing in a random direction. Only a tiny proportion of these mirrors will reflect light from the Sun towards the observer. Now consider that 90% of those mirrors are painted black, reducing the reflected energy flux by a further factor of ten. The Moon is like that.

Still, the reflected light has the original wavelength of the light of the Sun. Collect enough of it and it can trigger a fire.


That's incorrect. It has nothing to do with the wavelength of the light.

> Still the reflected light has original wavelength of the light of the Sun. Collect enough of it and that triggers fire.

You can't collect enough of it into one place with a passive optical system, because it's been irreversibly scattered by the moon's surface (Read: irreversible increase of https://en.wikipedia.org/wiki/Etendue).


Yes, I stand corrected.

Essentially an optical system will bring the Moon's surface closer, but even if it brings the surface within 1 cm of the wood, the diffuse sunlight scattered from that surface is not enough to ignite a fire.


Well, this doesn't look right to me. Imagine that the moon is actually a filter in the path of the sunlight.

The Sun's surface temperature is about 6000 K. The Moon's surface is pretty dark: it reflects only about 12% of the light.

So, the effective temperature of the Sun reflected by the Moon, considering that thermal radiation is proportional to T^4, is 6000 * .12 ^ (1/4) ~= 3500 K. That's quite enough to light up some fire! Of course, the spectral composition of the light will not be thermal etc., but the estimate should be close enough.
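That estimate is easy to reproduce numerically (a back-of-the-envelope sketch using the round 6000 K and 12% figures from above; whether albedo can be treated this way is exactly what the rest of the thread disputes):

```python
T_SUN = 6000.0   # K, rough solar surface temperature
ALBEDO = 0.12    # fraction of sunlight the Moon reflects

# Radiated flux scales as T^4, so scaling the flux by the albedo
# scales the effective temperature by albedo^(1/4).
t_eff = T_SUN * ALBEDO ** 0.25
print(round(t_eff))  # ~3530 K
```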

Why doesn't the Moon itself heat up like that? Well, the rocks on Earth don't heat up to 6000 K either... I think it's partly that they are "not surrounded by sun," and partly that the Moon is a giant cold heatsink.


I'm a bit skeptical of the conclusion and the reasoning process used to arrive there.

For example, the article states "In other words, all a lens system can do is make every line of sight end on the surface of a light source, which is equivalent to making the light source surround the target."

If you forget about optics for a second, imagine that the outer surface of the sun were wrapped around a point (think of the image shown in the article). If you consider conservation of energy for the energy flux from the surface of the sun being entirely directed to a single body of matter that absorbs this heat (assume it's a penny), the steady-state blackbody emission of the penny would have to equal the energy flux from the entire surface of the sun. I think this situation would end up making the 'temperature' of the penny much higher than the surface of the sun for the same reason that the center of the sun is hotter than the surface: There is energy expended, and it comes from the fusion of light elements inside the sun, so there is no violation of entropy as stated in the article: "you'd be making heat flow from a colder place to a hotter place without expending energy."


That doesn't work because when the penny is hotter than the sun it radiates an equal amount of energy back to the sun.

The sun is a blackbody, it radiates light because it's hot. Once the penny is the same temperature it will radiate back at the sun until the two are in equilibrium.


If the size of the penny were reduced to just a few atoms, wouldn't changing the number of atoms in the penny affect the rate at which the penny itself could emit radiation through blackbody emission? If so, I would expect there would be a size at which the penny could be small enough that its temperature could grow arbitrarily large once the heat flux in = heat flux out from the penny.


I don't know very much thermodynamics, but I'm not sure that thermodynamic temperature is well-defined for a system consisting of "just a few atoms".


A penny at a temperature high enough to radiate the same amount of energy as the entire surface of the sun, would be at a far higher temperature than the sun.


If you surround the penny with sun with no gaps, all the light hits the penny.

If you add more sun, you add gaps, and the amount of light hitting the penny stays constant.

It never has more than one penny-surface-area of light hitting it, so it never has to get hotter than the sun.


Thanks, this is what allowed it to click for me.


I think the answer here is that you can't do this. I don't think you can build such a system to redirect all the light from the sun to a surface with a smaller area than the sun because of "conservation of étendue".


You cannot use a magnifying glass to make something hotter than the source of the light. The thermodynamics principle is unyielding.

This is a tough, tricky thermodynamics question because it really seems like you can use a really big magnifying glass to make an object that is hotter than the sun. I remember working through it in a physics class.


If you directed all the input to the penny, then you've created a closed system that can't lose energy, and doesn't the whole system - including the sun - keep rising in temperature anyway?

Ultimately the outside of this inverted sun system needs to vent the whole of the energy, the energy can't just fall inwards.

What you're describing is a reversal though; it requires the penny to be much hotter than the sun, only as a result of the sun's energy flowing into it.


Isn't the bigger problem that sunlight is nearly parallel and moonlight is reflected off of a spherical surface?

How are you violating conservation of energy if you're taking all of the light that would hit a square mile of the earth and concentrating it down to the size of a penny?

If you can't concentrate light that way then how do focusing lenses on cutting lasers function? Makes no sense.


Indeed, you cannot focus all of the sunlight that would hit a square mile of Earth down to a penny. You just can’t. When you focus the sun, you’re not making an infinitely small spot, you are making a tiny image of the sun. The bigger your lens, the larger that image. You can’t get any more focused than “in focus”.

When you focus a cutting laser, you are imaging the shape of the laser cavity. The emission is coming from a very narrow spatial region, so you can focus it back down to a small spot. However, a laser that is out of alignment will often not be able to be focused to a small spot.


It's not because the moon is a spherical surface. If that were the issue, there would be a bright spot on the Moon where you can see the reflection of the Sun (you can see such specular reflection spots on e.g. many cars), and you could use that spot to light a fire. The issue is that the Moon barely reflects any of the light that falls on it. Most of the light is scattered, and most of the rest is absorbed and re-radiated.


The scattering seems like it should cancel out. In terms of total energy reaching a point on earth (or the circular disk of a lens), just as much should reach you because of scattering as fails to reach you because of scattering.

I don't think this changes the ultimate answer to the question.


It’s not parallel though. The sun and the moon subtend almost exactly the same angle in the sky (which is why solar eclipses work).


I don’t buy the argument that you can’t concentrate two beams on the same spot. Sure, you might not be able to do that with one lens, but concentrating on a small area is as good an approximation. But if you require that it really be a point, why can’t I do that with N lenses and mirrors that are fully reversible but are all aligned to aim at the same point from different angles?


The argument is that you can't concentrate two beams to hit the same spot from the same direction.

So yes, you can have a bunch of lenses each focusing from a different direction.

And if you do a good job of aligning them, each lens will look as bright as the sun/moon from the target.

But that's your limit.


Don't all the biggest telescopes on earth use multiple reflectors these days?


I'm not so sure about this.

Imagine you had a large cloud of planar mirrors, each of which can be aimed at any given point -- even points that overlap.

While I agree that you cannot focus a whole image to a smaller area than the diffraction limit allows for a continuous lens surface, if you omit diffraction, mirrors could certainly do it.


You are describing a parabolic concentrator.


> Lenses and mirrors work for free; they don't take any energy to operate.

Wait a minute. Does that mean that I could get a tiny solar panel and light it up with a lens, instead of getting big panels? Energy output of a panel is proportional to the amount of light that hits it, right?

I guess that lenses are ‘free’ only if they have no impurities, but even then, assuming solar panels are costlier than plastic lenses, I could save money.

Come to think of it, how is it that I haven't seen or heard about parabolic reflectors with solar panels in the focal point? Right now I've found an article about parabolic troughs that are apparently used to heat old-school fluids instead: https://en.wikipedia.org/wiki/Parabolic_trough



Making solar cells that can effectively handle concentrated sunlight is difficult (if they heat up, efficiency goes down, ...), so it's easier to just fill an area with solar cells instead of with mirrors pointing at a smaller cell area.


> if they heat up efficiency goes down

So you're saying I should cool them with water and use the steam to move turbines, then I'm golden!

\($ ∇ $ )/


I'm not sure if you are joking or not, but something like this is still used for solar power generation.

https://en.wikipedia.org/wiki/Solar_power_tower


There are companies that do this. They build tiny solar panels out of materials that can withstand high junction temperatures, and use fresnel lenses to concentrate light onto them.


Solar panels don't like heat.


Randall might be correct for conventional optics, but what about metamaterial lenses that break the diffraction limit? https://en.m.wikipedia.org/wiki/Superlens


I think they missed something. If the max temp you can get from moonlight is 100C, that doesn’t mean you can’t start a fire. You just need something that has a very low ignition temp. There must exist something that can start a fire at this temp.


Maybe: https://en.m.wikipedia.org/wiki/Carbon_disulfide

It doesn't appear that many substances have auto-ignition points as low as 100C.

If you're hunting for easy to ignite stuff, it might be better to go for low flash point stuff, and strike a spark?

E.g. gasoline will work in pretty cold environments, with a flash point of -43C.


There is a very interesting image in the article. It shows a bunch of light coming into a block and emerging as a beam. It also has a caption saying this is impossible.

It struck me as very similar to the setup in this video. https://youtu.be/awADEuv5vWY

At about 4 minutes in, a nearly identical setup is shown with a beam of light emerging from a block of opaque material using holographic techniques.

It seems plausible to me that a specially designed anti-moon hologram could allow reconstruction of the incident light from the sun, thus allowing a fire to be started without violating any law of thermodynamics.


I think I finally realised what rubs me the wrong way about these xkcd musings, and the related discussions.

[Start rant]

They take the situation to its absurd conclusion, then stop with a finality that people take as truth. Further absurd conclusions are ignored, and their word is law.

Some more absurd arguments for why you will be able to light a fire with a magnifier and moonlight are as follows:

1: use a pre-magnifier to create a spot on the moon with the same temperature as the sun, then magnify the light from that spot to start your fire.

2: wait a long time... eventually a meteor will hit the moon creating a spot bright enough to focus.

3: wait even longer... Eventually the random molecular collisions between the wood and the (presumably oxygen-rich) air around it will convert it into carbon dioxide and water, while serendipitous individual high-energy blackbody photons help break it down.

4... insert absurd^4 answer that brings in tunneling effects, or moving mirrors, or some other "impossible" reason that is only impossible because they didn't think of it for you. [End rant]


I'm not buying this. The Moon is only a reflector, not a producer of light. The temperature of the light is that of the reflected source. Moonlight is sunlight, more or less. The fact that the moon reflects poorly is compensated by the size of the gathering lens or mirror.

Suppose we build a 100 foot mirror which reflects 10% of sunlight, such that the spectrum remains the same. We could still make a fire with the reflected light, if we just gather 10 times more of it with a larger lens. We could do this in the Arctic, with the mirror's temperature at below zero; the mirror's temperature is irrelevant.


What matters is that we have intensity with quadratic falloff. Light from the sun has quadratic falloff based on the distance from the sun and hence can't be bent to become more intense than at the sun's surface. Similarly, light from the moon has quadratic falloff based on the distance from the moon and hence can't be bent to become more intense than at the moon's surface. If you put a mirror on the moon, then the light from it will have quadratic falloff from the reflected sun and not the moon, which is why it can be used to heat to sun-level temperatures.
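The "can't exceed the surface intensity" claim fits the inverse-square picture. A hedged sketch: starting from the sun's surface flux and applying quadratic falloff recovers the familiar solar constant (the radius, distance, and temperature here are round textbook values, not numbers from the thread):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)
T_SUN = 5778.0     # K, approximate solar surface temperature
R_SUN = 6.957e8    # m, solar radius
AU = 1.496e11      # m, Earth-Sun distance

surface_flux = SIGMA * T_SUN ** 4               # ~6.3e7 W/m^2 at the sun's surface
earth_flux = surface_flux * (R_SUN / AU) ** 2   # inverse-square falloff to Earth
print(round(earth_flux))  # ~1370 W/m^2, the familiar solar constant

# A passive lens can at best undo that falloff: it can never push the
# concentrated flux above surface_flux.
```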


Yes, since the moon is a scattering reflector, in fact the inverse square dropoff of moonlight is based on the distance from the moon.

Note that the article claims that no matter how much moonlight we are able to gather (i.e. we are allowed to overcome the inverse square law however much we want) we cannot create a temperature that will ignite paper.


You can, if you remember "the scientific principles of the convergence and refraction of light."

"The scientific principles of the convergence and refraction of light are very confusing, and quite frankly I can't make head or tail of them, even when my friend Dr. Lorenz explains them to me. But they made perfect sense to Violet."

Violet Baudelaire goes on to use the scientific principles of the convergence and refraction of light to set fire to a piece of sail cloth using only moonlight and the lens from a spying glass in "The Wide Window" by Lemony Snicket.

It's possible that this book is a work of fiction.


This explanation does not "click." I can't say whether it's technically wrong; I just note that it lacks some of the features of a successful explanation. First time I've seen a dud from Randall.


Here's a thought experiment to consider. Imagine creating a lens to focus the blackbody radiation from a stack of bricks onto a single brick, heating that brick up.

Now focus the light back onto just a small region of the brick pile, heating up that region, which in turn heats the rest of the bricks by conduction.

This in turn increases the amount of heat collected from the pile, ad infinitum or until the brick pile melts.

Short of letting the brick pile melt itself, imagine tapping into the excess heat and using it to power an electric motor for a useful purpose.


The problem is that some of the radiation from the moon is not blackbody radiation.


And what of materials that burn at a temperature below the temperature of the Moon? If 100C is the limit, there are materials that burn at much lower temperatures, such as phosphorus (34C).


I mean you can just take white phosphorus to a warm place and let it spontaneously combust.

The use case would be materials that don't burn at the Earth's surface temperature, but do burn at the Moon's peak surface temperature. But you could probably get those hot enough just by rubbing them or something.


One has to coat the material to be set on fire with something that has low infrared absorbance (and thus low cooling through heat radiation) but high absorbance at short wavelengths (blue/ultraviolet). This is called a selective coating.

https://en.m.wikipedia.org/wiki/Solar_thermal_collector

Combined with the lens, this might work.


I don't really follow this argument, and I would like to.

I think one thing which would help me develop an intuition for it would be to see the calculation of the lens size for heating a one square-centimeter area on the earth to as high a temperature as possible by the light of the moon, and what that optimal temperature is.

Anyone reading for whom this is straightforward? Even a description of how to do the calculation would go a long way.


You get the highest temperature by reflecting moonlight from every angle.

For a single lens you just want something really big. Make it take up 90+% of the sky from your target.

For temperature, the no-calculation way is to measure a rock on the moon (article says 100C) and use that number for how hot you should be able to get.

The calculation way goes as follows: Near earth you get 1400 watts per square meter of sunlight, so if that bounces perfectly off the center of a full moon and gets through the atmosphere with no losses, your target will get 1400 watts per square meter. That's equal to a black body at 122C. After taking into account the spherical shape and atmospheric losses you might get less than half of that, so ambient heat might drown out your results.


> You get the highest temperature by reflecting moonlight from every angle.

I see the thermodynamic principle at work here, but I don't really understand how it's operating at a mechanistic level. Is it possible to demonstrate that assertion optically?


Take your target and trace a ray in every direction away from every point on its surface. The more of these that hit the energy source, the more energy you're getting. And 100% is obviously the best you can do.


From an optical perspective, how does the argument dismissed at the start of the OP break down? Is there some reason the moon's light, gathered from a lens covering hundreds of acres, carries insufficient energy to light a fire in concentrated form?


The optical argument goes like this: measure the brightness (in watts per square meter) right at the moon's surface. No matter what you do with lenses, you can't concentrate moonlight beyond this level.

You can make an enormous lens where all the energy coming off a particular acre of moon goes through it. But you can't focus all of it onto a spot smaller than an acre, no matter what you do.


Thanks for the explanations.

The current top comment here seems to agree with my original intuition, though.

https://news.ycombinator.com/item?id=18739120


As a few people pointed out, the reflection off the moon ruins the étendue, so there's no concentrating it beyond what all the rocks on the surface already experience.


Thanks again for your help. I think that led me in the right direction.

This is the optical argument for maximum concentration given conservation of étendue (the same page has an optical argument for the conservation):

https://en.wikipedia.org/wiki/Etendue#Maximum_concentration

The angle subtended by the moon is approximately 0.54 degrees, or about 0.01 radians, so the maximum concentration factor is about 10,000. The moon provides about 0.1 lux of illumination, so the maximum illumination you can achieve by optics is about 1000 lux. The sun subtends about the same angle, and we receive 30,000-100,000 lux from it, and the maximum illumination you can achieve from concentrating it optically is about 1B lux.
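Roughly the same arithmetic in Python. One caveat: the textbook limit uses the half-angle of the source, which lands a few times above the rough 1/θ² figure of 10,000, but the sun/moon illuminance ratio (the part the conclusion rests on) is unchanged:

```python
import math

theta = math.radians(0.54)                 # full angle subtended by the moon (and sun)
c_max = 1.0 / math.sin(theta / 2.0) ** 2   # ideal concentration limit, ~4.5e4

moon_lux, sun_lux = 0.1, 100_000.0         # rough illuminance figures from above
print(round(moon_lux * c_max))             # best-case concentrated moonlight, ~4,500 lux
print(sun_lux / moon_lux)                  # sunlight is ~a million times brighter
```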

I'm willing to believe that you can't light a fire from the moon, if the intensity's a million times lower.


Reading the comments, I found something I don't understand. What is the difference between 1. black body photons and 2. laser photons? An object will heat only up to the original temperature of the black body source in the first case, but to an arbitrarily high temperature in the laser case...


Black body photons arrive at random times. Laser photons are coherent, so they are timed. Think of a swing - if you push it at random times, it will only swing about as far as your hardest push sends it. But if you push periodically, you can make it swing very far with small pushes.


That's not important at all; what's important is that black body radiation has a fixed maximum flux -- its spectrum or lack of coherence isn't why you can't heat another body to a greater temperature. It comes back to etendue, or if you prefer the 2nd law.

You could reproduce any fixed black body spectrum (to arbitrary accuracy) from a set of thermal sources and filters (or a set of lasers, LEDs, etc. with random phases) at arbitrary fluxes, just like a laser, and use this light to heat objects to arbitrary temperatures. But if the original emission is of black-body type, you cannot -- the flux is given by the quantum mechanical process and is a function of local temperature only. From there it follows from étendue conservation that you cannot achieve higher temperatures.


Interesting, this makes more sense. Now I have to read more on the conservation of étendue.


Thanks. So the laser is really special; because it is coherent it can be absorbed and transformed into heat without limit.

For random photons, no matter the light intensity, there is a limit temperature of the receiver, which depends on the photon spectrum...

From this point of view (energy transfer), are there more than these two kinds of light?


But won't two hard pushes in a row send you flying? I would have presumed you could find the mean and standard deviation of both the energy of the pushes and the frequency of the pushes and you could build a model to find the expected height.


I had the presumption that you could make a mega hot spot with a large enough lens back in the 90s. Fortunately usenet set me straight. It was a long and fascinating journey to understand all the whys about it.


It is a truism: no matter how many times this is explained, some engineer will come up with a complicated system that violates the laws of thermodynamics and refuse to admit that their idea is extremely unlikely. The laws of thermo are among the best understood and most well-supported physical theories that humans have yet developed. Every time somebody comes up with a perpetual motion machine, it gets shot down because the person who invented it literally ignored all the really well-understood math and physical theory in thermo.

It's like there's a brain bug where engineers think they can outsmart 200+ years of scientific progress with a clever arrangement of mirrors.


> It is a truism: no matter how many times this is explained

What exactly is the "this" that you refer to? I don't think the issue is that the "engineers" disagree with the physical principles; rather, they tend to disagree that the physical principles apply in quite the way that the author claims. Many of the engineers probably believe that Munroe is correct in claiming that one cannot start a fire with a low temperature blackbody radiation source regardless of the size of one's magnifying glass, but disagree that it is reasonable to consider sunlight reflected by the moon as fitting this model.

Presumably, you agree that it's possible to start a fire using a magnifying glass and sunlight on earth. I'd guess you also believe that it's possible to reflect light from a small handheld mirror into the magnifying glass and still light a match, even though the mirror is at a much lower temperature than the sun? While the specular reflection from the mirror is different from the diffuse reflection from the moon, one might note that the words "specular" and "diffuse" don't appear in Munroe's exposition. Would you agree that Munroe's argument would appear to prohibit this behavior?

Now assume that the moon were replaced by an equally sized parabolic mirror focused on the earth. Would it be possible to light a match using this light if one's magnifying glass were large enough? Which thermodynamic principle am I violating in thinking that it might be possible? And which part of Munroe's argument do I invalidate by making these modifications? Again, my point isn't that Munroe's conclusion is wrong, just that there might be something flawed about the argument he uses to reach it. This might be a "brain bug," but from the inside it just feels like an attempt to understand the truth.


The this I refer to is "the principle of etendue": https://en.wikipedia.org/wiki/Etendue

and I agree it's counterintuitive, and it's OK for people to come up with ideas. But when you hit the point of "hey, the thermo people have a nice collection of proofs demonstrating this, it fits very well with the underlying theory, and if you do manage to violate étendue, you could probably build a perpetual motion machine", and you willingly continue to argue and get shot down, it's time to go back and re-read the books.

BTW, what's your obsession with Munroe? What he's referring to is a scientific phenomenon; Munroe is just a science popularizer, and if he got the details wrong, well, the point of xkcds like that is more to inspire people with ideas than to get the exact details right.


The "this" I refer to is "the principle of etendue"

Great principle. I agree with it, and think that most of the people offering objections here do as well. Our question is whether it's being applied correctly in this case. For me, the sticking point is whether surface temperature can be used as a proxy for the brightness we care about.

BTW, what's your obsession with Munroe?

I have none. He's a great explainer, probably occasionally gets details wrong, and (so far as I can tell) is a positive force for spreading scientific understanding. My "obsession" seems to be that I have poor tolerance for overly broad smackdowns of critics. If you are going to tell someone else that they are wrong (as opposed to Munroe who is trying to explain what he believes is true) I think you have a higher obligation to get all the details right. I presume I'm sensitive to it because I'm often on the receiving end.

How about you? Why does it bother you so much that some people say that Munroe's argument is logically flawed?


I don't have any problem with people complaining about Munroe's argument. I am fairly certain he wrote what he did in consultation with optics engineers, and then modified it so it was comprehensible to a nerd-but-not-optical-engineer audience.


Thanks for the response. If you are still interested in this, there's a useful link on the second page of this thread to a discussion of the same question: https://physics.stackexchange.com/questions/370446/is-randal.... If you expand the comments on the answer, the back and forth presents most of the argument I would make, in a clearer form than I could muster.


I've often wondered how big of a lens you would need to grow a plant using a light source from outside of the solar system, if it's even possible.


Is it possible to light a fire using sunlight during an eclipse?

If not, at what percent totality does it become impossible?


Should be possible. Your hot spot under the lens is just as bright as always; it's just smaller and crescent-shaped. As for percent totality, that would depend on how fast your wood/fuel was dissipating heat.
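To put toy numbers on that: the spot's surface brightness is fixed by étendue, so the only thing a partial eclipse changes is how much total power reaches the spot. A rough sketch (all figures are round assumptions, ignoring lens losses):

```python
# Toy model: power delivered to the focal spot of a lens during a partial
# eclipse. The spot's irradiance (brightness per area) is unchanged; only
# the uncovered fraction of the solar disk, and hence total power, drops.
def focused_power(lens_area_m2, uncovered_fraction, solar_irradiance=1000.0):
    """Power (W) reaching the focal spot; solar_irradiance is an assumed
    ground-level value, lens losses are ignored."""
    return lens_area_m2 * solar_irradiance * uncovered_fraction

print(focused_power(0.01, 1.0))  # full sun through a 0.01 m^2 lens: 10.0 W
print(focused_power(0.01, 0.1))  # 90% totality: 1.0 W -- spot just as hot
                                 # per unit area, but far less power to
                                 # outrun the fuel's heat losses
```

So the cutoff totality isn't a universal number; it's whichever uncovered fraction drops the delivered power below the fuel's dissipation rate.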


One thing that makes me uncertain about this is the fact that the sun is generating its light from a fusion reaction, thus expending energy. In other words, the total entropy does not decrease, because the fusion reaction makes up for any entropy reduction from the cold-to-hot heat flow. This is the same reason a system consisting of a battery and a fridge works.


I'm pretty sure if I were to go pick up a number of magnifying glasses and focus them on the same point using moonlight, the temperature at that point would increase with every additional magnifying glass.

Am I to accept that the additional magnifying glasses would cease increasing the temperature once the temperature matched that of the moon's surface?


I don't know enough about physics to judge whether Randall's argument is correct, but in general it is possible for a sequence to be strictly increasing yet bounded.

For example, 1/2, 3/4, 7/8, 15/16, ...
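For what it's worth, that sequence can be checked in a couple of lines (a throwaway sketch; the values are just the ones from the comment, a_n = 1 - 1/2^n):

```python
# Strictly increasing, yet bounded above by 1: 1/2, 3/4, 7/8, 15/16, ...
seq = [1 - 1 / 2**n for n in range(1, 11)]

assert all(a < b for a, b in zip(seq, seq[1:]))  # each term beats the last
assert all(a < 1 for a in seq)                   # but none ever reaches 1
print(seq[-1])  # 0.9990234375
```

Each extra magnifying glass could play the role of the next term: always some gain, never past the bound.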


You can't focus light from a single source onto the same point with magnifying glasses at different positions. Try it with the sun: get two magnifying glasses and attempt to focus them on the same point at once; it's not possible.

With the addition of some well placed mirrors though, that's possible.


this article seems to refute some aspects of xkcd's answer:

https://physics.stackexchange.com/questions/370446/is-randal...


This is indeed a great link (expand the comments on the answer), and shouldn't be languishing at the bottom. Thanks for digging it up!


I love xkcd, but this is completely wrong.

It is well known that the spectral temperature of the moon is about 4000K. See for example : http://www.lumec.com/newsletter/architect_06-10/the_sun_the_... or https://physics.stackexchange.com/questions/244922/why-does-... . That is the maximum temperature that you can achieve with light from the moon, no matter how concentrated. 4000 K is plenty hot enough to start a fire.


Color temperature is a representation of the shape of a spectrum, matched to the blackbody spectrum it most resembles. It is not necessarily the temperature of the surface emitting the light. Because the moon absorbs some wavelengths of light more than others, it shifts the spectrum to resemble that of a blackbody at a different temperature. So depending on which wavelengths are absorbed, the color temperature can actually be shifted up or down.

For another example, the color temperature of standard incandescent bulbs is ~2700K, and this is similar to blackbody radiation. However, LED lights can have the same color temperature, but the surface of the LED is not at 2700K because the light emitted is being created through a different process than blackbody radiation.

That said, I am still wrapping my head around the xkcd explanation, because blackbody radiation from the moon's surface at ~100°C would not be enough, but there is also reflected sunlight at a 0.16 albedo.


Spectral temperature is irrelevant. It's effective temperature that matters. https://en.wikipedia.org/wiki/Effective_temperature
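To make the distinction concrete, here's a toy Stefan-Boltzmann calculation (round assumed figures, not from the article): the hottest a focused spot can get is the temperature of a blackbody with the same radiant exitance as the source surface.

```python
# Effective temperature from radiant exitance via Stefan-Boltzmann:
# T_eff = (M / sigma)**0.25. Assumed round numbers throughout.
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361  # W m^-2, roughly the flux at the Moon's distance

def t_eff(exitance_w_m2):
    """Temperature of a blackbody with the given radiant exitance (K)."""
    return (exitance_w_m2 / SIGMA) ** 0.25

# A sunlit lunar surface in equilibrium sends back (reflected + thermal)
# about what it receives, i.e. roughly the solar constant:
print(round(t_eff(SOLAR_CONSTANT)))  # 394 -- kelvin, i.e. ~120 C

# The Sun's own surface exitance (~6.3e7 W/m^2) recovers ~5800 K:
print(round(t_eff(6.3e7)))
```

The ~4000K color temperature never enters this calculation; only the exitance does, which is why the moonlit-spot bound comes out near 100°C rather than thousands of kelvin.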


Randall has a response to this argument in the article.


Does he though? Here's all I see -

> "But wait," you might say. "The Moon's light isn't like the Sun's! The Sun is a blackbody—its light output is related to its high temperature. The Moon shines with reflected sunlight, which has a "temperature" of thousands of degrees—that argument doesn't work!"

> It turns out it does work, for reasons we'll get to later.

But the rest of the article is about etendue, and I don't see how the issue of reflected light is addressed (though possibly the answer is implied with an etendue argument I missed).

I'm very curious now - it seems to me that if the Moon was a perfect mirror, you should be able to start a fire with it. Maybe the Moon's low albedo is the reason? Fun fact about the Moon: its albedo (roughly the fraction of incident light it scatters) is about the same as asphalt - not very reflective at all! [0] It just looks bright to us because the Sun is so tremendously bright.

[0] - https://www.lcas-astronomy.org/articles/display.php?filename... , also learned this in an astronomy class
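That albedo fun fact can be sanity-checked with a back-of-envelope Lambertian-disk estimate (all figures are approximate assumptions):

```python
import math

# How much dimmer is full moonlight than sunlight? Treat the Moon as a
# Lambertian (diffuse) disk with an assumed albedo of ~0.12.
E_SUN = 1361.0      # W/m^2, solar constant
ALBEDO = 0.12       # roughly asphalt-like
R_MOON = 1737.0     # km, lunar radius
D_MOON = 384400.0   # km, Earth-Moon distance

radiance = ALBEDO * E_SUN / math.pi             # W m^-2 sr^-1, Lambertian
solid_angle = math.pi * (R_MOON / D_MOON) ** 2  # sr, Moon's disk from Earth
e_moon = radiance * solid_angle                 # W/m^2 of full moonlight

print(f"moonlight irradiance ~ {e_moon:.4f} W/m^2")
print(f"sun/moon ratio ~ {E_SUN / e_moon:,.0f}")  # a few hundred thousand
```

So moonlight is on the order of 400,000 times weaker than sunlight, which is the other half of why the Moon only looks bright: the Sun is simply that much brighter.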


EDIT: I'm not sure about this anymore.


I think TFA refutes your argument, but I'm not smart enough to know for sure. Can you please expand on this?


What if the sun was a bunch of parallel laser beams? Then you could focus them all into a single point.


Right, but that point would be exactly as hot as the surface that the light was coming from, and no hotter.

(Is the argument as I understand it).


If this is true, how do laser cutters work? The focused beam can melt through steel, but surely the temperature of the gain medium is much lower than the melting point of steel?

But the thermodynamics argument seems like it makes sense too...


Lasers have negative temperature. The light emitted from the sun is nothing like laser light.

https://en.wikipedia.org/wiki/Negative_temperature#Lasers


Like the other commenter said, lasers are not black-body emitters. They "pump" light out and are very far from equilibrium.


Wouldn't that kind of make the National Ignition Facility [0] impossible?

0: https://en.wikipedia.org/wiki/National_Ignition_Facility


From what I've read in the other comments, the argument doesn't apply to lasers.


Lasers are kept far from equilibrium by design, unlike blackbody radiators which are in equilibrium with the electromagnetic field they are radiating in to.


On second look, the thing I thought of doesn't work, so I retract the statement...


This is great, I always love the xkcd "What if" explanations. I even have the book sitting on my desk next to me.


You can, with a magnifying glass 2.3 million times bigger.



