Hacker News
Running an LED in reverse could cool future computers (phys.org)
46 points by ChrisGranger on Feb 17, 2019 | 27 comments



The stated theoretical limit of 1000 W/m^2 does not sound like much when compared to the TDP of contemporary chips.
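For a sense of scale (my own rough numbers, not from the article; the die size and TDP below are just typical assumed values):

    #include <cstdio>

    int main() {
        // Rough, assumed numbers for a desktop CPU; not from the article.
        double tdp_watts = 95.0;        // typical desktop TDP
        double die_area_m2 = 150e-6;    // ~150 mm^2 die, expressed in m^2
        double chip_flux = tdp_watts / die_area_m2;

        std::printf("chip heat flux: ~%.0f W/m^2 vs. the 1000 W/m^2 limit\n", chip_flux);
        // -> roughly 630,000 W/m^2, i.e. hundreds of times the stated limit
    }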


Perhaps a stupid question: is this like photovoltaic cells, but in the thermal part of the spectrum?


Actually, as far as I know, photovoltaic cells are "common" diodes. And every diode emits light while forward biased; it's an inherent property of them. Electrons "leap" across the band gap in a p-n junction and emit a photon while "going down". And the reverse is true: a photon of a certain wavelength incident on a p-n junction will energize an electron if its energy matches the band gap, thus producing a voltage difference.
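To put a number on "wavelength matches the gap": photon energy and wavelength are related by roughly λ ≈ 1240 eV·nm / E. A quick sketch (my own illustration, not from the article; the 1.9 eV figure is just a typical red-LED band gap):

    #include <cstdio>

    int main() {
        const double hc_eV_nm = 1239.84;   // Planck constant times speed of light, in eV*nm
        double gap_eV = 1.9;               // assumed band gap of a typical red LED
        double wavelength_nm = hc_eV_nm / gap_eV;
        std::printf("%.2f eV gap -> ~%.0f nm photons\n", gap_eV, wavelength_nm);
        // -> ~652 nm (red); photons near this energy are also the ones such a diode absorbs best
    }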

This is used in more interesting ways with LEDs, since their construction exposes the junction to the environment. You can use their small junction capacitance while they're charged, and its variability when exposed to photons, to create "light-based touch sensors": you emit light of the same wavelength with neighbouring LEDs while one of them is "turned off", and time its discharge rate. Faster? Light is shining on the junction. Slower? There are no matching photons hitting the junction.
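Here's a minimal Arduino-style sketch of that trick, assuming the LED sits directly between two GPIO pins (the pin numbers and timeout are made up; this is the well-known "LED as light sensor" technique, not anything from the article):

    // Reverse-charge the LED's junction capacitance, then time how long the
    // stored charge takes to leak away. More light on the junction = faster discharge.
    const int ANODE = 2;     // hypothetical pin choice
    const int CATHODE = 3;   // hypothetical pin choice

    long measureDischargeMicros() {
      // Reverse-bias the LED to charge its junction capacitance.
      pinMode(ANODE, OUTPUT);
      pinMode(CATHODE, OUTPUT);
      digitalWrite(ANODE, LOW);
      digitalWrite(CATHODE, HIGH);
      delayMicroseconds(50);

      // Float the cathode and wait for the charge to bleed off via photocurrent.
      pinMode(CATHODE, INPUT);
      long start = micros();
      while (digitalRead(CATHODE) == HIGH) {
        if (micros() - start > 30000) break;   // give up after ~30 ms (dark)
      }
      return micros() - start;                 // shorter time = more light
    }

    void setup() { Serial.begin(9600); }

    void loop() {
      Serial.println(measureDischargeMicros());
      delay(100);
    }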

So, I guess yes, it's like PV cells but for even wider wavelengths (AFAIK PV cells already try to harvest most of the sun's infrared light).


How does this differ from a solar cell? (It would make sense to capture heat/IR with one and just feed it back as electricity.)

Or did they just happen to use an LED, and could the same thing be done with a solar cell (feeding some power the wrong way to make it suck in a bit more)?

So it's a bit like a solar cell with a motor as a load, except you spin the motor in reverse to suck a bit more out of the solar panel than it would be willing to give on its own(?).



Yeah, this is why I wondered why not just use a solar cell in the first place.


It sounds like by "in reverse", they mean they're applying current in reverse to the LED, not using it to convert infrared light into electricity, as with photovoltaics. What's not clear to me, though, is where the thermal energy actually goes.


I definitely think generating electricity is the way to get the energy/heat out. I think they just apply a reverse voltage to drain the LED actively.


Not sure I quite understand how this works. The heat / infrared emission it absorbs has to go somewhere, right? I haven't seen any mention of that in the article.


(Disclaimer: I have some background in physics, but I'm not an expert.)

I think the heat is transferred from the "computer" to the LED itself. So the LED itself heats up.

To go into a bit more detail:

All physical objects emit thermal radiation (a.k.a. black-body radiation [edit: not quite correct]). The hotter the object is, the more radiation it emits and the shorter the wavelength of that radiation. For "red hot" objects, the radiation is in the visible part of the spectrum; for room-temperature objects, it's in the infrared, so you can't see it, but it's still being emitted. If you put two objects of different temperatures next to each other, they'll both emit radiation and absorb each other's emitted radiation. The hotter one will emit more radiation than the cooler one, so there will be a net transfer of energy from the hotter one to the cooler one (normally).

But in this experiment, when the researchers run the LED in reverse, this somehow suppresses the LED's normal thermal radiation emissions. The "computer" continues to emit thermal radiation, which is absorbed by the LED just like normal. But the LED doesn't emit thermal radiation back to the computer (at least, not as much as it normally would). So even though the LED is hotter than the computer, the LED acts like it's colder for thermal radiation purposes. So there's a net transfer of energy from the cooler "computer" to the hotter LED.
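To make "acts like it's colder" concrete, here's a toy black-body estimate (my own illustration, not from the paper; the temperatures are made up, and the paper actually relies on near-field coupling that goes well beyond this simple picture):

    #include <cstdio>
    #include <cmath>

    int main() {
        const double sigma = 5.670374419e-8;   // Stefan-Boltzmann constant, W/(m^2 K^4)
        double T_chip = 300.0;                 // "computer" surface, K (assumed)
        double T_led_actual = 310.0;           // LED's real temperature, K (assumed)
        double T_led_effective = 250.0;        // temperature the reverse-biased LED "acts" like, K (assumed)

        // Normally the hotter LED would warm the chip:
        double q_normal = sigma * (std::pow(T_led_actual, 4) - std::pow(T_chip, 4));
        // With its emission suppressed, the LED behaves like a colder object, so heat flows chip -> LED:
        double q_biased = sigma * (std::pow(T_chip, 4) - std::pow(T_led_effective, 4));

        std::printf("normal: ~%.0f W/m^2 into the chip\n", q_normal);
        std::printf("reverse-biased: ~%.0f W/m^2 out of the chip\n", q_biased);
    }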

Of course, this isn't a complete solution; you still have to do something about the heat once you've transferred it to the LED. The article doesn't address this part of the problem. But I speculate you could just attach a heat-sink to the LED. Heat sinks work better at higher temperatures, so the computer+LED+heat-sink solution would be more effective than a computer+heat-sink alone.


I agree with you up to the last paragraph. It sounds to me like the LED is reverse biased, like a silicon particle detector. A normal LED works by combining electrons and holes to make photons; in this case, electron-hole pairs are made by light. The reverse bias causes them to drift apart, resulting in a small current opposing the bias.

For this reason I would interpret the power supply producing the bias as the load, and the LED as the power supply. This is how energy is removed from the system.


> All physical objects emit thermal radiation (a.k.a. black-body radiation)

Not all materials are black-body emitters. The black-body spectrum is in fact an idealized emission spectrum which only a few materials approach.

An LED most definitely is not a black body emitter.

You can engineer materials that absorb/emit a specific part of a spectrum and reflect/don't emit in other parts. This has uses in thermal engineering - e.g. you can shed heat by making things "white" in most of the spectrum but "black" in the infrared windows of the atmosphere. In this case they tune the behavior electrically.


True, but the distinction is irrelevant in this case.


Far from it. If all physical objects were black bodies then the LED couldn't do anything here since emission and absorption are the same process.


The article notes that the LED "acts as a very low temperature object". I interpret this to mean "the LED radiates as though its temperature is lower", rather than "the LED radiates as though its emissivity is lower". In other words, AFAICT the effect can still exist even if the LED has 100% emissivity in both absorption and emission, which would make it a black body. So the fact that the real LED's emissivity is not 100% is not relevant to the effect. But we're splitting hairs here.


The paper is talking about effective temperature for specific photon energies. Additionally, black-body emission refers to thermal equilibrium conditions, which clearly do not apply to the LED. And then there's the near-field coupling they're using:

> Recent experimental advances in near-field radiation have shown that heat-transfer rates on the nanoscale can exceed the blackbody limit by several orders of magnitude14–17, owing to contributions from evanescent and surface modes18,19. As a result, energy conversion rates can be greatly enhanced on this scale20.

So this whole system relies on not being a black body on several levels.


Even that will only work if they don't produce more heat doing it than they pump away.

Same issue as with anti-Stokes scattering, which once held the promise of "self-cooling LEDs", or with using gas dynamic lasers for spaceship cooling.

There is no trick against thermodynamics after all.


An LED converts current into light. Reverse biasing it causes it to convert light into electricity (i.e. it acts as a photodetector).


So how does it work with the law of conservation of energy? [1]

[1]: https://en.wikipedia.org/wiki/Conservation_of_energy


Like you'd expect? No one is making claims of free energy. It's like putting a Stirling engine on the side of an internal combustion engine: you have a chance to grab some of the waste heat, so why not use it? Specifically, capturing that waste heat will boost efficiency by some small percentage; if the additional system's total construction cost is less than the value of the efficiency gained over the lifetime of the system, why not?


It's just converting between two forms of energy: total energy is still conserved. You might be thinking about the second law of thermodynamics, which limits how much can be converted.


Presumably the electricity would be used to provide some of the energy requirements of the computer - in effect, the tower would produce less "waste" heat.


This is an interesting effect.

How does the strength and efficiency of this effect compare to the cooling possible with Peltier junctions?


You can move a watt per cm^2 with a single-stage Peltier effect device, depending on the temperature difference you need. Here's a data sheet for a 9mm by 9mm device that can do 1W/cm^2 at 40C temp difference:

https://cdn2.hubspot.net/hubfs/547732/Data_Sheets/NL1012T.pd...

The OP claims a theoretical max of 1000 W/m^2 for the optical cooler. That's 0.1 W/cm^2. Right now they say they're at 0.6% of that theoretical max.
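Restating that arithmetic (numbers from the article and the data sheet above, nothing new):

    #include <cstdio>

    int main() {
        double optical_max_W_per_m2 = 1000.0;                        // theoretical max from the article
        double optical_max_W_per_cm2 = optical_max_W_per_m2 / 1e4;   // 1 m^2 = 10^4 cm^2 -> 0.1 W/cm^2
        double demonstrated_W_per_cm2 = optical_max_W_per_cm2 * 0.006;  // ~0.6% of that max so far
        double peltier_W_per_cm2 = 1.0;                              // single-stage Peltier, per the data sheet

        std::printf("optical max: %.3f, demonstrated: %.4f, Peltier: %.1f (all W/cm^2)\n",
                    optical_max_W_per_cm2, demonstrated_W_per_cm2, peltier_W_per_cm2);
    }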


"Near-field photonic cooling through control of the chemical potential of photons" (2019) https://www.nature.com/articles/s41586-019-0918-8


This sounds like a previously known effect called "negative luminescence": https://en.m.wikipedia.org/wiki/Negative_luminescence


Cool! It's like an infrared light vacuum!



