The linked article and the title are pretty misleading. But if you go and look at the abstract for the paper, they mention that the extra energy comes from pumping heat out of the environment.
Something is missing here. You cannot generate energy just from heat; you must have both a heat source and a cold sink. It's an extremely fundamental thermodynamic law.
Where is the cold sink in this experiment? (I'm not saying there isn't one, just that the article makes no mention of it, which is a huge omission.)
Energy is certainly not being generated. The energy of the light output will be equal to the heat absorbed plus the electric energy input.
If you want to think of it in terms of temperatures: I'm a little fuzzy on the details, but I think empty space in a sense has a temperature associated with it, based on the radiation present (similar to how black-body radiation is a function of temperature).
I guess it would work if the LED were heated above the temperature of the room where you receive the light.
So you'd have to put this into an oven for it to work. If you just left it in a room and expected it to put light into that same room (i.e. the device and the room at the same temperature), it would not work.
Yeah, if we could cool one place without heating another by more than the amount of heat we removed, it would break the 2nd law of thermodynamics: we could then use the temperature difference to extract energy, ending up where we started but with more energy. We could repeat such a cycle as many times as we want == infinite free energy.
But there was no temperature gradient before, and there is one now, and it's possible to use this gradient to produce energy. So this energy had to come from somewhere. It didn't come from the electricity, because the LED already has >100% efficiency (there's more energy in the photons than in the electricity we put into the system).
So it has to come from the temperature difference between the place we cooled and the place we heat up with photons, I think. That place had to be cooler than the LED before the experiment, so the temperature gradient was there from the start, and we are only decreasing it.
Conservation of energy is not the issue. It's not enough just to have energy in = energy out. In order to turn heat into any other kind of energy you MUST have a cold sink.
If you want net black-body radiation, it's not enough to just have a dark area: the recipient of the light must be colder than the source.
So this would only work if the LED were heated above the room around it. I guess they could put it in an oven and then view it from a window in the colder room.
> In order to turn heat into any other kind of energy you MUST have a cold sink.
No, at small enough scales this is not necessary. This is not "heat moving from a hotter object to a colder object and doing work on the way"; this is "heat being directly converted to photons". When you view a patch of hot gas as individual molecules bouncing about, you can transform the heat of a molecule into momentum of your target at 100% efficiency (heat is kinetic energy!). In quantum environments, the second and third laws do not work like they do in the macroscopic world. They are still not broken at large scales, because to do this kind of trickery you need a lot of exact information, and that has a cost in entropy. (So you'd need a Maxwell's demon to scale this up.)
You can only convert the kinetic heat motion of a single molecule to kinetic energy in your target if your target is colder than the source! And there's your cold sink.
The efficiency is not 100%, because the target sends energy back to the source: the target is itself moving (from its own thermal kinetic energy).
If the target were standing still, the efficiency would be 100%. But that's the same as saying the target is at absolute zero, and we already know that Carnot efficiency is 100% when the sink is at absolute zero, so it makes no difference that you are dealing with a single molecule.
As for your second point, that this is "heat being directly converted to photons": that's exactly the definition of blackbody radiation. But a blackbody also absorbs radiation from the environment it sits in, so it's not perfectly efficient either.
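To make the "recipient must be colder" point concrete, here's a minimal sketch of the net radiative exchange using the Stefan-Boltzmann law (the temperatures, emissivity, and area are made-up illustrative values):

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

    def net_radiated_power(t_source_k, t_env_k, emissivity=1.0, area_m2=1.0):
        # Positive means net emission; zero when source and environment
        # are at the same temperature, so no net energy leaves the body.
        return emissivity * SIGMA * area_m2 * (t_source_k**4 - t_env_k**4)

    print(net_radiated_power(400.0, 300.0))  # hot body, cooler room: ~992 W net out
    print(net_radiated_power(300.0, 300.0))  # same temperature: 0.0, no net flow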
It's simply not a full-system analysis: the >unity efficiency is with respect to the input electrical energy only. There's a corresponding draw of energy from the environment that is also producing photons.
>"You can only convert the kinetic heat motion of a single molecule to kinetic energy in your target if your target is colder than the source! And there's your cold sink." //
That seems counterintuitive. You're saying that a small fast-moving body can't impart energy to a large slow-moving body. Can you explain further how this works at the single-molecule level?
> There's just a corresponding draw of energy from the environment that is also producing photons.
It is only possible to do that if the photons are sent somewhere that is colder than the device.
> You're saying that a small fast moving body can't impart energy to a large slow moving body.
No, I'm saying it can't impart all of its energy, and the more energy the other body has, the less energy it can impart. In other words, it's exactly Carnot efficiency.
Perhaps what you don't realize is that Carnot efficiency doesn't represent lost energy. It represents heat energy that cannot be converted to another form and remains in the source.
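To put numbers on that: the Carnot limit is 1 - T_cold/T_hot, and the unconverted fraction stays as heat. A quick sketch (the temperatures are illustrative):

    def carnot_efficiency(t_hot_k, t_cold_k):
        # Maximum fraction of heat drawn from the hot source that can be
        # turned into work; the remainder isn't destroyed, it stays heat.
        return 1.0 - t_cold_k / t_hot_k

    print(carnot_efficiency(300.0, 300.0))  # 0.0 -- no cold sink, no work
    print(carnot_efficiency(300.0, 150.0))  # 0.5 -- half convertible, half stays heat
    print(carnot_efficiency(300.0, 0.0))    # 1.0 -- sink at absolute zero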
There needs to be a cold sink since heat is just a variation on kinetic energy (small vibrations), and momentum needs to be conserved. The released photons have momentum too, and it looks like they get that from the heat (=vibrations) in the LED.
It's not about conservation of energy, it's about entropy. If you have a device that can absorb heat from the environment and convert it into light without any form of cold sink, you're reducing entropy.
Most inhabited spaces aren't strictly temperature-regulated, though, so there is still some potential for energy subsidization.
There might be some applications in solid-state heat pump technology as well. Actually, I would think that's the more interesting possibility, given that the field is still maturing.
Most cooling systems seem pretty complex/bulky/energy-inefficient to an ignorant sob like me, but if they can make one simple enough to fit into an LED, and given that this cooling mechanism relies only on optical output and electricity, it could have a lot of cool applications we haven't thought of.
Cooling systems don't have to be complex: look up Peltier coolers.
But they are energy-inefficient because the laws of thermodynamics require it. (You can make them more efficient by not cooling as low, trading temperature for volume. In an A/C that would mean doing a 20-degree drop instead of a 40-degree drop, but with twice the air flow. The net result for the living space is the same, but the efficiency is much improved.)
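A rough sketch of why, using the ideal (Carnot) coefficient of performance for cooling, COP = T_cold / (T_hot - T_cold); the temperatures here are illustrative:

    def cooling_cop(t_hot_k, t_cold_k):
        # Ideal (Carnot) coefficient of performance for a cooler:
        # heat removed per unit of work put in.
        return t_cold_k / (t_hot_k - t_cold_k)

    room = 295.0  # K, roughly a warm room (illustrative)
    print(cooling_cop(room, room - 40))  # 40 K drop: COP ~6.4
    print(cooling_cop(room, room - 20))  # 20 K drop: COP ~13.8, about twice as good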
> Although scientifically intriguing, the results won’t immediately result in ultra-efficient commercial LEDs since the demonstration works only for LEDs with very low input power that produce very small amounts of light.
This reminded me of an Asimov short story about an "experimenter" (engineer?) vs a pure scientist. In the story, a ship was trying to enter Jupiter (IIRC) but needed a "force field". The scientist created a force field... which had one problem: it could only be turned on for a very short time before it became unstable (and, I think, exploded).
The scientist then spent a lot of his time trying to come up with a better force field that could remain active longer, but he stopped when he realized it was theoretically impossible (and hence, it was impossible to land on Jupiter). Meanwhile, the "experimenter" achieved the impossible with the already-created field, by turning it on and off very quickly and thus preventing it from destabilizing!
What does it have to do with this? Well, what if we had hundreds of those very dim LEDs? That might provide usable light.
Looking at the figure attached to the article, you can see that the only >100% efficient trial was at an ambient temperature of 135 °C (two other trials, at 84 °C and 25 °C, were much less efficient). The extra energy comes from the environment.
Still sounds like a promising technology, considering the fact that a lot of electronics operate in warm environments such as the inside of a computer. If we could harness a portion of that excess heat and do something useful with it, that would be pretty cool (no pun intended). Think of a mobile device whose screen is partially powered by excess heat from the processor, or a datacenter where some interior lighting is powered by excess heat from the server racks.
I'm not sure if this would work the same way, but I know that some high voltage LEDs are actually just strings of multiple LEDs in series [1]. Though it clearly wouldn't reach the inflated efficiency in this article, this could lead to a chip with a large number of very very small LEDs reaching a higher efficiency than a standard LED. It obviously depends on the manufacturing process though.
Wow: 69 whole picowatts of light if the ambient temperature is about 200F. And fundamentals of the physics mean that neither of those figures is likely to change.
If the claim were more spectacular, we'd call this snake oil.
Their "efficiency" is light output vs electrical input. The result is actually that it starts converting heat as well as electricity into light. So you get more energy in light out than you put electricity in. You just put more energy in than electricity.
Vaguely like a heat pump: electrical energy stimulates the conversion of thermal energy to light. So instead of using electricity to produce light and heat, the LED uses electricity and heat to produce light.
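A toy energy balance makes the bookkeeping explicit (the numbers are invented for illustration):

    electrical_in = 1.0   # energy from the wall
    heat_absorbed = 0.5   # energy pumped out of the warm surroundings
    light_out = electrical_in + heat_absorbed  # conservation of energy

    print(light_out / electrical_in)                    # 1.5 -> "150%" wall-plug efficiency
    print(light_out / (electrical_in + heat_absorbed))  # 1.0 -> full-budget efficiency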
Well, they didn't take into account the energy supplied by the ambient heat (the exact temperature was not mentioned in the article, and the paper is behind a paywall, but they do say it was rather high).
> [it] instead took advantage of small amounts of excess heat to emit more power than consumed. This heat arises from vibrations in the device’s atomic lattice, which occur due to entropy.
They turned heat into light using electricity. That may not be what people were thinking when reading the title/article, but I find this very cool, and I did not previously know it was possible.
So could you just have a microarray of millions of these to get a practical amount of light at an insanely low electricity cost? (And given jnhnum1's clarification, also be cooling the room slightly?)
I may be way off, but I don't think that's how multipliers work. That is, even if you had a bunch of them, they should still have the same aggregate efficiency, no?
My thought wasn't about changing the efficiency, but about making it useful. The article mentioned the effect was only evident at very low power levels, creating very small amounts of light. Light, I believe, does 'add up', so millions of such devices might provide a useful amount of light while still having an attractive efficiency. (At least, inside an oven-like environment.)
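A quick sanity check on the scale, using the 69 pW per-device figure quoted upthread (the 0.1 W target is my own illustrative number, roughly night-light territory):

    per_device_w = 69e-12  # 69 pW per device, the figure quoted upthread
    useful_light_w = 0.1   # illustrative target for "useful" optical output

    print(1_000_000 * per_device_w)       # 6.9e-05 W from a million devices
    print(useful_light_w / per_device_w)  # ~1.4e9 devices needed for 0.1 W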