> Researchers at Rice University developed a method to convert heat into light that could boost solar efficiency from 22% to 80%
The conditional tense signals that the researchers didn't actually do so, but that the study might enable it. The article reiterates that this is speculation:
> The implications of their discovery are significant. Research from Chloe Doiron, a Rice graduate student, revealed that 20% of industrial energy consumption is wasted through heat. It could also mean an increase in the efficiency of solar cells, which are currently only 22% efficient at their peak. Recycling the thermal energy from solar cells using carbon nanotube technology could increase the efficiency to 80% according to the researchers. ...
I can see this being somewhat useful as a no-moving-parts heat engine for something like a solar concentrator, but there’s another relevant thermodynamic limit: even if this magic material has emissivity 1, it won’t radiate at a greater power per unit area than the blackbody spectrum predicts. At non-crazy temperatures, this is not very high, which will limit output for small things like solar concentrator targets.
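To put rough numbers on that blackbody ceiling, here is a quick Stefan-Boltzmann sketch (the temperatures are arbitrary illustrative values, not from the article):

```python
# Stefan-Boltzmann law: even a perfect emitter (emissivity 1) cannot radiate
# more than sigma * T^4 per unit area into the far field.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def blackbody_flux(temp_kelvin: float, emissivity: float = 1.0) -> float:
    """Total radiated power per unit area, in W/m^2."""
    return emissivity * SIGMA * temp_kelvin ** 4

# At "non-crazy" temperatures the available power density is modest;
# for comparison, direct sunlight delivers roughly 1,000 W/m^2.
for t in (300, 600, 1000, 1500):
    print(f"{t:>5} K -> {blackbody_flux(t):,.0f} W/m^2")
```

At 300 K that works out to only about 460 W/m^2, which is why a small emitting area at modest temperature can't drive much output.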
So I can see this being useful to convert waste industrial heat, or maybe as a bottoming engine for a combined cycle plant, but I am having trouble understanding how it could be useful for solar.
The devices, as far as I'm familiar with them, are often called thermophotovoltaic cells: https://en.wikipedia.org/wiki/Thermophotovoltaic
See eg http://xlab.me.berkeley.edu/pdf/259.pdf for a great overview, esp section 3.3.
I recently saw this video, which I found accessible with an undergrad physics background and which got me interested: https://www.youtube.com/watch?v=XnVVyTD7CzM
One way or another, once you've converted sunlight to heat, you are limited by the Carnot efficiency. For the 80% efficiency they claim, if all of it comes from thermophotovoltaics, they need a hot side temperature at least 5x ambient, which is over 1000 C. I wish them luck getting anything resembling a solar panel up to 1000 C. (I'm not, in any respect, saying it's impossible -- I'm saying it's very hard. You'd need excellent spectrally or directionally specific absorption to avoid re-radiating all that heat out the top of your panel, and you'd need conventional transparent insulation to stop conduction.)
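The arithmetic behind that 5x figure, as a sanity check:

```python
# Carnot efficiency: eta = 1 - T_cold / T_hot. Solving for the hot side
# shows what an 80% heat-engine efficiency demands with an ambient cold side.
def carnot_hot_side(eta: float, t_cold_k: float) -> float:
    """Minimum hot-side temperature (K) for a given Carnot efficiency."""
    return t_cold_k / (1.0 - eta)

t_hot = carnot_hot_side(0.8, 300.0)  # cold side at ~300 K (ambient)
print(f"hot side must be at least {t_hot:.0f} K ({t_hot - 273.15:.0f} C)")
```

That comes out to 1500 K, about 1230 C, consistent with the "over 1000 C" above.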
On top of that, super-Planckian emission or no, if it's limited to the near field, then the PV cell is very, very close to the hot surface. That PV cell needs to be kept near room temperature to get that efficiency.
This whole thing seems extraordinarily complex for something that wants to be cost-effective.
I understand your point that it can't just be driven by a secondary IR emission from a warmed-up solar panel; that's not hot enough to be that useful.
In a more detailed article, there is talk about the carbon nanotube device being useful because it can withstand high temperatures:
They of course understand that they need a big temperature differential.
Just go to the source: https://news.rice.edu/2019/07/12/rice-device-channels-heat-i...
As a rule, everything around you glows – in other words, it emits photons. Some infrared ones, some visible ones, some ultraviolet (or even higher-frequency) ones. How much of each depends on the temperature of the material. Most things (e.g. you) are too cold to be visibly glowing, but they still emit plenty of infrared radiation.
Now, standard solar cells convert visible photons into electricity. Any incoming infrared photons, however, aren't strong enough to kick any electrons loose, so they go to waste. Here, the researchers used carbon nanotubes to absorb the wasted infrared photons. This heats up the nanotubes, which then emit visible photons, which then increase electricity generation.
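A quick way to see why the infrared photons are wasted: a photon must carry at least the bandgap energy (about 1.1 eV for silicon) to kick an electron loose. A sketch using standard physical constants:

```python
# Photon energy E = h * c / lambda, expressed in electron-volts and compared
# against the approximate silicon bandgap.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron-volt
SI_BANDGAP_EV = 1.1  # approximate bandgap of silicon

def photon_energy_ev(wavelength_m: float) -> float:
    return H * C / wavelength_m / EV

for name, lam in [("green, 550 nm", 550e-9),
                  ("near-IR, 1000 nm", 1000e-9),
                  ("mid-IR, 5000 nm", 5000e-9)]:
    e = photon_energy_ev(lam)
    verdict = "usable" if e >= SI_BANDGAP_EV else "wasted"
    print(f"{name}: {e:.2f} eV -> {verdict}")
```

A 5000 nm mid-infrared photon carries only about a quarter of an eV, far below the silicon bandgap, while visible photons clear it comfortably.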
There is no way to convert heat to electricity directly, unfortunately. However, you can get small amounts of electricity out of temperature differences: https://en.wikipedia.org/wiki/Thermoelectric_effect
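For scale, a back-of-envelope Seebeck estimate (the coefficient is an assumed textbook-style value, roughly that of bismuth telluride, not a figure from this thread):

```python
# Seebeck effect: open-circuit voltage ~ S * delta_T per thermocouple.
SEEBECK_V_PER_K = 200e-6  # assumed ~200 uV/K, typical of bismuth telluride

def seebeck_voltage(delta_t_k: float, n_couples: int = 1) -> float:
    """Open-circuit voltage of n thermocouples wired in series."""
    return n_couples * SEEBECK_V_PER_K * delta_t_k

# A single junction across a 50 K difference yields only ~10 mV, which is
# why practical modules stack on the order of a hundred couples in series.
print(f"{seebeck_voltage(50):.3f} V per couple")
print(f"{seebeck_voltage(50, n_couples=127):.2f} V for a 127-couple module")
```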
Well, if you have enough heat, you can boil water and then use the steam to turn a turbine... I suppose in some sense this is dependent on a temperature difference, but I wouldn't call the amount of electricity we get by this method 'small.'
It is absolutely dependent on a temperature difference, because it relies on cooler water that isn't already flashing into steam.
Once a solar cell is built with this technology, do you think there will be enough light emitted by the nanotubes to make the solar panels appear to glow, themselves? Because that would be pretty interesting.
> Because electrons in nanotubes can only travel in one direction, the aligned films are metallic in that direction while insulating in the perpendicular direction, an effect Naik called hyperbolic dispersion. Thermal photons can strike the film from any direction, but can only leave via one.
So, from a pragmatic perspective, I would say that no, they wouldn't glow - proper engineering would ensure these particles are collected.
I'm a bit skeptical that that would work. Otherwise you could build a perpetual motion machine by having the nanotubes in a hot room and running your perpetual motor off a solar cell powered by the light.
My guess is that when the article says "emit light" they mean emit infrared light/radiation rather than visible. The article doesn't really make sense to me - I imagine someone got confused by the physics rather than it being an actual breakthrough in energy generation. I note the article says "propose a way to build" rather than having actually built something that works.
Critically, these photons are not heating up the material. It’s like saying photons reflecting off a mirror heat it. No, the photons that are not reflected do heat the mirror, but not the reflected ones. (At least mostly.)
Light with a short enough wavelength (UV-B, for example) has enough energy to damage the DNA in our skin and increase our risk of skin cancer.
I don't know why they're calling that "light". It's not clear to me if it means visible light. The paper seems to imply that it peaks in the near-infrared. They appear to be using "light" in the general sense of "photons"... but it's all photons.
They're talking about very high temperatures, the kind that emit significant visible light as well.
The exact details depend on the chemical composition of the material, but for a simple model, check out https://en.wikipedia.org/wiki/Black-body_radiation
All physical objects emit electromagnetic radiation corresponding to their temperature (as opposed to reflected, reaction-induced, or stimulated radiation, more below), with a corresponding blackbody radiation curve. Note that blackbody emissions don't have a specific frequency (say, unlike laser light), though the curve has a peak emission.
Traditional incandescent light bulbs are blackbody emitters, as is a hot coal, the Sun and other stars, or the glowing elements of an electric radiant heater. Human eyes see blackbody radiation, in the visible range, as having a characteristic colour, which paradoxically gets bluer as the temperature gets higher, so the ruddy tones of low-voltage incandescent lamps are a low colour temperature, and the intense white of halogen lamps is a high colour temperature. Blackbody radiation and colour temperatures are given in Kelvin, with typical visible light corresponding roughly to ~3,000 - 8,000K. You might also recognise these values from gamma or colour correction values on monitors and video equipment.
(Colour temperatures were used to judge processing for ceramics and metal processing / firing / smelting processes as well. They're also used to classify stellar temperatures, and in conjunction with brightness and/or distance estimates can be used to find stellar sizes.)
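Wien's displacement law ties those colour-temperature figures to a peak emission wavelength:

```python
# Wien's displacement law: lambda_peak = b / T.
WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_nm(temp_kelvin: float) -> float:
    return WIEN_B / temp_kelvin * 1e9

for label, t in [("human body, ~310 K", 310),
                 ("incandescent filament, ~2700 K", 2700),
                 ("the Sun, ~5800 K", 5800)]:
    print(f"{label}: peak near {peak_wavelength_nm(t):,.0f} nm")
```

The Sun peaks near 500 nm (green), while a body at 310 K peaks around 9,300 nm, deep in the infrared.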
Infrared light is characteristic of peak blackbody emissions of bodies at or near "room temperature". So IR is not heat, but is associated with hot objects. Heat itself is ... thermal energy, which is its own complex thing.
Since all EMR carries energy, all EMR heats objects in which it is absorbed. A sufficiently intense emission above the IR range will also heat an object. However hot objects in near proximity transfer a great deal of thermal energy directly as IR emissions. The illuminating capability of "white" light (typical solar emissions) allows a small amount of EMR energy to provide a high degree of visual information without imparting much heat on the illuminated objects (though it does impart some). Consider that even an (inefficient) 100W incandescent bulb, whilst hot to the touch, does not significantly heat the objects it is illuminating, and a far more efficient equivalent LED lamp, drawing about 15W of electrical power, accomplishes the same illuminating power with vastly less heat. We don't have to toast things to be able to see them, just bounce a small amount of (high-energy) visible light off of them.
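Rough numbers behind the bulb example, treating it as an isotropic point source (an idealization):

```python
import math

def irradiance(power_w: float, distance_m: float) -> float:
    """Power per unit area over a sphere of the given radius, W/m^2."""
    return power_w / (4.0 * math.pi * distance_m ** 2)

# Even at half a metre, a 100 W bulb delivers only a few tens of W/m^2,
# versus ~1,000 W/m^2 for direct sunlight.
for d in (0.5, 1.0, 2.0):
    print(f"100 W bulb at {d} m: {irradiance(100, d):.1f} W/m^2")
```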
Not all light (or more accurately, EMR) is blackbody radiation. There can be chemically-emitted light, directly from chemical reactions. The blue glow of methane (natural gas) combustion is a chemical emission, contrasted with the yellow-white glow of a candle, actually blackbody emissions of suspended soot particles in the smoke plume. (There's a blue region of chemical emission at the base of the flame.)
Chemical emissions are based on specific frequencies of EMR emitted as electrons transition between energy states, and, if I understand correctly, are a quantum phenomenon. Fluorescent and LED lamps also work based on chemical / valence emissions, and operate in far narrower bands than blackbody emissions -- one of the reasons these lamps can appear harsher than incandescents. Similarly various chemical lamps, especially high- or low-pressure sodium vapour, formerly popular as street lighting, which emit in narrow bands (low-pressure especially).
Some forms of ionizing radiation are also EMR emissions, at far higher energy levels, triggered by nuclear transitions rather than electron transitions. Typically gamma rays.
And stimulated emissions such as lasers and masers are ... another phenomenon I understand only poorly, but are also tuned to very tight frequencies. Radio and microwave emitters are somewhat similar.
Sounds like 80% is a theoretical prediction for this method, not an experimental result:
> Naik said adding the emitters to standard solar cells could boost their efficiency from the current peak of about 22%. “By squeezing all the wasted thermal energy into a small spectral region, we can turn it into electricity very efficiently,” he said. “The theoretical prediction is that we can get 80% efficiency.”
Some projects have really big ROI
Research and speculation are awesome stuff, but the hard part isn't discovering new concepts. It is actually turning those concepts into a marketable and affordable good.
The evidence for this is that even though there is essentially a new 'breakthrough' discovery every other week for solar panels, or electric motors, or batteries, we are still using what amounts to cutting-edge tech from the late '90s.
In the modern era this means we generally have to wait till the patents expire and market competition kicks in in order to get the price low enough and the product perfected enough to see widespread usage. If it goes anywhere at all.
It's also worth noting that Florey, the man largely responsible for making penicillin a practical drug, refused to patent his early innovations in order to make it as widespread as possible.
Having made that first breakthrough, world-class teams across the globe fought to bring the efficiency up from "froth on the top of a brew" to "gallons of the stuff".
Florey was a big part of the story but so were teams in US and Europe and then the US military scaled it up beyond belief.
We spent money, targeted money, on the best teams globally and then put serious industrial might to it once they found the answers.
That exact approach is what I am calling for again.
And as for patents: if enough global effort is put in, with enough government funds, the pressure to put the results "in public hands" rather than hold out for patents is really strong.
If this is the case, then wouldn't "buy a bunch of patents and (with great fanfare) make them open-source" be a relatively low-complexity way for a billionaire who feels like making a name for himself to accelerate progress on fighting climate change?
And that broad category is ~25% of costs.
From other comments, I am guessing this result doesn't give anything like that but one should keep in mind greater efficiency certainly could give great payoffs.
Different technologies with different pros/cons could be best for different markets.
But I think it's important to point out that even though people often disparage solar photovoltaic for having low "efficiency," that metric is not an impediment to its broad deployment or great utility to us.
The cost is what makes the decision for deployment, not the efficiency. Perhaps this efficiency makes 2-axis tracking economical enough to justify, and then it gets deployed, but in the end the efficiency wasn't as important as the improved costs.
There are vast chunks of empty land in AZ, NV, NM, TX, CA that could easily fit enough solar panels (each) to power all of the US. The issue is cost of panels and no base load supply.
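A rough sizing sketch for that claim; every figure here is an assumed round number (about 4,000 TWh/yr of US electricity use, about 2,000 kWh/m^2/yr of desert insolation, 20% panel efficiency), not anything from this thread:

```python
US_ELECTRICITY_KWH_PER_YR = 4_000e9  # ~4,000 TWh/yr, assumed round number
INSOLATION_KWH_PER_M2_YR = 2_000.0   # desert Southwest, assumed
PANEL_EFFICIENCY = 0.20              # assumed

yield_per_m2 = INSOLATION_KWH_PER_M2_YR * PANEL_EFFICIENCY  # kWh/m^2/yr
area_km2 = US_ELECTRICITY_KWH_PER_YR / yield_per_m2 / 1e6
side_km = area_km2 ** 0.5

print(f"~{area_km2:,.0f} km^2, i.e. a square about {side_km:.0f} km on a side")
```

That's on the order of 10,000 km^2, a square roughly 100 km on a side, which is indeed tiny relative to those states; cost and storage are the binding constraints, not land.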
Fixed size applications like rooftop would see a multiplier.
It's not actually doing that though. It's changing one set of light-frequencies into another.
This is why you see some companies applying eco friendly tech to some new buildings (Apple) but not in general commercial development.
Again this is just what I’ve surmised from reading threads like this. YMMV
The problem is when an energy source with a high externality (climate-related or otherwise) isn’t priced in. This isn’t a hard problem from the government’s perspective, though: just price in the externality through some free-market mechanism. Carbon credits are a good example of this.
But right now most of this stuff is just in a lab. Frankly, the tall cooling towers you see on some buildings weren’t in widespread use until at least the '50s, even though they were invented at the turn of the century.
> channel mid-infrared radiation (heat energy) into light energy
> absorbs thermal photons and emits light.
Goddamnit, no! None of this is "heat", it's just one range of light being converted into another! This is shitty "science"-journalism parroting a common misconception.
After all, the only reason we associate infrared with "heat" in the first place is that it's useful for detecting things which happen to be at a range of temperatures, temperatures which just happen to be slightly warmer than the operating state of self-reproducing bags-of-mostly-water on a small rocky planet.
I googled optical rectenna.
"An optical rectenna—a device that directly converts free-propagating electromagnetic waves at optical frequencies to direct current" 
Meanwhile silicon and CdTe solar cells continue to decrease in price and increase in efficiency year after year. At this point the current technology is cheap enough to be profitably used to supply power in many regions of the world. Anyone who says we need a breakthrough in solar to make it economic is wrong. At this point the solar panels themselves are less than half the cost of a solar installation.
Nearly all press releases talking about efficiencies over 50% are just talking about impractical theoretical limits that would require cooling the solar cells to freezing temperatures or other schemes (i.e. talking about the Carnot limit instead of the Shockley-Queisser limit).
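The gap between those two limits in numbers (the ~33% Shockley-Queisser figure is a commonly quoted tabulated value for a single junction, not derived here):

```python
# Carnot bound with the Sun as the hot reservoir and the cell near ambient.
T_SUN_K = 5800.0   # effective solar surface temperature
T_CELL_K = 300.0   # cell near ambient

carnot = 1.0 - T_CELL_K / T_SUN_K
print(f"Carnot limit: {carnot:.1%}")               # ~95%
print("Shockley-Queisser single-junction limit: ~33% (tabulated value)")
```

Press releases quoting 50%+ are usually leaning on the first family of limits, which no practical flat-plate cell approaches.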
Blame journalists for hyping this because of the climate change focus.
I am involved in actively looking for better materials and better sources of energy. The reality is that there are no fundamental breakthroughs since oil and nuclear.
No matter what you hear we just haven't had anything that fundamentally changes the game.
So instead of actual breakthroughs in the fundamentals, we get marketing, branding, and communication. But this is not something that can be fixed with that, it's physics, not product innovation
We will most likely get fusion (far out) or fuel cells (even further out) before we get any fundamental breakthroughs here.
Automobile manufacturers have produced small numbers of fuel cell vehicles since the turn of the millennium:
I'm not making claims about the actual merits of fuel cells, mind you. But claiming they're further out than fusion is bizarre. Some people are driving them around today. Nobody is generating net power from fusion today.
I am talking about the kind of fuel cells that wind and solar could use to store energy, and that could be distributed in an economically feasible way.
We are not even close to that. If you know of anything, by all means, please let me know, as the group I am part of would invest in it in a heartbeat.
I don't know of anything like that either. Small fuel cells rely on expensive precious metal catalysts and large ones have too much capacity to be a good match for the distributed generation sector.
A technology that is commercially available but too expensive at present for large scale adoption, like fuel cells, is more mature than a technology that hasn't been demonstrated even in a cost-is-no-object context. That's why it's strange to hear fuel cells described as even further out than fusion power.
Solar and wind are less than 1% of world energy consumption and are not expected to be more than 3-4% in 2040.
Energy is not a product/market problem it's a physics problem and we haven't had any major breakthroughs in that space for a long long time.
Incremental improvement and economic forces do matter. Solar is now cheaper than coal in many climates, and we are seeing the effects.
Of course, there are still many problems to be solved. But we're also working on incrementally on solutions to those. For example, battery cost is falling precipitously. "grid scale battery" was not a thing a decade ago, now you see installations popping up regularly.
I'd like to see cold fusion as much as the next guy, but dismissing incremental progress is foolish.
The state produces by far the most electricity of any state, too. It's almost twice as much as Florida.
Unfortunately it's also a huge petroleum and natural gas producer. About half of the energy used in the state is for industrial use, much of it for the oil and gas industries. On the bright side, natural gas is displacing coal in the meantime while solar and wind are not replacing it fast enough. It needs to get better, but things could be much worse.
Electricity is only ~20% of the entire energy usage as far as I remember.
That doesn't mean it will work at scale for society which is the primary issue.
The key thing to look for is energy density as that will be more likely to give society a bang for the buck.
Distributing solar cells, given the capacity factor of solar, isn't economically feasible for society, nor is it possible as a main source of energy.
Nuclear and hydro, and to some extent thermal: none of them are in vogue.
Look at the capacity factor of wind and solar, add to that the huge areas they need and the fact that they are intermittent and you start to get a glimpse of the problem.
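What capacity factor does to nameplate capacity, using assumed typical capacity factors (not figures from this thread):

```python
HOURS_PER_YEAR = 8760

def annual_energy_gwh(nameplate_gw: float, capacity_factor: float) -> float:
    """Energy actually delivered in a year by a given nameplate capacity."""
    return nameplate_gw * HOURS_PER_YEAR * capacity_factor

for source, cf in [("solar PV", 0.20), ("onshore wind", 0.35), ("nuclear", 0.90)]:
    print(f"1 GW of {source} (CF {cf:.0%}): {annual_energy_gwh(1, cf):,.0f} GWh/yr")
```

A gigawatt of nameplate solar delivers only a fraction of what the same nameplate of baseload capacity delivers over a year, which is the intermittency point being made.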
This is neither economically nor technically feasible. You can't generate as much as you think and you can't do it at a cost that makes it feasible generally for society not even with lowered cost.
Currently, we are talking about 1% of world energy consumption, not expected to be much more than 3-4% in 2040, and that's despite huge investments and all the political goodwill you can ask for. Keep in mind that the numbers you normally see displayed are for electricity, not for energy. Electricity is only a subset of energy.
Furthermore, investments in solar and wind are decreasing, especially when you take China out of the equation.
And again: lab results or theoretically possible advantages most of the time aren't feasible in reality and at scale.
Aren't you still connected to the traditional grid anyway?
A giant leap in solar efficiency doesn't solve the storage problem, and the storage problem doesn't need a magic physics solution, just large scale heads down engineering. My point is: solar is efficient enough now, and a sound economic choice. I make enough power in a year just with my roof for my house and car. I do need a storage solution to do power leveling, which is what I use the grid for now.
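The kind of arithmetic behind a claim like that; every input here is an assumed round number for illustration, not the poster's actual figures:

```python
ROOF_AREA_M2 = 50.0                 # usable panel area, assumed
INSOLATION_KWH_PER_M2_YR = 1_500.0  # temperate-climate average, assumed
EFFICIENCY = 0.20                   # assumed panel efficiency

HOUSE_KWH_PER_YR = 10_000.0  # assumed household consumption
EV_KWH_PER_YR = 3_000.0      # ~15,000 km/yr at ~0.2 kWh/km, assumed

produced = ROOF_AREA_M2 * INSOLATION_KWH_PER_M2_YR * EFFICIENCY
needed = HOUSE_KWH_PER_YR + EV_KWH_PER_YR
print(f"produced ~{produced:,.0f} kWh/yr vs needed ~{needed:,.0f} kWh/yr")
```

With these assumptions a modest roof covers the annual total; the mismatch is in timing, which is exactly the storage/leveling role the grid plays here.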
When you factor in the infrastructure to support you when you don't have power yourself then it suddenly looks less economically sound.
That's the point.
A modest estimate would be 200,000 over the last 30-40 years.
Some of it must relate to energy.
There are plenty of patents on energy just not many fundamental breakthroughs.
Vantablack is based on carbon nanotubes and you can buy it now:
No. One cannot use light as electricity. One can use light to make electricity. This is not just a minor typo; it's flat-out wrong.
More info: http://amasci.com/miscon/elect.html
What is the physical unit of 'electricity'? There's not one, because it's not a thing.