It leads to roughly a twenty percent boost in theoretical max efficiency. If that can be translated to a twenty percent actual improvement, that's huge.
 https://www.utilitydive.com/news/los-angeles-solicits-record... (Los Angeles solicits record solar + storage deal at 1.997/1.3-cents kWh)
That price is under optimal conditions. They built in the desert where there are clear skies, lots of sunlight and cheaper land. Plus whatever kind of tax incentives or other subsidies they get in California that conceal part of the cost and may not be available everywhere.
The real question is how far can the price continue to fall before it hits a floor. And what happens to the other generation methods as a result of the competition. If people build a bunch of solar farms and storage and replace natural gas plants with them, then there is less demand for natural gas, so the price falls and the remaining natural gas plants stay competitive due to the reduced fuel cost.
The "solar takes over everything" narrative is possible, but there is no guarantee of it, and it's probably best not to take it as a hard assumption and make no contingencies for if it doesn't actually work out that way.
And yeah, the massive subsidies they've received by not having to pay for polluting the air, as well as being major contributors to climate change.
Very true, but also kind of the point. Those subsidies still exist -- we still have to do work to get rid of them, and in the meantime replacing fossil fuels with non-carbon sources is at best slower and at worst plateaus entirely.
Moreover, this is the same charge leveled at nuclear. Opponents claim that it's more expensive without exploring why it's more expensive -- and a big part of the reason is because we don't price carbon and then subsidize fossil fuels on top of that. Fix that and all the non-carbon sources become more competitive.
But we can't just sensibly conclude that's what needs to be done and then pack up and go home. We've got to actually make it happen first.
This this this
But it is not going to fall below the cost of production, at least not over the long term.
Is the power producer in this case receiving subsidies not included in this price, or is 1.997¢ the unsubsidized price?
I wonder if the efficiencies will eventually trickle down, or whether the form-factor, other components, retail sales channel and small market for solar phone chargers will prevent this.
· 7Wp of solar panels at this point would cost about US$1.50 if we were talking about the low-cost power modules used in utility-scale deployments. This is, I think, less than the (retail) cost of the case, sales transaction, or power electronics.
· Low-cost power modules are less efficient than higher-cost modules, which means you'd need a larger surface area to get 7 watts.
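As a rough sketch of that area trade-off, assuming ~1000 W/m² standard-test-condition irradiance and illustrative (not measured) efficiency figures:

```python
# Rough panel-area estimate: the area needed to produce a target peak
# power at standard test conditions (~1000 W/m^2 irradiance).
STC_IRRADIANCE = 1000.0  # W/m^2

def panel_area_cm2(target_watts, efficiency):
    """Return the required active panel area in cm^2."""
    area_m2 = target_watts / (STC_IRRADIANCE * efficiency)
    return area_m2 * 1e4  # convert m^2 to cm^2

# A low-cost utility-grade module (~17%) vs a premium one (~22%):
print(round(panel_area_cm2(7, 0.17)))  # ~412 cm^2, about 20 cm x 20 cm
print(round(panel_area_cm2(7, 0.22)))  # ~318 cm^2
```

So cheap modules cost you roughly 30% more surface area for the same 7 W, which matters far more on a camping panel than in a utility field.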
Which chargers are you looking at? Searching on Amazon I find a lot of USB battery packs with built-in solar panels, which I can't help but think are basically fake, since a 16%-efficient panel that size would provide less than a watt. (I've probably been influenced by watching a lot of bigclivedotcom videos where he dismantles phony consumer electronics, including USB battery packs that have a tiny fraction of their claimed capacity.)
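A quick sanity check along those lines (the 7 cm × 12 cm active area here is an assumption about a typical battery-pack face, not a measurement of any product):

```python
# Sanity check on a "solar" USB battery pack: peak electrical output of
# a phone-sized panel in ideal, full direct sun (~1000 W/m^2).
IRRADIANCE = 1000.0  # W/m^2

def peak_watts(area_m2, efficiency):
    """Output at peak irradiance, ignoring heat, angle, and controller losses."""
    return IRRADIANCE * area_m2 * efficiency

w = peak_watts(0.07 * 0.12, 0.16)  # ~1.3 W, and that's the best case
print(round(w, 2))
```

Real-world output (indirect light, panel heating, charge-controller losses) will be a fraction of that peak, so the fast-charging claims on such packs are implausible.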
I'm thinking of no-battery panels, like this camping-gear-priced one: 7W for $150 (AUD) https://goalzero.com.au/shop/solar-panels/nomad-7-plus/
eBay prices are $15-25, much cheaper than last time - maybe it's trickling down already? https://www.ebay.com.au/p/10W%2D5V%2DPortable%2DSolar%2DPowe...
"solar usb charger" may be better search terms: https://www.amazon.com/s?k=solar+usb+charger
Camping gear, like cycling gear, is ironically overpriced for an essentially free activity.
Can you give an example? If one goes to Digikey or Mouser, the price of most standard electronic components decrease when the quantity increases.
I guess the phenomenon you described only occurs when special or proprietary components are involved, so that the company is willing to sell samples at a loss in exchange for more customers.
My guess is that the OP was referring to some non-replaceable proprietary stuff with higher-level of integration, like some kind of modules, devboards or FPGA or something. I don't know.
I just updated the comment to ask for an example.
The Amigans ran into this when they had the idea that their next generation desktop computer should have the PPC chip people thought Apple was going to put in laptops right before it went x86-only - the PWRficient PA6T-1682M. Getting a few of them for prototyping was easy but there was never going to be a long term supply at a reasonable price. Nobody had a plant manufacturing this CPU since 2008, but the Amigans started needing them in 2010 - so they were soon resorting to playing scavenger hunt "Hey, did you have a box of these anywhere? Can you check? Yes we'll pay the original per-unit price".
Internally they were reassuring each other that hey, these devices are inside US military hardware, Apple assured the US DoD that it would be able to supply replacements. But the ability to send the DoD a few spare CPUs out of a crate you've kept in one corner of a warehouse is not the same thing as continued commercial availability.
One really needs to ensure there will be supply and attainable bulk pricing when building a niche device like this. I understand the desire to use some esoteric part like the PA6T, and with the expectation that it was going into a mass market laptop, that it would be available.
Is there an effort to get Amiga running on RISC-V? Custom silicon is also within reach, starting at about $1k. Machines are so damn fast now, you could run the whole thing in emulation, in Wasm inside of Firefox.
There are several businesses (notionally for profit) fighting over this mess, in court, lying to each other and their fans in public, all that jazz. Some of them claim to own the name "Amiga" in various forms.
The device with a PA6T in it was the AmigaOne X1000, which eventually ran AmigaOS 4.1 update 6 or some similar string of nonsense. So that's a 32-bit, uniprocessor PowerPC operating system, kinda-sorta compatible with the Motorola 68000 series Amiga Workbench versions (aka AmigaOS 3.x)
There are also a bunch of Amigans running an OS called "MorphOS" that is also a 32-bit, uniprocessor PowerPC operating system, from a different outfit, mostly on old PPC Apple desktops or laptops.
Then there's a bunch of Amigans running souped-up Commodore/ Motorola era Amigas with patched up "Amiga OS 3.x" versions, sometimes with PowerPC "accelerators". This is a more ordinary nerd type culture, a lot in common with people who own and operate actual steamrollers, or build their own space rockets as a hobby. Most of them are conscious this is all it is, a weird niche hobby.
And then there are groups building new hardware, often with FPGAs but sometimes other weird ingredients. It seems as though this group might build the thing lots of ordinary people want - which is a box you plug into a TV and then play games you remember as a kid. But much more than for Nintendo when they did that, the Amiga problem is mostly not about technology, it's about _Rights Management_. Those games were produced by a wide variety of different companies, many of which no longer exist or are now part of huge console companies that won't let your rival product exist. In some cases the games were licensed, so the license expired and you'd need to get OKs from two or more organisations that are now at exactly cross purposes. Imagine trying to get Sony, Microsoft and Nintendo to all sign off on you selling a product that competes with their consoles. Why would _any_ of them do that, much less all of them? And if you can't secure the rights, your product will be much worse than the illegal products available already.
There's probably dozens of spots around the UK suitable for Dinorwig 10 GWh pumped storage style solutions. NIMBYs permitting.
If the UK fully built out offshore wind it would probably have enough to power the whole of Europe. Same goes for Ireland.
Building the infrastructure to transmit that power would be a massive undertaking though. Ireland and France are currently cooperating to build an undersea power cable which is going to cost $1Bn, take 5 years, run 575km and have an annual transmission capacity of 6.1 TWh.
Annual global energy usage in 2013 was 19,504 TWh, or 3,197 times that.
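The arithmetic behind that ratio, using the figures quoted above:

```python
# Back-of-envelope: how many Ireland-France-scale interconnectors would
# it take to carry 2013 global annual energy usage?
cable_twh_per_year = 6.1    # annual transmission capacity per cable
global_usage_twh = 19_504   # 2013 global annual energy usage

cables_needed = global_usage_twh / cable_twh_per_year
print(round(cables_needed))  # ~3197 cables
print(f"~${round(cables_needed)}Bn at $1Bn per cable")
```

Trillions of dollars in cables alone, before generation or storage, which is why "the UK powers Europe" runs into transmission long before it runs into wind resource.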
Warning: this book is quite old now. While the discussion of theoretical limits should probably still hold up, the economics have been turned upside down since then.
Solar and wind costs dropped faster and further than most people's best case estimates, while nuclear costs and timescales just keep going up.
So they overestimate area needed.
The end product of those efficiencies is still TWh. I don't see how the methodology is flawed.
Do you need to replace the 46 MJ of energy embodied in gasoline with 46 MJ worth of electricity?
No. An ICE converts only about 20% of that 46 MJ into useful work, roughly 9.2 MJ. An electric motor is about 95% efficient, so you only need 9.68 MJ of electrical energy to do the same work.
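The arithmetic behind the 9.68 MJ figure, assuming a ~20% efficient ICE (the efficiency implied by the quoted numbers, and a typical ballpark):

```python
# Energy needed from electricity to match the useful work extracted
# from 46 MJ of gasoline by an internal combustion engine.
gasoline_mj = 46.0        # embodied energy to replace
ice_efficiency = 0.20     # assumed typical ICE efficiency
motor_efficiency = 0.95   # electric drivetrain efficiency

useful_work_mj = gasoline_mj * ice_efficiency          # ~9.2 MJ at the wheels
electricity_needed_mj = useful_work_mj / motor_efficiency
print(round(electricity_needed_mj, 2))  # 9.68
```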
They also suffer from mechanical and thermal losses due to their small size. Small size means higher internal surface area to volume and higher operating speeds. Contributes to higher friction and thermal losses. Friction in an ICE is actually significant.
Similar improvements can be had for heating. You get a little less than 1 J of heat in your room per joule-equivalent of fuel you burn, but with an electric heat pump you get 3-5 J of heat per joule of electric energy.
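A minimal sketch of that comparison, with an assumed 95%-efficient furnace and a mid-range heat-pump COP of 3.5 (both illustrative values):

```python
# Heat delivered per unit of input energy: a furnace tops out just
# under 1 J of heat per J of fuel, while a heat pump moves several
# joules of heat per joule of electricity (its COP).
def heat_delivered(input_joules, cop):
    """cop ~0.95 models a good furnace; 3-5 is typical for heat pumps."""
    return input_joules * cop

furnace = heat_delivered(1.0, 0.95)
heat_pump = heat_delivered(1.0, 3.5)
print(round(heat_pump / furnace, 1))  # the heat pump delivers ~3.7x as much
```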
For practical purposes, an ICE has a far lower efficiency than 40%, and electrics lose some energy to transmission and battery recharging.
And just like actual heat engines, solar cells are improving. The best cells have achieved 47% efficiency in the lab, comparable to a high performance combined cycle natural gas plant (note that thermal power plants often cheat a bit by listing the efficiency in terms of the low heating value of a fuel, ie the heat produced not counting condensing the water vapor out).
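To put numbers on that LHV-vs-HHV bookkeeping point, using approximate methane heating values (rough reference figures, not from the article):

```python
# The same plant looks several points more efficient when its
# efficiency is quoted on the lower-heating-value basis.
HHV_METHANE = 55.5  # MJ/kg, counts the heat of condensing water vapor
LHV_METHANE = 50.0  # MJ/kg, does not

def lhv_to_hhv_efficiency(eff_lhv):
    """Restate an LHV-basis efficiency on the stricter HHV basis."""
    return eff_lhv * LHV_METHANE / HHV_METHANE

# A plant advertised at 60% (LHV) is really about 54% (HHV):
print(round(lhv_to_hhv_efficiency(0.60), 3))  # 0.541
```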
Either way I'm actually rather confident that the whole storage thing is more or less solvable. Since there are a host of technologies that are workable.
So it's slightly pointless to argue, as some do, that a lack of current solutions is a reason not to move forward with renewables. Worst comes to worst, we already have peaking plants that could substitute for baseload power.
> lack of current solutions is a reason not to move forward with renewables
No one is saying we shouldn't move forward with renewables. Well maybe a certain canine news network, but not anyone here.
> Either way I'm actually rather confident that the whole storage thing is more or less solvable. Since there are a host of technologies that are workable.
What people are saying is that the technology isn't good enough yet, which means we need to continue funding research. Not being solved means that just building won't be the solution; we need to spend some serious money on research. That's what people mean by "battery tech isn't there yet". It isn't a defeatist attitude, it is a "continuing and doubling down on research is an important part of the equation" type of attitude. This kind of research typically isn't done by the private sector, but rather by government. The DOE does a lot of this funding, and even funds a lot of the research being done in the private sector. So being vocal about this means we care, and hopefully those in control of allocating that money (probably not Rick Perry) will spend more on fixing that part of the equation while the private sector does the building out.
The parent of this comment is talking about increasing the efficiency (reducing rejected energy) and the gp is talking about just increasing the total amount of energy. Both are solutions. But I wouldn't say it is a good thing we reject more than half the energy we create.
The meaning of "Rejected Energy" in that graphic is obscure but it certainly doesn't relate to energy which has been created and rejected. It may relate to excess capacity of fossil fuel plants (how much more they could produce if ordered to burn more fuel).
The US is thankfully not "rejecting" twice as much energy as it produces, as those figures of 68.5 vs 32.7 would lead you to believe. Curtailment of wind and solar generation in the US has recently been around 2-4%.
What does inherent waste have to do with your reference to "Rejected Energy"? What do you now understand of that graphic? Transmission losses are generally in the range of 5-10% for modern grids. Your interpretation of that ambiguous reference wrongly indicated 66% losses. That's not reasonable and doesn't support your advice on energy storage. I suggest gaining more familiarity with the general subject before judging it.
But yeah you can act all knowledgeable, or you can listen to the group leader of the LLNL staff that put this chart together using the same exact example I used.
I, again, will say that this is only a portion of the rejected energy. But thermodynamic loss (i.e. heating wires, heating your ICE engine, etc) is part of that rejected energy.
So then "rejected energy" (waste heat) cannot be used, or as we are saying here "accepted", by better battery storage, which was the opinion you brought this ambiguous chart and presentation to explain. This chart is irrelevant to battery storage.
Also, most of the energy in the chart is thermo-electric, derived from fossil fuel and nuclear plants. Wind turbines and solar "reject" heat energy in a completely different manner, and to a lesser degree, than thermo-electric plants; it is truly nonsensical to mix the two together under the vague heading of "rejected". It makes so little sense that I do wonder about the credentials or motivation of this LLNL effort to educate on energy issues.
Matching demand to supply is not that hard. If you have oversupply you run pumped hydro, batteries, and power-to-gas plants and store the extra energy. If you have even worse oversupply you disconnect your solar panels, or your wind turbines. If you have undersupply you use the energy you stored previously.
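A toy sketch of that dispatch rule (the units, storage capacity, and supply/demand series are all made up for illustration):

```python
# Greedy dispatch: store surplus, discharge on deficit, and curtail
# only when storage is already full.
def dispatch(supply, demand, capacity):
    """Return (final stored energy, total curtailed, total unmet demand)."""
    stored, curtailed, unmet = 0.0, 0.0, 0.0
    for s, d in zip(supply, demand):
        surplus = s - d
        if surplus >= 0:
            charge = min(surplus, capacity - stored)
            stored += charge
            curtailed += surplus - charge   # storage full: disconnect panels
        else:
            discharge = min(-surplus, stored)
            stored -= discharge
            unmet += -surplus - discharge   # shortfall even after storage
    return stored, curtailed, unmet

# Two sunny periods then a deficit, with 3 units of storage:
print(dispatch([5, 5, 1], [3, 3, 3], capacity=3))  # (1.0, 1.0, 0.0)
```

The hard engineering question is sizing `capacity` for multi-day lulls, not the dispatch logic itself.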
The problem is 100% solvable, but that doesn't mean it is currently solved or what's left is trivial. We're almost there, but it is hard to finish fixing things when you slap a sticker on it saying "done" and stop addressing the remaining issues.
could be totally off base, or very much a "not yet industrialised" process... but the idea is super cool nonetheless
 the percentage sounds high enough that I assume it includes all gardens and roads not just the buildings themselves
It shows current UK grid demand and generation sources as well as historical data.
Take a look at this chart. As you can see, renewables produce barely any primary energy at all, with the exception of biomass, because they don't need to. If we really did increase our renewable deployment 8-fold, then Germany would be at 400% renewables at current consumption rates, which clearly isn't necessary even if every single car is electric and all industry switches away from fossil fuels. There will still be some processes that depend on fossil fuels as input material, but those do not burn the fuels to obtain their energy. Oil products like plastic will most likely be dumped in the oce- I mean, in landfills.
This is analogous to an article about a chemist discovering a way to get more chemical efficiency out of gasoline and someone saying that we can already do that with a richer air-fuel mix using a turbocharger. The two are not comparable.
Why? Because just yesterday there was an article on the front page touting how Toyota was testing putting improved solar panels on the roofs of their vehicles, and that article quoted an efficiency of 34%, well above the "max theoretical limit" given in the present article: https://news.ycombinator.com/item?id=20364348.
So I guess my question is: how would the proposed manufacturing process in this article be an improvement? Cheaper manufacturing costs? Even higher efficiency if you had "multi-junction cells" made using the process discussed?
> well above the "max theoretical limit" given in the present article
Ok, so how would you improve these articles? Do you want the news articles to draw upon a distinction between 'maximum theoretical limit of the substrate' (or whatever it's called), and the 'maximum theoretical limit of the resulting panels'?
Like, I understand and support your point here, but I'm wondering at what point do you accept that nothing is truly fully-accessible to a layman, that you need to have at least some surface knowledge to understand anything, and that it is likely impossible to eliminate this form of confusion?
Just look at that opening sentence, it's really bad if the aim is to inform readers. It's great if it's intended to be self-aggrandizing marketing without consideration of how deceptive it can be to readers.
Any expert can trick uninformed readers into drawing misleading conclusions. If you want to be trusted as an expert, it's necessary to avoid doing that.
MIT is a pretty renowned brand in its own right - I think it's fair to demand better. This doesn't come across as trustworthy to me, anyhow.
> Other approaches to improving the efficiency of solar cells tend to involve adding another kind of cell, such as a perovskite layer, over the silicon. Baldo says “they’re building one cell on top of another. Fundamentally, we’re making one cell — we’re kind of turbocharging the silicon cell. We’re adding more current into the silicon, as opposed to making two cells.”
The opening sentence is accurate -- "conventional" silicon solar cells are exactly what this work is improving.
Obviously this PR blurb was intended for a broad audience; that is why universities write and publish blurbs like this. And it's great that they try to make research a little more approachable!
But since it is intended for a broad audience, this really is misleading. Probably not maliciously so; the job of a PR department is to make things sound great and impactful, after all, and they did that. How much do you want to bet that most actual researchers would never have used words like this to describe their breakthroughs to non-experts they know and respect, at least not without really emphasizing what a "cell" is and is not?
I need to ask my brother how this new method compares to his work on high efficiency heterojunction solar cells. While this is all really interesting physics, it quickly races past my electrical-engineering-level of understanding.
(the apparatus he made to build these heterojunctions one-layer of atoms at a time (molecular beam epitaxy) was crazy; you're in the sci-fi future when your project needs a high-purity sapphire substrate and vacuum with a mean free path >10km)
 (Lang) https://aip.scitation.org/doi/abs/10.1063/1.3575563
One photon is at one wavelength. So multiple wavelengths implies multiple photons.
This breakthrough is specifically the ability to do more with a single photon.
That's a huge leap.
In multi-layer setups, you tune multiple separate bands, one for each layer. Electrons in different layers capture different amounts of energy.
This approach also lets you have multiple bands, but rather than separate layers, you're using numbers of electrons to determine the thresholds.
PS: Also the efficiency bands are based on sunlight, a different light source changes the maximum efficiency.
The actual antenna structures and reaction centers are reasonably efficient, at least over certain wavelengths. Even through electron transport and the whole photosystem, things still aren't too bad.
But overall efficiency tanks because of Rubisco. Even with C4 and CA metabolism, oxygenation is a major limitation.
I seem to remember something about matching up photon energies so that extra energy isn't wasted, and multiple frequencies can contribute usefully.
Edit: here's a link to some info about his work:
It is relevant once photovoltaic energy becomes the main power source for the humans, because the materials used in current photovoltaic cells are dramatically more abundant than hafnium.
Generally I wonder about sustainability of rare earths (and other elements with limited supply like helium). We build a lot of products that don't last very long - some of it gets recycled, a lot of it doesn't. Wonder what humans 200 or 2000 years later will think of our consumption
Well, it varies by element.
While some of the phosphorus makes up human bones, most of it is dissolved in the ocean as phosphate, enormously less concentrated than the original deposits of apatite and other minerals it was mined from. At some point the humans might run out of easily accessible apatite deposits, since phosphorus only makes up 0.1% of the Earth's crust. If we consider only the top 2 km of the continental crust (29% of the surface) to be "easily accessible", that's only 3.0 × 10⁸ km³ of rock, or about 7.1 × 10¹⁷ tonnes, of which about 7.1 × 10¹⁴ tonnes is phosphorus. Since phosphorus is currently being mined at 153 million tonnes per year, the humans are currently on track to deplete most of it in only 4.6 million years, considerably earlier than the Earth is on track to be destroyed by the sun.
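Reproducing that estimate with the comment's own inputs (the 7.1 × 10¹⁷ tonne figure implies an average crust density of roughly 2.4 t/m³); the quotient comes out to about 4.6 million years:

```python
# Depletion timescale for "easily accessible" crustal phosphorus.
rock_tonnes = 7.1e17            # top 2 km of continental crust
phosphorus_fraction = 0.001     # 0.1% of the crust by mass
mining_rate_t_per_year = 153e6  # 153 million tonnes per year

phosphorus_tonnes = rock_tonnes * phosphorus_fraction  # ~7.1e14 t
years = phosphorus_tonnes / mining_rate_t_per_year
print(f"{years:.3g} years")     # ~4.64e+06 years
```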
As for the carbon, it's mined for its chemical potential energy, which is lost when it is burned; but the carbon remains on Earth. 870 billion tonnes of it is in the atmosphere, of which about a third is due to the humans setting rocks on fire; a comparable amount has been added to terrestrial biomass and carbon in the oceans. The part in the atmosphere is relatively easy to recover; existing processes have comparable cost to the cost of digging up rocks. This is not being done at scale because it is "unprofitable".
More than half of the aluminum that has been mined on Earth is still in use. The other half is mostly in landfills, where it is less concentrated than the bauxite ores that are commercially mined, but less oxidized.
Most of the iron that has been mined is in landfills, but not only ores but also slag heaps near foundries are a more concentrated form of iron.
Rarer metals like gold, hafnium, zirconium, and erbium are mostly in landfills. They mostly do not readily form soluble ions to leach into the ocean or groundwater, and they mostly only find their way into the air when incinerated. Their concentrations in landfills are dramatically higher than the concentrations in the ores they are mined from, and this is becoming a "profitable" way for the humans to spend their lives before being killed by the hazards of the landfills. Presumably automation will speed up this process.
In most natural systems there is no waste - everything gets recycled. We create a mess digging for elements around the planet, consuming the "value" of the things we make from those elements (and large amounts of energy) in a way which lasts a very short time, and disposing of the waste, which then mostly gets spread around the planet in an even greater mess. I find it difficult to see how automation will easily solve our waste problem - it seems many orders of magnitude easier to try to solve it before human waste gets to a giant garbled heap or gets spread around the atmosphere/oceans/groundwater/(soon low earth orbit), where it's difficult to retrieve and causes huge problems to the natural world (and ultimately to us) by being in places it shouldn't be.
Why are humans so good/bad at creating a mess? Are we equipped with skills/incentives to solve this conundrum, or will we continue to drown in our mess?
It's true that recovering materials from the atmosphere and oceans is more difficult than mining them. But most of the materials we're discussing don't end up in the atmosphere and oceans; they end up in landfills. And recovering them from landfills is, generally, much easier than mining them from natural deposits.
Landfills are much more concentrated and heterogeneous deposits of "valuable" elements than the ores from which they are mined; for example, a single accidentally discarded catalytic converter may contain 5 grams of platinum-group metals, which are reduced and relatively easy to recover; commonly mined natural ores of platinum-group metals (the Merensky, chromitite, and contact types of deposits) typically contain 5 grams per tonne of ore, mostly oxidized and thus requiring not only froth flotation but also smelting. And catalytic converter platinum-group elements are far from the only "valuable" elements in landfills.
The trouble is that when the humans go digging around in landfills with their bare hands, it reduces their life expectancy a lot, because of poisons, injuries, and poor safety practices around refining. This is not a new problem for mining — the Spanish silver mines were a death sentence for the enslaved indigenous Americans sentenced to work in them — but the particular measures needed to solve the problem are different for landfills than for other mining. Automation will largely solve them. I suspect that the same ritual-pollution taboos that cause the humans to deprecate garbage collectors, cannibals, and undertakers are a major factor in the slow development of landfill mining.
A Kessler syndrome in LEO will be only a short-term problem, as the lifetime of objects there is limited to decades (not even a single million years!) by atmospheric drag, and the quantity of mass there is limited by launch "costs". A MEO Kessler syndrome would be a more serious problem.
Doing things creates messes. Lowering entropy locally — one definition of life — raises entropy globally. Doing more things will create more messes. Doing things intelligently can create safely contained messes.
I knew there was a high likelihood that I would be downvoted to oblivion for making a terrible Dad joke, but it was worth it. I feel that a few years back I could maybe have cracked that joke without my karma being bludgeoned.
Didn't know that, thanks, you'd have thought I'd have worked it out by now, but then it took me far too long to work out what a 'fawcet' was.
Re karma. You will live a much happier life not worrying about some random on the internet downvoting you.
The band gap of Si in solar cells is about 1.1 eV. To excite three electrons, a photon thus needs 3.3 eV, or a wavelength of about 376 nm.
The visible spectrum starts around 380 nm or 390 nm, depending on which source you believe. So the triple-excitation photon would be very near UV.
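The conversion behind those numbers, using the handy E(eV) ≈ 1240/λ(nm) rule of thumb:

```python
# Photon energy to wavelength: E(eV) = h*c / lambda, with
# h*c ~= 1239.84 eV*nm.
HC_EV_NM = 1239.84

def wavelength_nm(energy_ev):
    """Wavelength of a photon with the given energy in eV."""
    return HC_EV_NM / energy_ev

band_gap_ev = 1.1  # silicon
print(round(wavelength_nm(3 * band_gap_ev)))  # ~376 nm, just below visible
```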
According to this spectrum, there's a sharp drop in solar power around 370 nm.
So with the 1.1 eV band gap, exciting three electrons doesn't seem to be worth it. But if you could reduce the gap just a bit (by doping, most likely), it might be worth considering.
Of course, reducing the band gap would likely lead to much worse performance overall, so one would need a multi layer approach, which sounds pretty complicated.
And then you need to pray that your glass cover doesn't absorb the UV light, which seems to be a pretty close call.
A photomultiplier has a pre-charged state where a photon triggers an avalanche -- not really relevant for photovoltaic applications.
The question was more whether you can lift three electrons through the band gap, triggered by a single photon, which would increase the solar cell's output current without dropping the voltage.
Sometimes there are decades between the "breakthrough" and the use of it. Most of the time, the final usage doesn't have much to do with the initial breakthrough.