If we're using clean power to power extraction of CO2, wouldn't it just be better to connect that clean power directly to the grid and thus reduce the required production of CO2 at other power plants? Otherwise we'll be having massive efficiency losses.
Is that really the case though? I mean the 'catastrophic' part, not the 'climate change' part, as that has been clear from the get-go: climate change is happening and has been happening for as long as the planet has had an atmosphere. We're in an interglacial, past the high point (the 'interglacial optimum') on the curve and on the way down to a new glacial in a few thousand years. The interglacial optimum is the period within an interglacial with the most 'favourable' climate, often occurring around its middle. For the present interglacial, the Holocene (which started about 11,700 years ago), the climatic optimum occurred during the Atlanticum (~7000 BCE - 3000 BCE) and the Sub-boreal (~3000 BCE - 500 BCE) [1]. Our current climatic phase, following this optimum, is still within the same interglacial: that warm period was followed by a gradual decline until about 2000 years ago, then another warm period until the Little Ice Age (1250–1850). The current interglacial is actually relatively cool compared to the previous known ones; the maximum Antarctic temperature during the previous interglacial was about 6°C higher than that registered during the current one [2].
Can you point us at some non-politicised sources which can refute these findings, which prove that anthropogenic climate change is such a strong force that it counteracts the natural glacial cycle?
For those who wonder why I ask this question, please note that this is a serious request. The debate around climate change has been so polluted by politics that it is nigh impossible to find objective sources. Those objective sources which I know of fail to show the climate catastrophe which we're all being warned about.
Thanks for the lecture on historical climate. Those graphs you have linked as objective sources aren't particularly useful nor do they illustrate the full impact of anthropogenic CO2 emissions.
One of the biggest problems with clean power, i.e. solar and wind, is storage. Using excess power for carbon capture during low demand could be the way to go. If I recall correctly, feeding some types of algae to cattle decreased methane production. So instead of selling excess power for zero or negative prices, solar power companies could produce algae to sell as cattle feed, which also decreases methane production and decreases the water required to raise cattle and its feed, as clean water is also something people are running out of.
But the carbon isn't released into the atmosphere when trees rot. Dead trees turn into soil. Most of a dead tree's carbon will be bound to the ground as soil.
There are two carbon cycles, the biological one (fast) and the geological one (slow). In the biological cycle carbon enters food chains through autotrophs (self-feeders, nearly all of them photosynthesising organisms like algae and plants) which capture carbon dioxide from air or bicarbonate ions from water and use them to make organic compounds such as glucose. These in turn get eaten by heterotrophs (other-feeders) such as animals who consume the organic molecules and break them down through cellular respiration where carbon in the molecules is released as carbon dioxide. Decomposers (fungi and bacteria) also release organic compounds and carbon dioxide when they break down dead organisms and waste products. Rinse, repeat.
I am not sure that is true. There has been some change in opinion on whether old growth continues to be a carbon sink. I don't have a good ref handy, but found this:
"Until recently it was believed only young forests sequestered atmospheric carbon in early growth and that old-growth forests were only sinks in which the carbon was stored. Recent studies, however, have identified that intact old-growth forests continue to take up carbon from the atmosphere even past the point at which they reach maturity. By measuring growth rates, researchers have identified that carbon sequestration in trees increases continuously because the overall leaf area increases as they grow, enabling bigger trees to absorb more carbon from the atmosphere. Older, larger deciduous trees reproduce more new leaves, thus capturing the most carbon from the atmosphere."
Interesting, how much CO2 is released producing this?
And there is no mention of what the electricity supply should be?
How do both of these factors compare to just planting an acre of trees?
> Advanced low cost molten salt thorium reactors will provide cheap electricity
...that still don't exist. Those that we know how to make still have such serious problems that they aren't being built for anything but research:
"Basically, MSRs are underdeveloped and require a lot more research (especially in corrosion) before they can surely take off as the world’s fleet of power plants."
"The problem with MSRs, then, is that the fuel is already completely cut open and melted. You’re halfway to a bomb already"
Given the same state of availability, one could just as well say "advanced fusion reactors will provide cheap electricity."
EDIT: in reply to the argument below (as I can't answer there, the moderators like it that way):
> Thorium is ... also closer than wind and solar.
How? Both wind and solar are already available and usable today by every individual who has the property to install them, anywhere in the world. Who can buy and install their own thorium reactor on their property, and where?
Solar is not cheaper, as it requires backup energy sources such as coal, oil, nuclear, etc.
You can't just isolate the cost of solar or wind, as they are only viable because they sit on top of a fossil or nuclear infrastructure.
Solar and wind always depend on backup energy; those backup sources do not depend on wind and solar.
Wind and solar haven't factored the cost of decommissioning and maintenance into their prices, which is going to be much higher given the huge areas they have to be spread out over.
There is plenty of speculation about how to solve the intermittency problem of wind and solar, but we are not even close to a solution.
Solar across large areas plus batteries does not have intermittency issues. Like all power sources you need to build out for the worst case, as power output can drop. But that’s true of Nuclear as well because power plants need refueling etc.
So, let’s hypothetically say you need 2x redundancy with solar generation to hit 100% of grid demand 99.99% of the time. 2x 2c + 1.3c only bumps you up to 5.3c/kWh. And note this is with vast overkill, wasting 50% of electricity generation. Except that’s not a comparison to coal or nuclear; that’s a comparison with the grid including peaking power plants.
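That arithmetic can be checked with a quick sketch; the 2c/kWh solar and 1.3c/kWh storage figures are this comment's assumptions, not authoritative prices:

```python
# Back-of-the-envelope blended cost for a 2x-overbuilt solar grid.
# All prices are the comment's assumed figures, not real market data.
solar_cost_c_kwh = 2.0    # assumed cost of solar generation, cents/kWh
storage_cost_c_kwh = 1.3  # assumed cost of battery storage, cents/kWh
overbuild = 2             # build 2x capacity to cover low-output periods

# Every delivered kWh pays for 2 kWh of generation plus storage.
blended_c_kwh = overbuild * solar_cost_c_kwh + storage_cost_c_kwh
print(f"{blended_c_kwh:.1f} c/kWh")  # 5.3 c/kWh, with ~50% of generation wasted
```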
Run the economics of 100% Nuclear and you need both excess capacity seasonally and vast amounts of grid storage daily. Thus vastly increasing the already high prices.
PS: Remember if the sun goes out we all die anyway, it’s at worst local clouds we care about. Further, demand for AC is lower on cloudy days.
Solar across ANY area has intermittency issues. That's not true for nuclear at all. The capacity factor of solar is 20-30%; the capacity factor of nuclear is almost 100%. They are not even in close competition.
Wind and solar are currently less than 1% of the world's energy and not expected to be more than 3-4% in 2040.
There is no realistic version I know of (and believe me, I have looked) where solar or wind become the base of society's energy needs.
No one is talking about 100% nuclear. But currently nuclear already covers much more of our energy needs than wind and solar.
Nuclear’s capacity factor is a problem because you can’t save money by turning plants off. If a nuclear power plant can’t get 90+% utilization, it's a non-starter; it fails because it’s too expensive. On the other hand, you can adjust the time of day a solar power plant delivers power by siting it further east or west, and that 2c/kWh is after taking capacity factor into account.
Nuclear’s problem actually gets worse when operating alongside solar and wind which are dramatically cheaper and drive the spot price close to 0 regularly. Meaning Nuclear is unneeded for long periods. Thus the long term trend is ~0% Nuclear power due to economic issues.
As to photovoltaic capacity factor, nobody expects it to run at night that’s what batteries are for. The only meaningful capacity factor is the delta between average and minimum daily output vs seasonal demand.
PS: Solar thermal on the other hand is simply a boondoggle waste of money.
This is simply just factually wrong. Please show me one place that proves that high capacity factor is an issue. Just one.
I am not sure where you are getting your information but it's simply factually wrong.
Batteries can't solve this issue, as they don't solve seasonality. And the batteries needed would be extremely huge; the cost alone would make it impossible, as peak demand only happens once in a while. And then you are back to backups.
It’s simple math: a nuclear power plant that can produce power at a 90% capacity factor for 10c/kWh can be run at a 45% capacity factor, but then at ~20c/kWh. In effect, that high capacity factor is also a requirement for maintaining cost competitiveness.
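The relationship assumed here is that a nuclear plant's costs are almost entirely fixed, so price per kWh scales inversely with utilization. A minimal sketch using the comment's illustrative 10c/kWh-at-90%-CF figure (not real plant data):

```python
# Fixed-cost-dominated pricing: halve the capacity factor, double the price.
# The 10c/kWh at 90% CF reference is this comment's illustration only.
def cost_per_kwh(ref_cost: float, ref_cf: float, actual_cf: float) -> float:
    """Scale a mostly-fixed-cost price from one capacity factor to another."""
    return ref_cost * ref_cf / actual_cf

print(cost_per_kwh(10.0, 0.90, 0.90))  # 10.0 c/kWh at full utilization
print(cost_per_kwh(10.0, 0.90, 0.45))  # 20.0 c/kWh at half utilization
```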
Granted, you can operate nuclear when the spot price is below your true cost to produce assuming it stays above your marginal price, but you now need to make up the difference.
As to seasonality, demand is also seasonal, which again is bad for nuclear. Further, when sourcing wind you can locate it based on seasonal considerations. The high correlation between solar production and AC usage ends up being a useful feature not a bug.
France: even with massive exports and imports, it still needed to curtail nuclear production on the weekends, which raised prices. Most countries are wise enough not to install significant nuclear for this very reason.
France gets 75% of its energy from nuclear, so no, they don't have a problem. A low capacity factor is a problem, not a high capacity factor. Show me where it's recognized by a credible source that this is a problem.
Anyway, from the IAEA, which is very pro-nuclear: “Some Member States, such as France and Germany, have already needed to design or convert the majority of their nuclear power plants for flexible operation. Plants in those Member States have been designed or modified and have operated flexibly, and many reactor-years of experience and knowledge of flexible operation have been collected.” Page 18 adds a host of reasons why nuclear is bad at load following. Page 25 shows just how low weekend loads get vs peak demand. Fig 13 on page 33 shows “Nuclear Load following”, and that’s a direct quote.
PS: Of note France also exports significantly more than it imports in part because they are so bad at load following. But even with that they still need curtailment of Nuclear power which is made worse by the large shifts in seasonal electricity demand.
If you want some actual number for France: In 2016 France had 63.1 GWe of nuclear capacity and they generated 403,000 GWh. 403,000 / (63.1 * 24hours * 366 leap year) = 72.7% useful capacity factor. Even with massive exports, France’s low capacity factor increased price per GWh by well over 20% vs the ~90% utilization Nuclear is generally designed for.
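The capacity-factor arithmetic can be reproduced directly (the 2016 figures are the ones quoted in this comment):

```python
# France 2016 nuclear figures as quoted in the comment above.
capacity_gwe = 63.1      # installed nuclear capacity, GWe
generated_gwh = 403_000  # annual nuclear generation, GWh
hours = 24 * 366         # 2016 was a leap year

capacity_factor = generated_gwh / (capacity_gwe * hours)
print(f"{capacity_factor:.1%}")  # ~72.7%
```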
This directly says that it's not a problem. Wind and solar don't give you this choice but have to use coal, oil, gas and nuclear as backup. The reverse is not true.
As I said several times it’s the dependence on high capacity factors that’s the problem.
Wasting significantly more than 20% of all money spent on nuclear in France is a problem. This is a country that can export almost 1/3 of production, and they still end up paying a premium.
Run the numbers for nuclear without exports: France would be paying for ~500+ GWh of nuclear generating capacity but could only use ~300 GWh per year. In other words, to reach 70% of electrical need from nuclear alone, the price goes up by ~67%.
Thus large scale nuclear needs grid storage to be cost competitive just as much if not more than solar, except it also costs several times as much. This is why it’s a dead end.
So again you are simply factually incorrect and are just making up claims right now.
A high capacity factor (nuclear) means you can trust the stability, and in this case the scalability, of the source.
A low capacity factor (wind & solar) means you can't trust the stability of the source and it doesn't scale. Instead it needs backup from oil, coal, gas or nuclear.
No energy source is without its problems, but you seem to have missed the point of this discussion, which is: what is the best source for a stable energy supply? To claim that the high capacity factor of nuclear is a problem is simply flat out wrong, and you won't find anyone to back that claim up.
Why is it necessary to replace all fossil fuels? You can keep gas turbine power plants around.
I support nuclear power, but it's very, very expensive, even in meteorologically inactive places such as the UK. Maybe India and China can make it work, but in the West you can forget it. Hinkley Point C is a failure and it hasn't even been constructed yet.
> Why is it necessary to replace all fossil fuels?
"Just" in order to avoid the uncontrolled warming up of the Earth's atmosphere. As long as we use them, there will be more CO2 in the atmosphere, more warming and the destruction of the environment in which the humanity grew up (the climate was extraordinary stable since the civilizations appeared... until the last 100 years). We still don't exactly know how fast that difference can hit us, and how big issue for us it can get (i.e. the worst case) but what we know already is enough to know that the resolute action is needed even if the worst case doesn't happen, "business as usual" will be a serious, long-time problem:
"The human imperative of stabilizing global climate change at 1.5°C"
"increasing GMST to 3°C above the pre-industrial period substantially increases the risk of tipping points such as permafrost collapse, Arctic sea ice habitat loss, major reductions in crop production in Africa as well as globally, and persistent heat stress that is driving sharp increases in human morbidity and mortality"
"Our preliminary estimates suggest that the benefits of avoided damage by the year 2200 may greatly exceed energy sector investment costs to 2050. Current national voluntary emission reduction pledges for 2030 are insufficient to drive this even if followed by “very challenging increases in the scale and ambition of mitigation after 2030”"
"The majority of pathways for achieving 1.5°C also require carbon dioxide removal from the atmosphere. Delays in bringing CO2 emissions to net zero over the next 20 to 30 years will also increase the likelihood of pathways that exceed 1.5°C (i.e., overshoot scenarios) and hence a greater reliance on net negative emissions after mid-century if GMST is to return to 1.5°C"
Simply put, it's safer not to let the CO2 into the atmosphere at all than to try to remove it later and fail. But eventually we will have to remove more than we emit. The amounts in question are immense, and removal costs a lot of energy too. So we do need all our energy to be produced without fossil fuels: wherever some fossil fuels are burned, we'll need even more energy to remove that CO2.
There is no basis for the claim that it's safer not to let CO2 into the atmosphere. More people die from cold than from heat, and the world has gotten 14% greener because of more CO2.
So if anything it's inconclusive, and I don't see any scientifically demonstrated consequence of climate change that we don't know how to deal with.
> world has gotten 14% greener because of more CO2.
"But that isn’t cause to celebrate. It’s a bit like hearing that your chemotherapy is slowing the growth of your tumor by 25 percent.
Despite global greening, carbon dioxide levels have climbed over the past two centuries to levels not seen on Earth for millions of years. And the carbon dioxide we’ve injected into the atmosphere is already having major impacts across the planet."
The Sahara desert is greening. How is that a problem? Furthermore, the Sahara used to be a jungle, and that disappeared long before we ever started emitting CO2.
The climate is always changing. I am celebrating every day that fossil fuels have made life better and safer for us in the West. What I fear is that well-meaning but fundamentally ignorant climate catastrophism will make it harder for the world's poorest to get access to the very energy they need to fight any change in climate.
And the amount of greening that is possible is also part of the models: we know it can't save us from the increase in CO2, which is being pushed into the atmosphere orders of magnitude faster. Even if in some areas some plants will appear, the climatic changes across the whole world will be seriously problematic for most of the human population and most of the rest of nature.
I know what a red herring is; this isn't one. It's a simple fact that goes counter to the claim about catastrophic climate change.
You have no proof for your claim; it's a non-testable hypothesis, and the only thing you are relying on is computer climate models, notorious for curve fitting and for underappreciating negative feedback loops, which has resulted in less than stellar predictability.
That’s not technically true. While unrelated, humans have been burning coal for at least 5,500 years, whereas the current Sahara desert is only around 4,500-8,000 years old. Considering how long we have been using fire, coal use could actually vastly predate this event, though again the impact of such usage would have been minimal.
Again, several vastly more likely causes exist, from livestock grazing to purely natural climate shifts.
The underlying causes of this drying and desertification have previously been attributed to subtle changes in the Earth’s orbit, which in turn influenced atmospheric weather patterns and led to a reduction of the amount of rainfall in northern Africa. But Wright, whose scientific research has led him to explore Neolithic-age archaeological sites all over the world, suggests that this is not the full picture. “In East Asia there are long established theories of how Neolithic populations changed the landscape so profoundly that monsoons stopped penetrating so far inland”, explains Wright, also noting in his paper that evidence of human-driven ecological and climatic change has been documented in Europe, North America and New Zealand. Wright believed that similar scenarios could also apply to the Sahara.
Nuclear power is expensive in no small part because of the bureaucracy and safety measures that are required, but also because of the political headwinds.
Nuclear is expensive in the short run but much better in the long run. Its capacity factor is almost 100%; compare that to wind and solar, which is between 20-40% depending on season etc.
Furthermore, nuclear is expensive because we compare it with the non-true cost of e.g. wind and solar, which both need backup energy to be useful (not calculated in) and decommissioning (not calculated in).
It's completely backwards that we are investing in making wind and solar cheaper rather than nuclear. Wind and solar are linear solutions to the exponential problem of energy. They don't point to the future but to the past. They are great for some things but never as a foundation for a modern society's energy needs.
I don't think it's necessary to replace fossil fuels as such but many people seem to think so and they have the problem I tried to outline in my previous post.
These kinds of ideas just don't seem to make sense given the scale of the problem. It's as if some people imagine you could set up a few plants for each power station and we'd be all square. When in fact: https://blogs.scientificamerican.com/life-unbounded/the-craz...
Exactly. I think we need to find a biologic solution. Maybe genetically modified trees / plants that spread quickly and capture carbon efficiently, terraform sahara, that kind of thing.
One of these machines could sequester about 10 tons of CO2 per annum (1). According to the linked article we have roughly 25 billion tons of CO2 emissions yearly. If we take 5,000 cities in the world (2) and each had just one machine, they would capture about 50,000 tons per year, a factor of 500,000 short of annual emissions. If each city had a single "farm" of 20 of these (they are 3x3x7 m each, so roughly 1,300 m^3 per farm), you'd still be a factor of 25,000 short.
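A quick sanity check of the scale, taking the figures above (10 t per machine per year, 25 billion t of annual emissions, 5,000 cities) as rough assumptions:

```python
# Scale check for direct-air-capture machines; all inputs are the
# comment's rough assumptions, not measured data.
emissions_t = 25e9   # global CO2 emissions, tons/year
per_machine_t = 10   # tons of CO2 sequestered per machine per year
cities = 5_000

one_each = cities * per_machine_t  # 50,000 t/year captured
farm_of_20 = one_each * 20         # 1,000,000 t/year captured

print(emissions_t / one_each)    # capture falls short by a factor of 500,000
print(emissions_t / farm_of_20)  # still short by a factor of 25,000
```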
None. You'd reduce emissions only if it were not replaced by anything else and if you don't look at the impact of what was being transported, but you'd not sequester any carbon.
Eventually excess algae would have to be removed. If you then just let them rot or convert them into food which is then eaten (both being the slow biological equivalent to burning), then the captured CO2 returns to the atmosphere.
I decided I will not criticize any attempt at carbon sequestration, because whatever the initiatives are, doing something is better than doing nothing, which is what we are currently busying ourselves with.