After announcing the project back in September, we have now learned that Tesla and Southern California Edison (SCE) have completed the massive 80 MWh energy storage station using Tesla’s new Powerpack 2 at the Mira Loma substation.
...
While Tesla and SCE haven’t officially launched the new substation yet, sources familiar with the new Powerpack installation told Electrek that it was completed a few weeks back – late December – and brought online so that the electric utility can start using it to manage peak demand.
Deploying this project in only 3 months is really impressive.
I think there's a pattern here: energy systems built from repeating small modules have systematically lower risks of cost/schedule escalation.
See for example "Construction Cost Overruns and Electricity Infrastructure: An Unavoidable Risk?"
https://www.researchgate.net/publication/262304928_Construct...
Look at Figures 2 and 3 in particular. The mean overrun by project type goes solar < wind < thermal (fossil) < nuclear < hydroelectric. That's basically the same ordering as "minimum viable project size." Coincidence? I don't think so. A small solar project can be a handful of panels. If you want to scale up to hundreds of megawatts, just tile a much larger area using the same panels and basic repeating structures. A lot of work can execute in parallel and sections can start generating output/revenue rapidly as they're completed, sometimes years before the project's last panel is installed. No such luck with a nuclear or hydro project: you can be on year 7 of an 8 year schedule and it's still generating zero watts because so little can happen in parallel. (This is also why I consider small modular reactors the last, best hope for a nuclear renaissance, but the latency for nuclear R&D is so high that renewables might "eat the world" first.)
I rather expect that this Tesla installation, despite its impressive speed, was not particularly cheap. But if battery costs keep declining, in 5 years it could still be lightning quick to build another installation like this and cheaper than gas/diesel peakers. If that happens then storage-backed renewables become the very cheapest electricity source across large swaths of the Earth, threatening huge chunks of coal and natural gas demand. There may be a parallel risk to oil demand if Tesla and other auto makers execute well on their EV plans currently in the pipeline. I think Tesla faces some significant risks from the SolarCity acquisition, and execution risks on the Model 3, but I sure can't fault the ambition.
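To make the modularity/parallelism point concrete, here is a toy sketch with made-up numbers (purely illustrative, not data from the paper): a modular build that brings sections online as they finish versus a monolithic build of the same size that delivers nothing until the end.

    # Toy example (assumed numbers): 100 MW built over 8 years either way.
    years = range(1, 9)
    modular_online = [12.5 * y for y in years]   # MW online at the end of each year
    monolithic_online = [0] * 7 + [100]          # nothing until year 8

    # Crude proxy for energy/revenue earned during construction, in MW-years
    print(sum(modular_online), sum(monolithic_online))   # 450.0 vs 100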
> The mean overrun by project type goes solar < wind < thermal (fossil) < nuclear < hydroelectric. That's basically the same ordering as "minimum viable project size."
Relatedly, MIT Prof. Dennis Whyte argues that this is a/the key friction for the development of fusion power in this excellent semi-popular talk.
In particular, he thinks the latest generation of superconducting magnets will lower the minimum-viable size of fusion power plants, which will have a big impact on the feasibility of them being built by human institutions even if their raw numbers on paper are comparable to older style tokamaks.
Terrific comment, as usual. This is the key insight, everyone. It's not about where the puck is today (of course this project is small and expensive relative to mature tech), it's about where the puck could be in 5-50 years. If battery costs continue to fall with economies of scale, and they become the least-cost solution, they would bring characteristics that today's least-cost solutions lack. Batteries are modular, dense, and can be deployed extremely quickly. You don't need a years-long permitting and construction process to deploy them. If they continue following the cost curve they're on, they may change how grid planning is done, and even change the grid's very topology.
In the Christensen sense, batteries may be truly disruptive - a new technology that is initially inappropriate but brings with it undervalued characteristics (like deployment speed/modularity) that eventually can change how business is done.
> You don't need a years-long permitting and construction process to deploy them.
Not sure where you live but this definitely is not the case in all areas. In my town the local utility has been trying to get approval to build a substation (standard design by any criteria, and needed to support increased demand) for going on two years but nobody wants it near their neighborhood.
It has a listed cost of 1.6 billion USD built in 1977-1985, a capacity of 70,000,000 cubic meters of water (the sum of the lower and upper reservoirs on the Wikipedia page), which at the max flow rate of 51,000 cubic meters per minute is ~24 hours of generation at capacity. The page lists the generation capacity as 3000MW and so you can calculate a storage capacity of 72,000 MWH.
These numbers give $533,333/MW of generation capacity and $22,222/MWH of storage.
The article seems to show the cost for the Tesla station at $38 million. This gives $475,000/MWH of storage capacity and $1,900,000/MW generation capacity.
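For anyone wanting to check the arithmetic behind those figures (just the nominal numbers quoted in this thread, no inflation adjustment yet):

    # Nominal figures quoted above
    bath_cost, bath_mw, bath_mwh = 1.6e9, 3000, 72_000
    tesla_cost, tesla_mw, tesla_mwh = 38e6, 20, 80

    print(bath_cost / bath_mw)      # ~533,333 $/MW of generation capacity
    print(bath_cost / bath_mwh)     # ~22,222 $/MWH of storage
    print(tesla_cost / tesla_mw)    # 1,900,000 $/MW
    print(tesla_cost / tesla_mwh)   # 475,000 $/MWH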
Based on the wikipedia article, I think the 1.6bn USD is in 1977 USD. That's going to be roughly 4 times as much by 2017 [1], or 6bn USD, giving about $2,000,000/MW capacity at Bath vs $1,900,000/MW for Tesla.
Compensating for the fact that only the upper reservoir provides power, storage cost is around $180,000/MWh for Bath vs $475,000/MWh for Tesla.
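A quick check of that adjustment (the ~4x inflation multiplier and counting only roughly half the volume for the upper reservoir are the assumptions implied by the comment above, not official figures):

    bath_cost_2017 = 1.6e9 * 4                # ~$6.4B in ~2017 dollars (assumed 4x multiplier)
    print(bath_cost_2017 / 3000)              # ~$2.1M/MW, i.e. "about $2,000,000/MW"
    print(bath_cost_2017 / (72_000 * 0.5))    # ~$178,000/MWh, i.e. "around $180,000/MWh"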
Still significant, but at least in the same order of magnitude, and this technology is still improving more rapidly than pumps and turbines. And the batteries could be put close to (or in) cities, potentially saving another 5-20% on transmission losses.
I'm now wondering how the operational cost of this will look. Rotating equipment shows wear and I'm assuming Bath has quite some cost on that side. However, that plant has been in operation for over 30 years, while Tesla only guarantees 5,000 cycles on the Powerwall, or about 15 years (assuming 333 cycles a year). If you actually needed to replace the batteries after 15 years then I don't see them beating pumped hydro in the short term; if you can use them for 10,000 cycles with just a loss of capacity, then I think the finances can make a lot of sense.
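For reference, the lifetime arithmetic there (the ~333 cycles/year figure is the comment's own assumption of roughly daily cycling):

    cycles_per_year = 333                 # roughly one cycle per day
    print(5_000 / cycles_per_year)        # ~15 years on a 5,000-cycle warranty
    print(10_000 / cycles_per_year)       # ~30 years if the cells survive 10,000 cycles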
This battery project can be built in small increments and does not require local land in a proper geometry that can be flooded. Most good pumped storage possibilities in the US are built out already. Pumped storage also wastes more of the stored energy.
The recently rebuilt (2010) Taum Sauk reservoir only manages a 70% efficiency according to Wikipedia. I'm not seeing a number published for the Powerpack but I'd guess a 90% ballpark.
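That efficiency gap compounds on every cycle. A quick sketch (the 90% round-trip figure for the Powerpack is a guess, as noted above):

    # Off-peak energy you must buy to deliver 80 MWh back to the grid
    delivered = 80  # MWh
    print(delivered / 0.70)  # ~114 MWh at a Taum Sauk-like 70% round-trip efficiency
    print(delivered / 0.90)  # ~89 MWh at an assumed 90% for the battery system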
While those are interesting comparisons, keep in mind pumped storage has other negatives that make it less appropriate for some needs (besides the fact that pumped storage has a larger ecological impact).
Batteries can be installed much faster: it's been 4 months since the announcement of Tesla's selection, making this roughly 24 times faster to implement than that pumped storage station.
Because they don't have to rely on natural features, and because they're smaller and can be scattered more, they can be placed nearer to the population they serve or to their power sources.
But these two facilities don't store equivalent energy: 72,000 MWH versus 80 MWH, a 900-fold difference. Are we going to say that to store the equivalent energy the battery plant would need to be 900 times bigger and therefore take up to 900 times longer to build? 900 x 4 months = 3,600 months = 300 years.
> they're smaller
Again, these two facilities don't store equivalent energy: 72,000 MWH versus 80 MWH, a 900-fold difference. Are we going to say that to store the equivalent energy the battery plant would need to be 900 times bigger, or that we'd need 900 of them, each with their own facilities, connectors, and security, placed near the population they serve?
None of these numbers make any sense, because they don't account for the longevity of the capital investments (10 years? 60 years?), as well as maintenance costs.
You're off by at least a factor of 5 on the $/MWH for this pumped storage plant, even without the longevity and O&M. A factor of 2.2 on the cost, and a factor of 2.4 on the capacity. The $1.6B cost is in 1985 dollars[1]. In 2016 dollars, the cost is $3.5B [2]. The capacity at 3GW is only 10 hours, not 24 hours [3].
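Putting those corrections together (a quick check using only the figures in this comment):

    # $3.5B in 2016 dollars, 3 GW for 10 hours = 30,000 MWh of storage
    corrected_per_mwh = 3.5e9 / (3000 * 10)   # ~116,667 $/MWh
    print(corrected_per_mwh)
    print(corrected_per_mwh / 22_222)         # ~5.3x the earlier $22,222/MWh estimate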
The solution is to look at Lazard's analysis of the levelized cost of energy storage. [4]
Note that these costs are per MWH delivered to the grid, accounting for all the costs over time of that system.
When lithium-ion battery prices fall by 50%, they will match the most cost-effective storage out there. That will happen within 10 years.
It's amazing how fast the technology is changing. Pilot projects like this one should be happening all over the country right now, because within 10 years, storage like this will be the only cost-effective way to do these things.
Hydro is always going to be superior to pretty much anything out there. The problem with hydro is that we've already tapped every possible source for both generation and storage. It's done.
You can put a Tesla station anywhere in whatever size you need.
At the current rate of tech change, which has been sustained for decades, within a couple decades hydro will be more expensive than battery technology. And that's even with the vastly improved scalability, responsiveness, and sitability characteristics of batteries!
We need to build these industries, as they are our future. And if we're not building them, we'll just be buying from somebody else.
Bath County is not the world's largest pumped hydro storage plant. At 24 GWh it is smaller than TVA's Raccoon Mountain, which is 34 GWh. There may be bigger ones, but unfortunately nobody makes rankings of the energy stored in pumped hydro plants.
Re-read the description of Bath County and you will find that the same water cycles between the upper and lower reservoirs. Therefore, adding the capacity of the two reservoirs and dividing by the flow rate is a bad assumption.
I can't help but wonder (in the sense of curiosity, and in the sense of optimism) about the impacts of Tesla's battery and battery manufacturing technology, particularly in the wake of Samsung's recent battery quality issues. AFAIK, US manufacturing is known for higher costs - and higher quality.
Between all the new* uses for batteries, and the Gigafactory....
I wasn't enough of a webdev back then to be paying attention, but does this feel like Amazon's AWS early days?
* Off-peak storage for on-peak discharge isn't exactly new, but why hasn't it been as big a thing? Is it just the sexiness of Tesla causing it to make the news, finally?
> Off-peak storage for on-peak discharge isn't exactly new, but why hasn't it been as big a thing?
> Is it just the sexiness of Tesla causing it to make the news, finally?
I'd guess simple costs. Tesla is actually pushing $ / MWh the hardest. They aren't high cost and high quality; they are low cost and high quality. At least, that's what I gather from reading Musk's biography.
I don't know why this article keeps using the word 'massive'. 20MW is tiny at utility scale. For comparison the typical pumped-storage station has 1,000-3,000MW capacity.
Pumped-storage exploits pre-existing landscape features. Battery storage is essentially building storage from scratch in places where no favorable geological features exist.
How much would digging a mineshaft, a bedrock cavern and a lake or additional caverns just to pump water up and down cost?
80 MWh is massive for a battery. MW are a measure of power, not energy. You mean MW-h. It's incorrect to say something has a MW capacity when discussing energy storage.
> MW are a measure of power, not energy. You mean MW-h. It's incorrect to say something has a MW capacity when discussing energy storage.
He's quoting TFA,
> With a capacity of 20 MW/80 MWh
It would seem sensible that the system has both metrics: a total capacity of storage, and a total rate at which it can supply that storage to things connected to it.
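Concretely, the two numbers relate through discharge duration (this is just the identity energy = power x time, nothing specific to this installation):

    energy_mwh = 80
    power_mw = 20
    print(energy_mwh / power_mw)  # 4 hours of discharge at the full 20 MW output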
Further, my understanding is that the parent you're responding to is correct: pumped-storage hydro tends to have much greater generation rates; Wikipedia lists an example of 360MW.
Batteries have inherently high MW per MWh so you can usually ignore the power capacity. But that's not true with everything. Both numbers are important.
I've got a good idea of what the cycle limit is at normal discharge rates for these batteries, but the tricky thing in the calculation here is that the discharge rate is around 0.6C. That's very low (batteries of a similar size can put out fifty times that comfortably).
Discharge rate affects the usable cycles. So with an 80% cutoff, a 10C discharge might get you 300 cycles, while a 1C discharge might get 500 and a 0.5C discharge might get 600. Given the very low discharge rate, I'm guessing these will be run for around 500 to 1000 cycles, but that's just a rough estimate and the real number could lie anywhere between 300 and 3000. Temperature, pressure, charge rate, etc. all affect this.
I looked up this "C rating". 0.6C means that the battery will be completely discharged in 1/0.6 hours.
Well, more precisely, it means that it is producing 0.6 * X amps where X is the battery's capacity in amp-hours. Since discharging a battery at a rate of 0.6 * X amps usually produces less heat than discharging it at X amps, the battery is (usually) slightly more efficient, which means it will usually continue to provide power for slightly more than 0.6 hours.
Hopefully, the person I am replying to will correct me if I have made an error here.
Yep, all good except I think you meant to say "1/0.6 hours" in that second-to-last sentence. Another thing worth keeping in mind is that discharging (most) Li-ion batteries below a certain voltage is harmful, and thus you can only usually "get" 80% or so of the energy out of the battery.
In simple terms, C rating is how fast you're draining the battery. Most batteries are rated for between 1 and 10C. This is the runtime you'll usually get from C ratings: 0.1C/10.5hr; 0.2C/5hr; 0.3C/3hr; 1C/55min; 1.5C/35min; 2C/25min. You'll notice how this doesn't track the expected values perfectly; for example, at 0.1C you get 5% more runtime but at 2C you get almost 20% less. This is because at 0.1C heat is negligible, but at 2C you're discharging almost 30 watts per cell. A good rule of thumb when running over 0.2C is to subtract five minutes from however long the cell "should" be able to run. You can probably now see how discharging at over 10C is tricky.
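If it helps, here's a small sketch of that rule of thumb in code (my own toy approximation of the numbers quoted above, not a real battery model):

    # Ideal runtime is 60/C minutes; subtract ~5 minutes of heat losses above 0.2C.
    def approx_runtime_minutes(c_rate):
        ideal = 60.0 / c_rate
        return ideal - 5 if c_rate > 0.2 else ideal

    for c in (0.2, 0.3, 1.0, 1.5, 2.0):
        print(c, approx_runtime_minutes(c))
    # Prints 300, 195, 55, 35 and 25 minutes; the 1C/1.5C/2C values match the figures
    # quoted above, while very low rates actually run a bit longer than the ideal.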
Discharging at a lower current produces less heat and increases the cycle lifetime, among other benefits (e.g. decreasing the risk of explosion). Since Tesla is discharging these at 0.6C, they will get more cycles out of them than if they discharged them at 1C or 10C, for example.
The number of cycles is very important here because these will be used at least one cycle per day, and given that most batteries have a lifetime of 200-500 cycles, that's a maximum of 200-500 days. Given that there are 16-20 million dollars' worth of batteries here, that represents a significant yearly cost.
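The implied replacement cost in that worst case works out roughly like this (the $16-20M battery figure and the 200-500 cycle lifetime are the assumptions from the comment above; other comments in the thread expect far more cycles at this low discharge rate):

    # Worst case from the comment above: batteries worn out after 200-500 daily cycles.
    for battery_cost in (16e6, 20e6):          # assumed value of the installed cells, USD
        for cycle_life_days in (200, 500):     # assumed cycle life at one cycle per day
            print(battery_cost / cycle_life_days)
    # Roughly $32,000 to $100,000 of battery wear per day of use in that scenario.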
How do you know about Tesla's cell cooling system, and how do you know how much the utility is planning on using the battery? Or do these two things not matter?
So what does that translate to in years of operation? Does Tesla basically get a nice recurring income stream with battery replacements every 5 - 10 years?
Tesla is so hyped that 20 MW/80 MWh is bombastic news (judging by the wording in the article); an average-sized power plant is ~800 MW. Is the grid not interconnected?
Can't they pump water up into a hydroelectric reservoir?
I sure hope the new administration opens its eyes to carbon sequestration and nuclear energy; the US would get better and cleaner faster.
We're going to at least see a nuclear push under the Trump Administration. They've touted that on numerous occasions. [1] Carbon sequestration, I suspect, is entirely off the table.