Having worked in hydrology (in forecasting) for a year out of college -- yeah -- you might as well use tea leaves and Tarot cards. People can't even predict weather. Let alone: Snowpack + weather.
We've gotten a lot better on timing with better satellite data. But so far as precip amounts, and type? Let's just say it is not the exact science we want it to be.
Now that we have powerful computers and lots of freely available data (well, maybe less now thanks to Trump), I've always dreamed of running an old computer model at home, say one from the 1980s.
What is missing in this nice analysis is the economic impact. If 200k people were evacuated, and roughly estimating $100k each in residential real estate and $50k each in business real estate in the flood-risk area, that comes to $30B. Assuming some will not be a complete loss, say $15B in real estate loss if the emergency spillway fails (neglecting job and economic-activity impacts, etc.).
Now if you have a 5% chance of failure in this next storm, the expected value of the loss is $750M.
This value now can be used to decide how much effort should be put into emergency repairs, something less than $750M.
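Laying the same arithmetic out explicitly (every number here is the comment's ballpark assumption, not real data):

```python
# All figures are the comment's rough assumptions, not real estimates.
households = 200_000            # people/households evacuated
residential = 100_000           # $ of housing real estate each
business = 50_000               # $ of business real estate each

exposed = households * (residential + business)   # $30B at risk
loss_if_fail = exposed // 2     # assume half is a total loss: $15B
p_fail = 0.05                   # assumed 5% failure chance this storm
expected_loss = p_fail * loss_if_fail             # $750M
```

By this logic, any emergency repair effort costing less than roughly $750M is justified in expectation.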
Say these numbers are in the ballpark. Is the emergency repair effort taking place sensible? I suspect not. To avoid a loss in the billions, you'd expect a bigger effort. I'd expect every shotcrete contractor in the western US on hand, rebar being placed by helicopter, steel beams, 50 pieces of heavy equipment, temporary tanks of diesel fuel, and a real sense of emergency. Not seeing it.
Main point is to make an appropriate effort. Seems not too bad today, and think the storm isn't huge.
Shotcrete and concrete are difficult to place via helicopter. Both need to be pumped (or hand placed) to get a good product, which can't be done via helicopter.
5 billion gallons / 9350 km² is just 2mm. If that fell in one hour, the flow into the lake would be 5200 m³/s, except it would be lower since it takes time for the water to reach the lake.
The normal flow of the spillway is 4200 m³/s.
Have I done something wrong here, or missed something obvious? 2mm/hr is a pretty moderate rain.
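For anyone who wants to check the arithmetic, here is the same calculation as a short unit-conversion script, using the figures from the comment:

```python
GALLON_TO_M3 = 0.00378541            # one US gallon in cubic meters

volume_m3 = 5e9 * GALLON_TO_M3       # 5 billion gallons
area_m2 = 9350 * 1e6                 # 9350 km^2 watershed

depth_mm = volume_m3 / area_m2 * 1000
inflow_m3s = volume_m3 / 3600        # if it all reached the lake in one hour
# depth_mm   ≈ 2.0 mm
# inflow_m3s ≈ 5250 m^3/s, consistent with the 5200 figure above
```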
Things get iffy once you have heavy rain for long enough that the soil saturates.
A: Weather and climate are not the same. Weather is individual, day-to-day atmospheric events; climate is the statistical average of those events. Weather is short-term and chaotic and is thus inherently unpredictable beyond a few days. Climate is long-term average weather and is controlled by larger forces, such as the composition of the atmosphere, and is thus more predictable on longer timescales. For the same reasons, a cold winter in one region does not disprove global warming.
As an analogy, while it is impossible to predict the age at which any particular man will die, we can say with high confidence that the average age of death for men in industrialized countries is about 75. The individual is analogous to weather, whereas the statistical average is analogous to climate.
To use the human lifespan argument, yes we can say the average lifespan for men is 75, but that's looking backwards. How good would we be at predicting future life expectancy? I'd say we're probably pretty bad at it.
Also, yes, we're trying to predict the average temperature of the earth with climate models. However, that average is determined by the climate in a number of different areas.
Someone please explain this to me.
The fact that businesses selling life insurance make money over the aggregate, despite sometimes losing money in the individual, seems to imply we're decently good at it.
You ask "how good are we at predicting future life expectancy?"
Unfortunately I can't do an analysis right now, so hopefully this qualitative discussion will help answer it for you.
If you look at historical life expectancy, grouped by cohort, you'll see a relatively smooth function with easily identifiable trends. So, group everybody into the year they were born, and graph their average lifespan.
Past trends do not guarantee future performance, however a smooth function implies an underlying order to these observations. This is the core thesis everything else stems from then; we assume there is a relationship between the year someone was born, and their average lifespan.
Now a function relating just the year they were born to their average lifespan is almost certainly too simplistic. Where they were born, their socioeconomic status, and so much more will affect that number. The problem we have is that many of these parameters are hidden to us. Even worse, we can't even say for sure what all the parameters are! What we can do, however, is estimate the effect of these parameters, and even estimate the effect of parameters we don't know exist.
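The cohort idea can be sketched in a few lines: fit a smooth trend to average lifespan by birth year, then extrapolate. The numbers below are made up for illustration, not real demographic data:

```python
import statistics

birth_year = list(range(1900, 2001, 10))
avg_lifespan = [47, 50, 54, 59, 63, 66, 68, 70, 72, 74, 75]  # illustrative only

# Least-squares line: lifespan ≈ a + b * year
mx = statistics.mean(birth_year)
my = statistics.mean(avg_lifespan)
b = sum((x - mx) * (y - my) for x, y in zip(birth_year, avg_lifespan)) \
    / sum((x - mx) ** 2 for x in birth_year)
a = my - b * mx

pred_2020 = a + b * 2020   # naive extrapolation, ≈ 84 for these made-up numbers
```

A real actuarial model adds the hidden parameters mentioned above (region, socioeconomic status, and so on), but the structure is the same: fit to observed cohorts, extrapolate with error bars.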
This process is inherently fuzzy, as statistics and modelling from less than perfect knowledge must be, but the models that have been created have proved to have great predictive power. The existence and general profitability of the life insurance industry is evidence to that.
We use models all the time, and for the most part this goes unquestioned. Whatever your model of the sun and planets, it had better predict the sun rising tomorrow, or else it's got some pretty big gaps. If someone told you they predicted the sun would not rise tomorrow, you'd be rightfully incredulous. But you'd be equally incredulous if someone told you it was impossible to predict the future, so we really don't know whether the sun will come up or not.
We test our models on their predictive power. Even if our models were bad, we don't simply throw them away because they are not perfect. We work to refine them and make them more accurate. Asimov's essay on the relativity of 'wrong' is well worth the read.
So, life expectancy is a decent model, and is constantly being improved upon as we learn more about the world. Something may come along and cause us all to die, or live forever, but that remote possibility is no reason to throw our hands in the air and say the whole exercise is pointless.
Similarly, our climate models may be inaccurate or not account for some unknown future event, but that is not a reason to stop refining them, nor a reason to say "we can't know the future so this is pointless."
I understand that you can model something and constantly improve it, but that still doesn't inspire confidence in the model's predictive ability.
Maybe a better answer would be: "Yes, weather forecasts are often wrong, but climate modeling from 10 years ago accurately predicted today's global temperatures with a margin of error of +/- 0.5%."
ofc, answering the question as well would still be useful.
To the specific example of the sun rising - for the sun to stop rising, the earth has to either become tidally locked about the sun, or the earth or sun must be destroyed.
One article says:
> Scientists have reliable data on the Earth's rotational speed, based on observations of the sun's position in the sky during solar eclipses, going back some 2,500 years. Although the rotational rate hasn't declined smoothly, over that period the average day has grown longer by between 15 millionths and 25 millionths of a second every year. Even at the faster rate, it will take 140 million years before the Earth's rotation slows enough to necessitate a 25-hour day.
The real crux of this matter is predictive power. We can get a rough understanding of the predictive power of a model by asking the question "at what point does the expected error in our prediction make that prediction meaningless?" For example, how accurate does a population model need to be for us to be willing to use it to predict population distribution in 50 years? It's fair enough to say that our models are not capable of predicting the population distribution in 1 million years, but no model could and it's unlikely we would be able to utilise such a prediction even if it were made.
As I said before, the existence of a profitable life insurance industry is evidence that we are good at making lifespan predictions at the individual level, in aggregate. Every time a policy is written, it's like a wager is being made between the insured and the insurer. Yes the insurers lose some bets, and some years they may even lose a lot of them, but it's clear that the odds are stacked in their favour, for otherwise they would be unable to offer insurance at all.
If you were to bet on what the climate of this planet will look like next year, how would you make your prediction? If you were to offer odds, how would you structure the book to make sure you win?
Yes, the absolute accuracy of climate models is going to diminish over the next 1, 5, 10 years, so the error associated with a specific global average temperature prediction (for example) will increase. That doesn't mean the actual temperature is just going to fluctuate wildly! Next year the global average temperature may decrease. It is extremely unlikely that in 10 years' time the average temperature will have decreased, even if there is drastic intervention. Any bookmaker looking to make a profit would be offering odds on how much the temperature increases, not merely on whether it increased or not.
Yes, predictive power is going to decrease as time goes on. No, that doesn't mean models are useless, nor that we can't have predictions that become more likely as time goes on.
The weather is a particular point on the twisted surface; it is impossible to calculate what point you'll reach at a given time, for reasons that you'll find in a search for "Lyapunov exponent".
The climate is the butterfly shape. If you run the simulation with different parameters, you can look and see how this changes.
The Lorenz equations are much simpler than a realistic weather model, but the same principles apply to the complicated models, and to the laws of physics that the atmosphere actually follows.
The timespan we used in the SQL query calculates the inflow/precipitation ratio based on the last few weeks and we use that as an input into the estimate.
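I don't know the actual schema, but the rolling inflow/precipitation ratio described here could look something like this sketch (the record layout and the three-week window are assumptions):

```python
from datetime import date, timedelta

def runoff_ratio(records, today, weeks=3):
    """records: iterable of (day, inflow_acre_ft, precip_in) tuples.

    Returns total inflow divided by total precipitation over the
    trailing window, or None if there was no precipitation.
    """
    cutoff = today - timedelta(weeks=weeks)
    recent = [(i, p) for day, i, p in records if day >= cutoff]
    total_inflow = sum(i for i, _ in recent)
    total_precip = sum(p for _, p in recent)
    return total_inflow / total_precip if total_precip else None
```

Multiplying a forecast precipitation total by this ratio gives a crude inflow estimate, which is presumably the role it plays in the model.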
This guy seems to have testified about this flooding damage and claims 30 inches of rain around Oroville, with 18 inches of coincident melting, thus exceeding their worst-case scenario.
We see the same thing in NOAA's regional snow analysis maps:
Approximate watershed is seen in the map at: https://en.wikipedia.org/wiki/Feather_River
Outflow rates should have an associated catastrophic failure probability estimate
The catastrophic failure probability estimates simply haven't stood up to real-world testing.
#1: the dam is managed in the winter about 1 million acre feet below the top, for flood control purposes. The spillway has been operating all winter long.
#2: The operator cannot decide to use the emergency spillway. Water just goes over it when it approaches the top of the dam.
#3: no water has gone over the dam. That would destroy it.
The result is a balancing act — drain as much water as possible as quickly as possible while trying to minimize further damage to the spillway. The main spillway was returned to action, but at reduced capacity compared to what it could normally do under the current conditions. After all, the spillway needs to last for the remainder of the rainy season.
The stated capacity of the main spillway is 250,000 cubic feet per second; once it became clear the emergency spillway was not holding up, they increased the main spillway flow to 100,000 cubic feet per second. So limiting damage to the main spillway is an ongoing concern.
It's really freaky to stand in downtown Marysville and look up at the river bank 15 feet above you.
When the reservoir is at capacity, the dam operator has no control over how much water goes into the Feather River nor whether downstream towns are flooded or not. Any water not going down the main spillway will go down the emergency spillway and into the river.
(edit for typo)
> Even a small sustained overtopping flow can remove thousands of tons of overburden soil from the mass of the dam within hours.
Basically, when the dam over-tops the water will cut a channel into which a little more water will flow and cut a little deeper into which more flows...and on and on.
It's a run away failure and cannot be stopped.
Here's a quicker example of how just a tiny breach at the top will drain the entire dam:
Note in the first video it's the over-topping that causes the initial erosion/breach, rather than the artificial one in the second video, but the mechanism of further failure is essentially the same.
I'm not sure of the exact construction of the Oroville dam, but it's clear the emergency spillway was concrete-lined at the top, likely with this sort of failure in mind.
I'd be surprised if the main bulk of the dam didn't have at least some measures to stop this kind of failure, but someone with knowledge of its construction can chime in on this point.
This was the coffer dam for the construction of Auburn Dam, work on which was halted, but the coffer dam never removed?
Wikimedia has a nice picture of the area:
The emergency spillway is on the far left; it is lower than the rest of the dam. So if water is flowing over the main structure, it has already failed anyway.
For an earthen dam, erosion of the top of the dam when overtopping is also, I believe, a significant issue, but overtopping is a catastrophic failure issue for any dam.
This image of the main spillway gives you an idea of what they're building on: http://i.dailymail.co.uk/i/pix/2017/02/13/08/3D252AE20000057...
You can see that they blasted into the rock to make the spillway.
However, the idea that erosion will undercut the emergency spillway and cause it fail implies that it's not built directly on the rock. This suggests that there could be more than 30' of draining.
>Water isn't passing over the dam
The question is about what happens when water passes over the dam.
Whichever part of the dam was lowest would see the fastest flowing water and would erode the most quickly. As a chasm was cut down through the dam, the water would flow faster and faster through it - if the height of the water above the bottom of the cut is 1 foot, only the top foot of water is "pushing" the water over; if the cut is 10 feet deep, now the top 10 feet of water is "pushing" water out the hole, and it will be pushed much faster.
This will turn into a feedback loop, and the water will rapidly cut a channel through to the bottom of the dam. As water flows through it, faster and faster, it will weaken the sides of the chasm, and more of the dam will collapse. This will let more of the water flow through, but slower, eventually limiting further erosion.
This is essentially the same thing happening in the spillways right now, but slower, because the spillways are designed to resist erosion. Once the vicious cycle gets started, though, the end result is the same.
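The "deeper cut, faster flow" step can be made quantitative: discharge over a notch grows roughly as depth to the 3/2 power (the standard weir formula). The coefficient and width below are illustrative values, not Oroville-specific:

```python
def notch_flow_m3s(depth_m, width_m=10.0, c=1.7):
    # Broad-crested weir approximation: Q = c * b * H**1.5
    return c * width_m * depth_m ** 1.5

shallow = notch_flow_m3s(0.3)   # ~1 ft deep cut
deep = notch_flow_m3s(3.0)      # ~10 ft deep cut
# deep / shallow = 10**1.5 ≈ 31.6: ten times the depth, thirty times the flow
```

That superlinear scaling is why the erosion feedback runs away once it starts.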
If someone asks you what would happen to the people inside of a submarine if they let it fill with water, you don't respond by saying they design submarines to prevent them from filling up with water.
But there was some concern during design about tsunami. It's important not to overtop even in a brief event like that.
One piece that's missing from this analysis is upstream precipitation. You are sensing precip at Oroville itself, but water in the dam comes from everywhere upstream (3600 square miles according to ). As well as (potentially) snowmelt.
People doing forecasting of reservoir levels will be integrating distributed precipitation information, snowpack, and temperature, with a soil runoff and routing model to get predictions.
As you see the forecast flow is only about one third of that experienced on Friday past.
Here's a great article on climate change's effects on our water infrastructure in light of the Oroville Dam's situation: https://www.nytimes.com/2017/02/14/opinion/what-californias-...
An example - the bulk of California's water storage is in the natural reservoir of mountain snowpack. The large volume of precipitation this year, much of it as high-altitude rain falling on snow, has caused an extraordinary amount of snowmelt in a short period of time. The natural reservoir of snowpack is rapidly released into our manmade reservoirs and strains or exceeds their capacity. This cycle of alternating extremes, drought to wet, is expected to continue, and our water infrastructure's capabilities must be planned in light of that.
Dry Creek basin seems to have some points where it's just a few meters of a hump over the emergency spillway level.
All of the concern about the 'failure' is a matter of loss of control of water release, not of breach of the main dam structure.
While that's still a lot of acre feet, and it would create a 'wall of water' situation in the first surge (hence the evacuation), it's not as significant as the dam failing and the entire reservoir emptying through it.
Seems like it would be pissing in the ocean (or I guess that analogy in reverse?), but when I saw the newsreels of the helicopters dropping bags of rocks into the spillway hole, that also seemed like it would take a couple thousand of those drops to make a difference.
I don't think you're going to be able to make a siphon quickly that is large and robust enough to handle a significant fraction of that.
However, this seems a little off: "Based off of our estimate, for every inch of rainfall at the Oroville dam, 136,790.5 acre-feet will be added to the reservoir." The relationship probably isn't linear. Much less of the first inch of rain makes it to the reservoir vs. the 5th inch of rain.
Hopefully, the next iteration will project how many inches of rain it will take this weekend to top the spillway again.
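On the nonlinearity point: a standard way to capture "the first inch soaks in, the fifth inch runs off" is the SCS curve-number method. CN = 70 here is an illustrative value, not one calibrated for the Feather River basin:

```python
def runoff_inches(p, cn=70):
    """SCS curve-number runoff: Q = (P - 0.2*S)**2 / (P + 0.8*S) for P > 0.2*S."""
    s = 1000 / cn - 10       # potential maximum retention (inches)
    ia = 0.2 * s             # initial abstraction (rain that never runs off)
    if p <= ia:
        return 0.0
    return (p - ia) ** 2 / (p - ia + s)

first_inch = runoff_inches(1.0)                       # almost nothing runs off
fifth_inch = runoff_inches(5.0) - runoff_inches(4.0)  # most of it runs off
```

So a fixed acre-feet-per-inch figure will understate inflow late in a big storm and overstate it early on.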
Edit: I think newer modeling programs exist now. Some are here
165 page PDF describes it.
Depends on whether the ground is already saturated.
(That is, unless you're talking about plants absorbing water from their leaves directly, before the water hits the ground. I don't know that plants do that when their roots already have plenty of water, but I also don't know that they don't.)
The money shot is at 2:25
If the emergency spillway failed, it could lower the water level in the reservoir by 30 feet. That would be catastrophic downstream, of course, but probably wouldn't have much effect on the boats.
My wife, a Water Resources Engineer, is. If the damage to the spillway starts to move sideways, or uphill too far and too deep, you could see the main dam get damaged. It's not as likely as the emergency spillway dropping things 30 feet, as you say, but it's still on the table as a worst-case scenario here.
Well, let's hope for everyone's sake that this worst-case scenario does not happen!
They reduced the mandatory evacuation order but more and more emergency personnel are arriving. There are (unfounded at the moment) local reports of water seeping underneath the emergency spillway weir.
Acre-feet. Of course.
The only thing missing here is nautical miles and football fields.
Folsom dam has turned on the spill at insane rates the past few days. I wish people living in Sacramento county (my parents) would take things more seriously.
Sounds like many are still worried about it, but there's really no consensus.
Still, its a TSDB that I wasn't aware of, so it worked on me!
Actually a really practical way of showcasing a product without markitecture brochureware. A GitHub repo and links to Docker images.
No. At least not strictly speaking.
The dam is in no danger. The spillways are the problem. The main spillway is badly damaged, and the emergency spillway was in some danger of collapse on Sunday.
The main spillway is a long straight concrete chute with a gate at the top to regulate the flow. If the erosion in this spillway worked its way up to the top, it could damage that gate and operators would lose the ability to control the outflow.
The emergency spillway is a low concrete wall with a hillside below it. Its top is lower than the dam top, so water will go over this wall (and did on Sunday) instead of overtopping the dam itself. The worry here is that erosion of this hillside could cause the wall to collapse.
Both spillways are separated from the dam by a large hill. Even if they failed, the dam would survive. So there wouldn't be a flood of all 700 vertical feet of the reservoir, but there could be a flood of 30 feet from the emergency spillway.
Of course if you were in the path of that flood, the distinction between the "dam" and the "spillway" would be small consolation.
The Metabunk page that seanp2k2 linked has some good information and discussion:
And the YouTube video that clamprecht linked has a good view of the dam and spillways on Sunday:
So if you are rooting for big dams to be destroyed by weather, you are just rooting for short term harm and suffering, there won't be any long term changes to the river systems.
Scale matters. Like all things, there's a "right size" for everything. We've been focusing almost exclusively on huge dams, but neglecting small [property scale] dams. At a smaller scale there are many advantages: significantly more 'edge' for beneficial biotic interactions (edges between biomes are more productive than either biome on its own), the topsoil can be economically removed during dam construction to avoid methane release and wasted biomass (impossible in a huge scale dam), runoff and on-farm erosion is reduced by strategic dam placement within the landscape, and the water can be directly used on the farm with no transaction overhead or canal network (just a simple system of gravity fed pipes).
>significant flood control+reservoir dams will certainly be rebuilt
I wouldn't necessarily adopt such a pessimistic attitude. Your analysis is predicated on the idea that
A) the society will have the resources to rebuild thousands of [often economically futile] dams, and
B) the society will neglect to account for the ecological and indirect economic costs of building a dam (reduced forestry yield due to phosphorus deprivation from salmon runs, increased insurance cost due to delta erosion, etc) versus the water resources and flood control abilities, which are constantly degrading due to silt buildup.
It's certainly a fair criticism of my phrasing, but having your home washed away due to weather is quite different than being compensated to uproot.