Realtime Analysis of the Oroville Dam Disaster (github.com)
310 points by rodionos on Feb 15, 2017 | 166 comments

I like what you're trying to accomplish, but you are missing a very helpful piece of information in your analysis. I'm sure you noticed how poor of a predictor rain fall is in your assessment of how much water will be added to Lake Oroville. This is largely because Oroville is filled from Sierra Nevada runoff. If a particular storm system is warm (as was the case with the latest one), it will mostly rain in the Sierras. This has a twofold effect: one, the precipitation doesn't stay put at higher elevations (as snow does), and two, it melts the existing snow, causing it to also be added to the downstream water accumulation. So a key part of forecasting is not only to look at expected precipitation, but also the expected snow levels.

"I'm sure you noticed how poor of a predictor rain fall is in your assessment of how much water will be added to Lake Oroville."

Having worked in hydrology (in forecasting) for a year out of college -- yeah -- you might as well use tea leaves and Tarot cards. People can't even predict weather. Let alone: Snowpack + weather.

As a meteorologist, let me offer this apology on behalf of my field: Predicting rainfall is really hard, especially in a particular location. Unlike dynamic variables like temp, pressure, or wind velocity, rainfall depends on just about every term, including sub-grid scale topography. We can have a good idea of how much total water a storm has to be precipitated, but when, where and how it falls is harder.

FWIW, predictions in my neck of the woods (the SF peninsula) are uncannily accurate. Rain is usually predicted with high reliability a week in advance, and we can generally rely on the hour-by-hour predictions a day or two ahead. I realize that our location is particularly easy to predict because of how our weather is dominated by the Pacific ocean, but you've got to take your wins where you can find them.

Most of what you said is the opposite of what the science shows. SF is particularly hard to forecast because its weather is dominated by the Pacific Ocean (in winter at least). In other places we get reasonably reliable 10-day forecasts, but because initial data (observations) are sparse over the ocean, we have less confidence in the forecast here, and long-range forecasts are less likely to verify. Satellites do help, but it's not like there's a vast network of Doppler radar buoys out there, y'know? Anyway, we've gotten good at this despite the ocean, not because of it.

We've gotten a lot better on timing with better satellite data. But so far as precip amounts, and type? Let's just say it is not the exact science we want it to be.

What can I say? Our forecasts on the peninsula are accurate >90% of the time.

That depends a lot on what you consider accurate. It would also be correct to say that forecasts in the Bay Area are wrong 100% of the time.

FWIW, I've lived all over the world, and now the bay area. And I am consistently impressed with how good the weather predictions are for the area. So thank you for the hard work. I know I'm impressed.

I've often wondered how meteorology works in terms of computer models. Could you recommend some reading material that would be a good introduction for someone completely new to the field?

Now that we have powerful computers and lots of data freely available (well, maybe less now thanks to Trump), I've always dreamed of running an old computer model, say from the 1980s, at home.

Yeah, recently they predicted we'd get about 2-3 inches of snow in Portland, OR. We ended up getting approximately 11-12 inches.

Are you familiar with this blog? http://cliffmass.blogspot.com/ It's run by a UW meteorologist.

No, never heard of it. Thanks!

The proper approach with uncertainty is an expected value calculation. That way the uncertainty regarding rain within the watershed, the temperature, etc. can all be factored in.

What is missing in this nice analysis is the economic impact. If 200k were evacuated, and roughly estimating they have $100k in real estate each for housing, and say $50k each for business real estate in the flood risk area, that comes to $30B. Assuming some will not be a complete loss, say $15B in real estate loss if the emergency spillway fails (neglecting job and economic activity impacts, etc).

Now if you have a 5% chance of failure in this next storm, the expected value of the loss is $750M.

This value now can be used to decide how much effort should be put into emergency repairs, something less than $750M.
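The expected-value arithmetic above can be sketched in a few lines (all dollar figures and the 5% failure probability are the rough assumptions from this comment, not official estimates):

```python
# Back-of-envelope expected-loss sketch using the assumptions above.
evacuated = 200_000           # people under evacuation order
housing_per_person = 100_000  # assumed residential real estate per person, $
business_per_person = 50_000  # assumed business real estate per person, $

total_exposure = evacuated * (housing_per_person + business_per_person)
expected_damage = total_exposure * 0.5  # assume half is a total loss
p_failure = 0.05                        # assumed 5% chance of failure this storm

expected_loss = p_failure * expected_damage
print(f"exposure: ${total_exposure/1e9:.0f}B, expected loss: ${expected_loss/1e6:.0f}M")
# exposure: $30B, expected loss: $750M
```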

Say these numbers are in the ballpark. Is the emergency repair effort taking place sensible? I suspect not. To avoid a loss in the billions, you'd expect a bigger effort. I'd expect every shotcrete contractor in the western US on hand, rebar being placed by helicopter, steel beams, 50 pieces of heavy equipment, temporary tanks of diesel fuel, and a real sense of emergency. Not seeing it.

What would all these folks be doing? The shotcrete takes 7 days to cure.

Curing is logarithmic. News stories say they are using "concrete" today.

Main point is to make an appropriate effort. Seems not too bad today, and I think the storm isn't huge.

Can you link to a news story that says they are using concrete today? What location are they placing concrete at?

Sure, the emergency spillway, but not much detail:


Seems like they are dropping super sacks of rock. They might be dropping super sacks of dry mix, but I highly doubt it - the money just wouldn't be there. If they're just looking to armor, then the rock would make more sense.

Shotcrete and concrete are difficult to place via helicopter. Both need to be pumped (or hand placed) to get a good product, which can't be done via helicopter.


I have no idea who is correct but this is not the right way to respond in any case.

Couldn't you put some kind of plastic membrane over the top of it once it's cured enough to not deform? The reason it needs so long to cure is to make sure the surface is hard enough to withstand erosion from turbulent water flow. If it's not directly exposed to the flow, you should be able to use it much sooner.

I'm no expert on concrete, but the first graph on this page indicates that even with two days of curing, shotcrete is quite strong:


I'm not trying to diminish how difficult it is to get precise estimates...but how precise do you need? A 1ft rise in water levels is almost 5 billion gallons of water. That's a pretty big margin for error.

The watershed for Lake Oroville is 9350 km² [1].

5 billion gallons / 9350 km² is just 2mm. If that fell in one hour, the flow into the lake would be 5200 m³/s, except it would be lower since it takes time for the water to reach the lake.

The normal flow of the spillway is 4200 m³/s.

Have I done something wrong here, or missed something obvious? 2mm/hr is a pretty moderate rain.

[1] https://en.wikipedia.org/wiki/Lake_Oroville
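For what it's worth, the arithmetic above checks out (a quick sketch; unit conversions only, no hydrology):

```python
# Sanity-check the watershed math above.
gallons = 5e9
m3 = gallons * 3.785e-3    # US gallons to cubic meters
watershed_m2 = 9350 * 1e6  # 9350 km^2 in m^2

depth_mm = m3 / watershed_m2 * 1000
flow_m3s = m3 / 3600  # upper bound: all of it arriving within one hour

print(f"{depth_mm:.1f} mm of rain, {flow_m3s:.0f} m^3/s inflow")
# 2.0 mm of rain, 5257 m^3/s inflow
```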

Not all the water (normally!) makes it to the lake, and the average rain over time is probably less than that. Otherwise, no, the math looks fine, but this is why the reservoir is useful as a flood-control system. It can take the fairly large flow caused by even moderate rain, and spread it out over time.

Things get iffy once you have heavy rain for long enough that the soil saturates.

Well, not all of it would go directly to the lake, for one. It would seep in to the ground, re-evaporate, etc.

Right. This is different. I never worked in CA hydrology during times when you worry about spill. CA Hydrology was always about allocating a very scarce resource to the people that need it most -- and offering your best prediction of how low reservoirs would go -- and praying snow pack would last. Now? W(who)tf knows.

Actually, there are EXTREMELY good models around rainfall, snowfall, SWE, rainholding capacity, and runoff. The forecasts I see for the Sierra basically nail snow:water ratios and snow levels to within 500ft.

Genuine question here: if predicting the weather for a localized area, such as a small region of California, is "tea leaves and Tarot cards", why are we so confident in the prediction of the climate for the entire planet?

Q: If forecasts can’t get next week’s weather right, how can we trust predictions for decades or centuries from now?

A: Weather and climate are not the same. Weather is individual, day-to-day atmospheric events; climate is the statistical average of those events. Weather is short-term and chaotic and is thus inherently unpredictable beyond a few days. Climate is long-term average weather and is controlled by larger forces, such as the composition of the atmosphere, and is thus more predictable on longer timescales. For the same reasons, a cold winter in one region does not disprove global warming.

As an analogy, while it is impossible to predict the age at which any particular man will die, we can say with high confidence that the average age of death for men in industrialized countries is about 75. The individual is analogous to weather, whereas the statistical average is analogous to climate.
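The weather/climate distinction can be made concrete with a toy simulation (not a climate model; the mean and noise figures below are arbitrary): individual draws from a noisy process are unpredictable, but their long-run average is not.

```python
import random

random.seed(42)

# Toy illustration: each day's "temperature" = stable mean + large chaotic noise.
MEAN = 15.0   # the "climate": long-run average, degrees C (arbitrary)
NOISE = 10.0  # the "weather": day-to-day swings (arbitrary)

days = [random.gauss(MEAN, NOISE) for _ in range(10_000)]

# Any single day is hard to guess to within a degree...
# ...but the average over many days sits very close to MEAN.
average = sum(days) / len(days)
print(f"one day: {days[0]:.1f} C, long-run average: {average:.1f} C")
```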


OK, I'm going to play devil's advocate. That answer doesn't really seem all that good since it doesn't address the predictability at all.

To use the human lifespan argument, yes we can say the average lifespan for men is 75, but that's looking backwards. How good would we be at predicting future life expectancy? I'd say we're probably pretty bad at it.

Also, yes, we're trying to predict the average temperature of the earth with climate models. However, that average is determined by the climate in a number of different areas.

Someone please explain this to me.

How good would we be at predicting future life expectancy? I'd say we're probably pretty bad at it.

The fact that businesses selling life insurance make money over the aggregate, despite sometimes losing money in the individual, seems to imply we're decently good at it.

Only in the short-term.

I'll take a stab :)

You ask "how good are we at predicting future life expectancy?"

Unfortunately I can't do an analysis right now, so hopefully this qualitative discussion will help answer it for you.

If you look at historical life expectancy, grouped by cohort, you'll see a relatively smooth function with easily identifiable trends. So, group everybody into the year they were born, and graph their average lifespan.

Past trends do not guarantee future performance; however, a smooth function implies an underlying order to these observations. This is the core thesis everything else stems from: we assume there is a relationship between the year someone was born and their average lifespan.

Now a function relating just the year they were born to their average lifespan is almost certainly too simplistic. Where they were born, their socioeconomic status, and so much more will affect that number. The problem we have is that many of these parameters are hidden to us. Even worse, we can't even say for sure what all the parameters are! What we can do, however, is estimate the effect of these parameters, and even estimate the effect of parameters we don't know exist.

This process is inherently fuzzy, as statistics and modelling from less than perfect knowledge must be, but the models that have been created have proved to have great predictive power. The existence and general profitability of the life insurance industry is evidence to that.

We use models all the time, and for the most part this goes unquestioned. Regardless of your model of the sun and planets, it had better predict the sun rising tomorrow, or else it's got some pretty big gaps. If someone told you they predicted the sun would not rise tomorrow, you'd be rightfully incredulous. But you'd be equally incredulous if someone told you it was impossible to predict the future, so we really don't know if the sun will come up or not.

We test our models on their predictive power. Even if our models were bad, we don't simply throw them away because they are not perfect. We work to refine them and make them more accurate. Asimov's essay "The Relativity of Wrong" is well worth the read.

So, life expectancy is a decent model, and is constantly being improved upon as we learn more about the world. Something may come along and cause us all to die, or live forever, but that remote possibility is no reason to throw our hands in the air and say the whole exercise is pointless.

Similarly, our climate models may be inaccurate or not account for some unknown future event, but that is not a reason to stop refining them, nor a reason to say "we can't know the future so this is pointless."

I appreciate the response.

I understand that you can model something and constantly improve it, but that still doesn't inspire confidence in the model's predictive ability.

Maybe a better answer would be: "Yes, weather forecasts are often wrong, but climate modeling from 10 years ago accurately predicted today's global temperatures with a margin of error of +/- 0.5%."

I'd say that's a worse answer, because while it explicitly answers the question, it doesn't address the flawed assumption in the question itself: that predictive failure in the micro implies predictive failure in the macro.

ofc, answering the question as well would still be useful.

Can you provide any evidence that "climate modeling from 10 years ago accurately predicted today's global temperatures with a margin of error of +/- 0.5%"? I don't believe that statement. Perhaps you were saying wouldn't that be great proof if it were true.

I'm not claiming that's true, but if it were, it would be strong evidence that the modeling is accurate.

Right, you might say that our models predict the sun will rise tomorrow with a high level of confidence but you would have to give a slightly lower level of confidence for the sun rising the day after tomorrow and still lower one million years from now. All models have diminishing predictive power.

I agree with you; however, our model of the solar system gives us pretty good confidence about macro-level events even into the distant future.

To the specific example of the sun rising - for the sun to stop rising, the earth has to either become tidally locked to the sun, or the earth or sun must be destroyed.

One article[0] says:

Scientists have reliable data on the Earth's rotational speed, based on observations of the sun's position in the sky during solar eclipses, going back some 2,500 years. Although the rotational rate hasn't declined smoothly, over that period the average day has grown longer by between 15 millionths and 25 millionths of a second every year. Even at the faster rate, it will take 140 million years before the Earth's rotation slows enough to necessitate a 25-hour day.
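The article's 140-million-year figure follows directly from the faster quoted rate (a one-line calculation, assuming the rate stays constant):

```python
# How long until the day is 25 hours, at the faster quoted rate?
rate = 25e-6           # seconds added to the day's length, per year
extra_needed = 3600.0  # one more hour, in seconds

years = extra_needed / rate
print(f"{years/1e6:.0f} million years")
# 144 million years
```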

The real crux of this matter is predictive power. We can get a rough understanding of the predictive power of a model by asking the question "at what point does the expected error in our prediction make that prediction meaningless?" For example, how accurate does a population model need to be for us to be willing to use it to predict population distribution in 50 years? It's fair enough to say that our models are not capable of predicting the population distribution in 1 million years, but no model could and it's unlikely we would be able to utilise such a prediction even if it were made.

As I said before, the existence of a profitable life insurance industry is evidence that we are good at making lifespan predictions at the individual level, in aggregate. Every time a policy is written, it's like a wager is being made between the insured and the insurer. Yes the insurers lose some bets, and some years they may even lose a lot of them, but it's clear that the odds are stacked in their favour, for otherwise they would be unable to offer insurance at all.

If you were to bet on what the climate of this planet will look like next year, how would you make your prediction? If you were to offer odds, how would you structure the book to make sure you win?

Yes, the absolute accuracy of climate models is going to diminish over the next 1, 5, 10 years, so the error associated with a specific global average temperature prediction (for example) will increase. That doesn't mean the actual temperature is just going to fluctuate wildly! Next year the global average temperature may decrease. It is extremely unlikely that in 10 years' time the average temperature will have decreased, even if there is drastic intervention. Any bookmaker looking to make a profit would be offering odds on how much the temperature increases, not merely on if it increased or not.

Yes, predictive power is going to decrease as time goes on. No, that doesn't mean models are useless, nor that we can't have predictions that become more likely as time goes on.

[0] http://www.popsci.com/jessica-cheng/article/2008-09/ive-hear...

Another analogy: we can't look at a single atom of uranium and tell when it's going to decay - but that doesn't stop us knowing enough about the bulk nucleonics of uranium to be able to build sophisticated reactors.

Thank you, that helped quite a bit. It makes sense: predicting when a single individual will die (or what weather a single area will have) is different from observing the trend that males live to an average of 75 years. Makes sense to the layman!

For the same reason that, even though it's 'tea leaves and tarot cards' trying to predict the exact geometry of surface waves in a square meter of ocean, we can predict the time and height of a high tide with high accuracy.

Imagine a cannonball, on its flight through the air. Attached to it is a feather. We can't predict the fluctuations of the feather, as it is whipsawed by the airstream of the cannonball, but this has almost no effect on the trajectory of the composite.

Do a search for "Lorenz attractor". It's the result of an early numerical weather model.

The weather is a particular point on the twisted surface; it is impossible to calculate what point you'll reach at a given time, for reasons that you'll find in a search for "Lyapunov exponent".

The climate is the butterfly shape. If you run the simulation with different parameters, you can look and see how this changes.

The Lorenz equations are much simpler than a realistic weather model, but the same principles apply to the complicated models, and to the laws of physics that the atmosphere actually follows.

The analyses do not appear to take the size of the watershed into account either. A large watershed drains into this reservoir. As such, distant storms can still have a large impact on the water levels. Time to concentration (the amount of time it takes for water to flow through the watershed) will play a large role in the inflow rate.

As I understand it, the size of the watershed is known. Here's an estimate from a couple days ago: https://twitter.com/RyanMaue/status/831182017685495813

It is. The story of modern GIS systems is mostly about figuring out who dumped what <choose your own chlorinated organic compound> into a river.

And the expected temperature of the falling precipitation, and a whole lot of other meteorological stuff is involved, so in reality this dataset, while an admirable effort, is really useless, as there are way, way, way too many unaccounted-for variables.

This is exactly what happened in the 2013 Alberta floods: lots of rain on deep snowpack upstream of Calgary.


Thanks for the suggestion. We had a discussion internally as to why inflows are higher this season compared to previous periods with comparable precipitation levels. We thought it might be due to multiple storms this season or to discharges from the upstream reservoirs.

The timespan we used in the SQL query calculates the inflow/precipitation ratio based on the last few weeks and we use that as an input into the estimate.

The 1997 flooding in CA was mostly due to this issue of warm rain + snowmelt, so searching around for information about that might be good.

This guy seems to have testified about this flooding damage and claims 30 inches of rain around Oroville, with 18 inches of coincident melting, thus exceeding their worst-case scenario.


If I'm reading the data correctly, there is no snowpack above Lake Oroville right now, so it's all about rain at this point.


Looking at the map from the page you link to, I see significant amounts of snowpack at the stations above Lake Oroville: https://www.wcc.nrcs.usda.gov/webmap_beta/index.html#version...

We see the same thing in NOAA's regional snow analysis maps: https://www.nohrsc.noaa.gov/nsa/?region=Sierras

Thanks. This data seems to settle it - I'm seeing 20, 30, 40 inches at various sites within the watershed, at elevations from 5150 feet to 6800 feet.

Oroville itself is at 900 feet altitude, but the Oroville watershed extends into the Sierra Nevadas, including alpine valleys at 5000 feet. The Sierras have record snowpack this year (150 inches in places, so I hear).

Approximate watershed is seen in the map at: https://en.wikipedia.org/wiki/Feather_River

There isn't any snow left below 5,000 feet. Source: I'm in Reno at 4,415 feet.

Oroville is both north and west of where you are. Significant rain shadowing occurs east of the Sierra, and I think the "5000ft high" watershed number for Oroville may be low.

Yep all true. There's probably not any snow below even _6,000 or 7,000ft_ in the Sierras right now because the last storm, which was warm/rainy, melted it. This is another reason why the next storm won't put as much water into the reservoir.

I'm not sure where the snow is but it's not at 5000' in Plumas County this year. The ski and snowmobile club in La Porte is closed for lack of snow.

I'm having trouble pulling that map up, but I believe there's snowpack at 5500' and above right now, and there's a huge amount of land upstream of Lake Oroville that is above 5500'. My data is this: here's an Instagram tagged at Emigrant Gap (5200') taken four days ago.


I would note, as one example, that the ground saturation is insane. I have a sloped backyard in Moraga, CA that goes down to a creek which has been running non-stop since the first rains. It's been 3 days of sun and the ground is still squishy, and it's totally damp under the house. Pre-existing ground saturation will play a part in how fast the rain gets to the lake.

That's how the 1997 flood of Reno happened.


I was young, but I was there and remember it well. That flood was crazy. I remember being boxed in with my uncle and he had to do some crazy stuff to his truck's air intake on the fly just so we could get through a deep puddle and out of the box. In retrospect, we probably shouldn't have driven through deep water.

I could find only one instance of a word beginning with "ero". An article elsewhere was highlighting the possibility of catastrophic erosion. Outflow rates should have an associated catastrophic failure probability estimate, and then we could get a realistic cap on sustained outflows.

  Outflow rates should have an associated catastrophic failure probability estimate
The emergency spillway was originally rated for 350,000 cfs (cubic feet per second), but when it came into use, the erosion started causing a risk of catastrophic failure at just 6,000 - 12,000 cfs (this is what prompted the evacuation order, and the re-opening of the primary spillway to its capacity).

The catastrophic failure probability estimates simply haven't stood up to real-world testing.

Wow almost every statement here is wrong.

#1: the dam is managed in the winter about 1 million acre feet below the top, for flood control purposes. The spillway has been operating all winter long.

#2: The operator cannot decide to use the emergency spillway. Water just goes over it when it approaches the top of the dam.

#3: no water has gone over the dam. That would destroy it.

Regarding #2, they can choose to decrease flow out of the main spillway and let water go over the emergency spillway - from my understanding, that is exactly what happened over the weekend. They were trying to avoid continued damage to the main spillway.

Quoting from one of the LA Times updates linked from the link:

The result is a balancing act — drain as much water as possible as quickly as possible while trying to minimize further damage to the spillway. The main spillway was returned to action, but at reduced capacity compared to what it could normally do under the current conditions. After all, the spillway needs to last for the remainder of the rainy season.

The stated capacity of the main spillway is 250,000 cubic feet per second; once it became clear the emergency spillway was not holding up, they increased the main spillway to 100,000 cubic feet per second. So limiting damage to the main spillway is an ongoing concern.

The capacity of the intact spillway is ~250,000 cfs - the capacity of the Feather River through Oroville and Yuba City below, currently in the lower flood stages with 100,000 cfs outflows from the main spillway, is much lower than that. In addition to attempting to minimize further erosion of the main spillway, they set an outflow level that wouldn't flood the towns downstream.

It's difficult to describe the hydrological situation of Yuba City and Marysville to people who have never been there. The rivers run through the towns way up above the street level, owing to mining debris washed down from the mountains in the Gold Rush era, and later reinforced by Corps of Engineers works. So if the rivers overflow their banks in Yuba City, the city will be ruinously flooded.

It's really freaky to stand in downtown Marysville and look up at the river bank 15 feet above you.

Do they have release valves where they can release water and potentially cause minor flooding over a wide area instead of potentially catastrophic flooding wherever a "natural" breach occurs?

On Street View that doesn't appear to be the case at all. It looks like there's a 15-20 foot levee between Marysville/Yuba City and the Feather River.

Both the main and the emergency spillway send water into the Feather river.

When the reservoir is at capacity, the dam operator has no control over how much water goes into the Feather River nor whether downstream towns are flooded or not. Any water not going down the main spillway will go down the emergency spillway and into the river.

(edit for typo)

They can choose to use the emergency spillway. They just can't choose not to use it. That is, if water rises high enough, it will go out the emergency spillway, and they can't necessarily prevent the water from rising that high.

Why does water passing over the dam destroy the dam?

It's [1] an embankment dam, so basically just a pile of dirt/rocks. When the water goes over, the material of the dam goes with it [2]:

> Even a small sustained overtopping flow can remove thousands of tons of overburden soil from the mass of the dam within hours.

[1] https://en.wikipedia.org/wiki/Oroville_Dam

[2] https://en.wikipedia.org/wiki/Embankment_dam#Safety

Just to add: Water need not literally overtop the visible structure. The top few feet of the dam are not totally waterproof. Water pressure will get through material which is porous enough to deal with rainwater and such. Rather than waves overtopping, failure will start with small flows bleeding through near the top. These may run down inside the dam a little, only appearing on the surface a ways down the far side.

Here's a small-scale, but somewhat-satisfying view of what happens when an embankment dam is compromised.


Must be very small-scale, less than 1 pixel.

Here's one video showing the danger.


Basically, when the dam over-tops the water will cut a channel into which a little more water will flow and cut a little deeper into which more flows...and on and on.

It's a runaway failure and cannot be stopped.

Here's a quicker example of how just a tiny breach at the top will drain the entire dam:


Note in the first video it's the over-topping that causes the initial erosion/breach, rather than the artificial one in the second video, but the mechanism of further failure is essentially the same.

I'm not sure of the exact construction of the Oroville dam, but it's clear the emergency spillway was concrete-lined at the top, likely with this sort of failure in mind.

I'd be surprised if the main bulk of the dam didn't have at least some measures to stop this kind of failure, but someone with knowledge of its construction can chime in on this point.

In 1986, water flowed over the top of the Auburn Cofferdam and resulted in complete failure of the dam. See this video of that incident [1] for an example of how the overtopping of an earthen dam leads to failure.

[1] https://www.youtube.com/watch?v=tDmwo5nsWfQ

This seems to show that it was the "spillway plug" that eroded and failed. I don't know what a spillway plug is, but the narration seems to imply that it is designed to fail in emergency situations.

In the beginning of the video, the narration talks about the spillway plug being a feature designed to erode in a fashion that delayed/slowed the inevitable failure of the entire dam. In this case, the spillway plug helped the dam release water in a semi-controlled fashion over a series of hours, instead of an uncontrolled fashion in a matter of minutes.

If an embankment dam is overtopped, it is going to fail. They designed it with a slightly lower section on one end so they could be sure that if it failed, it failed in a predictable way.

Interesting footage. There's a good before/after comparison at the end:


Woah. Could you give me some more VCR, please?

This was the coffer dam for the construction of Auburn Dam, work on which was halted, but the coffer dam never removed?

Excellent video!

The main dam structure isn't built to withstand the water flow.

Wikimedia has a nice picture of the area:


The emergency spillway is on the far left; it is lower than the rest of the dam. So if water is flowing over the main structure, the emergency spillway has already failed anyway.

It's an earthen dam. If water goes over the top, it will start to erode the dam, resulting in more water going over the top, resulting in catastrophic feedback.

Because it's an earthen dam, not concrete. Water going over the top will erode the dam, lowering the height, causing more water to go over, which causes more erosion...

My understanding is that water flowing rapidly down the vertical surface of the dam and impacting its base rapidly erodes the face and base of the dam, quickly reducing its ability to withstand the pressure behind it (which also must be at its maximum level for the dam to be overtopped.)

For an earthen dam, erosion of the top of the dam when overtopping is also, I believe, a significant issue, but overtopping is a catastrophic failure issue for any dam.

Water isn't passing over the dam, it's passing over a much shorter wall to the side of the dam with unprotected earth around and below it. Large volumes of water tend to wash away dirt and could potentially undermine this side wall. So there is a potential for release of a very large volume of water, without compromising the main dam itself, because the erosion cuts a new exit channel for the lake.

People have been saying that if the emergency spillway fails, it will release the top 30' (?) of the lake. What determines that height? Is there bedrock or something else expected to stop erosion 30' below the level of the emergency spillway or something?

Good question! According to this article, the bottom of the emergency spillway is indeed bedrock: http://www.drroyspencer.com/2017/02/why-the-oroville-dam-won...

Yes, it's basically on bedrock.

This image of the main spillway gives you an idea of what they're building on: http://i.dailymail.co.uk/i/pix/2017/02/13/08/3D252AE20000057...

You can see that they blasted into the rock to make the spillway.

However, the idea that erosion will undercut the emergency spillway and cause it to fail implies that it's not built directly on the rock. This suggests that there could be more than 30' of draining.

The emergency spillway is a 30 ft concrete wall in the side of the lake.

If you look at the photos, some on Wikipedia, you can see the spillways are around a bend in the river from the main dam and pass over a high bedrock canyon wall.

>>Why does water passing over the dam destroy the dam?

>Water isn't passing over the dam

The question is about what happens when water passes over the dam.

If reverse Superman flew up and dropped a mountain in the spillways so the dam could be overtopped, the dam would begin to erode.

Whichever part of the dam was lowest would see the fastest flowing water and would erode the most quickly. As a chasm was cut down through the dam, the water would flow faster and faster through it - if the height of the water above the bottom of the cut is 1 foot, only the top foot of water is "pushing" the water over; if the cut is 10 feet deep, now the top 10 feet of water is "pushing" water out the hole, and it will be pushed much faster.

This will turn into a feedback loop, and the water will rapidly cut a channel through to the bottom of the dam. As water flows through it, faster and faster, it will weaken the sides of the chasm, and more of the dam will collapse. This will let more of the water flow through, but slower, eventually limiting further erosion.

This is essentially the same thing happening in the spillways right now, but slower, because the spillways are designed to resist erosion. Once the vicious cycle gets started, though, the end result is the same.
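The head-driven feedback described above can be sketched with a broad-crested weir formula, Q ≈ C·L·H^1.5, which shows why discharge grows faster than linearly as a cut deepens. The coefficient and breach width below are purely illustrative, not actual Oroville figures:

```python
# Broad-crested weir approximation: Q = C * L * H**1.5
#   C ~ 3.0 (US customary units), L = breach width (ft),
#   H = head of water above the bottom of the breach (ft).
# All values here are illustrative, not real Oroville numbers.

def weir_flow_cfs(head_ft, width_ft, c=3.0):
    """Approximate discharge (cfs) over a rectangular breach."""
    return c * width_ft * head_ft ** 1.5

# A cut 10x deeper carries ~31x the flow, not 10x -- hence the feedback.
for head in (1, 10, 30):
    print(f"{head:>2} ft of head -> {weir_flow_cfs(head, 100):,.0f} cfs")
```

The 3/2-power dependence on head is the whole story of the "chasm cutting" feedback: each foot of erosion raises the head, which raises the flow superlinearly, which accelerates the erosion.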

The dam crest is at 922 feet, but the emergency spillway is at 901 feet and very, very wide. It's designed to make the dam impossible to overtop. Even the worst-case scenario this year would not lead to overtopping the dam.

The question is about why water flowing over the dam would damage it. Clearly they have engineered the spillway to prevent that, so there must be a reason, which is why the person asked.

If someone asks you what would happen to the people inside of a submarine if they let it fill with water, you don't respond by saying they design submarines to prevent them from filling up with water.


The emergency spillway is 21 feet lower than the dam crest and very, very wide. It should be impossible for the main dam to overtop from water inflow.

But there was some concern during design about tsunami. It's important not to overtop even in a brief event like that.

Kind of a fun study, enabled by the presence of so much public-domain data from CA DWR.

One piece that's missing from this analysis is upstream precipitation. You are sensing precip at Oroville itself, but water in the dam comes from everywhere upstream (3600 square miles according to [1]). As well as (potentially) snowmelt.

People doing forecasting of reservoir levels will be integrating distributed precipitation information, snowpack, and temperature, with a soil runoff and routing model to get predictions.

[1]: http://cdec.water.ca.gov/cgi-progs/profile?s=ORO&type=dam

Indeed. If we want to leave the analysis to professionals we would look at the CDEC river guidance for the Feather above the dam. Here it is.


As you can see, the forecast flow is only about one third of that experienced this past Friday.

Thanks for locating that. I work with people who do retrospective runoff and streamflow modeling (using past measurements to make better models), but not the predictive stuff in the graphs you linked.

NOAA has nationwide river sensor and guidance data if you are interested in a river near you.


This is terrific. Would be a great resource for journalists to identify and understand some of the key points that can then be shared in a more public-friendly format. I wish all stories had this kind of accessible data “behind the news” that could be explored!

Here's a great article on climate change's effects on our water infrastructure in light of the Oroville Dam's situation: https://www.nytimes.com/2017/02/14/opinion/what-californias-...

An example - the bulk of California's water storage is in the natural reservoir of mountain snowpack. The large volume of precipitation this year, much of it as high-altitude rain falling on snow, has caused an extraordinary amount of snowmelt in a short period of time. The natural reservoir of snowpack is rapidly released into our manmade reservoirs and strains or exceeds their capacity. This cycle of alternating extremes, drought to wet, is expected to continue, and our water infrastructure's capabilities must be planned in light of that.

This post [1] from that thread is the simplest presentation of the current and expected scenarios I've seen anywhere, including the OP. If they are able to run the main spillway at the current rate of 100,000 cfs without concern for further major erosion, the outflow outpaces anything but the highest inflows from the recent storms.

[1] https://www.metabunk.org/oroville-dam-spillway-failure.t8381...

Those are the most informative pictures of the situation I've seen yet.

Is welding giant metal sheets over the hole an option?

No. You would need to anchor the metal plates to something, which means you'd need to drive spikes down into bedrock and then weld things to the spikes.

Would it be feasible to dig out another minor temporary overspill or diversion out of the basin into another one somewhere far away from the dam, and more importantly the emergency spillway?

Dry Creek basin seems to have some points where it's just a few meters of a hump over the emergency spillway level.




It's unnecessary. The current emergency spillway is already on the other side of a hill made out of bedrock from the dam itself.

All of the concern about the 'failure' is a matter of loss of control of water release, not of breach of the main dam structure.

I thought the main danger, and the reason for evacuation, was a potential uncontrolled failure, due to headward erosion, of the emergency spillway structure. (That's the structure between the regular spillway and that parking lot.)


Yes -- which would only result in the loss of control of the amount of water above the level of the bottom of the spillway, or the bedrock under the spillway. That's maybe the top 30 feet at the most.

While that's still a lot of acre-feet, and it would create a 'wall of water' situation in the first surge (hence the evacuation), it's not as significant as the dam failing and the entire reservoir emptying through it.
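For a back-of-envelope sense of "a lot of acre-feet": assuming a nominal full-pool surface area of roughly 15,800 acres (an assumed figure for illustration; the actual surface area shrinks as the level drops, so this is an upper bound):

```python
# Upper-bound sketch of the volume in the top 30 ft of the reservoir.
# ASSUMED full-pool surface area of ~15,800 acres; the real surface
# area decreases as the lake drops, so the true volume is smaller.

surface_acres = 15_800   # assumed full-pool surface area
drawdown_ft = 30         # depth released if the emergency spillway fails

upper_bound_af = surface_acres * drawdown_ft  # acre-feet
print(f"<= {upper_bound_af:,} acre-feet")
```

Hundreds of thousands of acre-feet released quickly is catastrophic downstream, yet it is still a small fraction of the roughly 3.5 million acre-feet the full reservoir holds.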

I'm sure this is a stupid question, but would it be possible to build essentially a large "siphon", so that the discharge of the water didn't result in erosion directly below the spillway?

Seems like it would be pissing in the ocean (or I guess that analogy in reverse?), but when I saw the newsreels of the helicopters dropping bags of rocks into the spillway hole, that also seemed like it would take a couple thousand of those drops to make a difference.

They're letting 100,000 cubic feet per second flow over the spillway. Niagara Falls has an average flow rate of 85,000 cubic feet per second.

I don't think you're going to be able to make a siphon quickly that is large and robust enough to handle a significant fraction of that.
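To put 100,000 cfs in the acre-feet the rest of the thread is using, a quick conversion to acre-feet per day:

```python
# Convert 100,000 cfs to acre-feet per day.
# 1 acre-foot = 43,560 cubic feet (1 acre * 1 ft); 1 day = 86,400 s.

CFS = 100_000
SECONDS_PER_DAY = 86_400
CUBIC_FT_PER_ACRE_FT = 43_560

acre_ft_per_day = CFS * SECONDS_PER_DAY / CUBIC_FT_PER_ACRE_FT
print(f"{acre_ft_per_day:,.0f} acre-feet/day")  # ~198,347
```

Roughly 200,000 acre-feet per day: any siphon would have to move a meaningful fraction of that to matter.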

You're basically talking about building another emergency spillway. We could build one, sure, but why would it be any better than the existing one? It would still have the same risks of e.g. rapid erosion (making a concrete one isn't going to be any quicker than repairing the main spillway, I would think), and whatever was at the bottom of it would probably be less prepared to receive a surge than the river is.

This is a great analysis.

However, this seems a little off: "Based off of our estimate, for every inch of rainfall at the Oroville dam, 136,790.5 acre-feet will be added to the reservoir." The relationship probably isn't linear. Much less of the first inch of rain makes it to the reservoir vs. the 5th inch of rain.

Hopefully, the next iteration will project how many inches of rain it will take this weekend to top the spillway again.
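A quick sanity check on that 136,790.5 figure: spreading one inch of rain uniformly over the ~3,600 square mile watershed cited elsewhere in the thread, the article's estimate corresponds to a runoff coefficient of about 0.7, which is at least plausible for already-saturated ground (the watershed area and the uniform-rainfall assumption are both simplifications):

```python
# Sanity check on the article's linear estimate of 136,790.5 acre-feet
# of inflow per inch of rain, against one uniform inch over the
# ~3,600 sq mi watershed mentioned in the thread.

ACRES_PER_SQ_MI = 640
watershed_acres = 3_600 * ACRES_PER_SQ_MI   # 2,304,000 acres
inch_of_rain_af = watershed_acres / 12      # 192,000 acre-feet per inch

runoff_coeff = 136_790.5 / inch_of_rain_af
print(f"Implied runoff coefficient: {runoff_coeff:.2f}")  # ~0.71
```

A constant 0.71 is exactly the linearity being questioned above: real runoff coefficients rise as soil saturates, so the first inch should contribute less than the fifth.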

They have models for runoff. In my previous career we used TR-55 analysis of watersheds, which included surface types, slopes, and areas. Once the model is built, you model for various storms. We didn't have to deal with mountain snowpacks, which add another factor.

Edit: I think newer programs to model exist now. Some are here https://www.nrcs.usda.gov/wps/portal/nrcs/detailfull/nationa...

165 page PDF describes it. https://www.hydrocad.net/pdf/TR-55%20Manual.pdf

It is quite a bit off. The Upper Butte Creek Watershed, which is the watershed that drains into Lake Oroville, is 3200 square miles. Much of this watershed is mountainous. The amount of water that will flow into Lake Oroville is heavily dependent on temperature (what forms snow vs. what melts snow), the existing snowpack, the ground saturation state, and precisely where in that 3200 square miles the rain falls.

> Much less of the first inch of rain makes it to the reservoir vs. the 5th inch of rain.

Depends on whether the ground is already saturated.

Even if the ground is saturated, plants still will absorb more of the first inch than the 5th.

I don't think that's right. If the ground is saturated, the plants won't absorb any of it, because it won't sink into the ground at all, because the ground is saturated.

(That is, unless you're talking about plants absorbing water from their leaves directly, before the water hits the ground. I don't know that plants do that when their roots already have plenty of water, but I also don't know that they don't.)

Several pilots have published videos of overflying the Oroville Dam. Here's one from a guy who makes some good videos:


The money shot is at 2:25

That's a lot of boats up just past the dam. I'm surprised their owners aren't hastily trying to get them out of the water. If the dam fails in a major way, god only knows what will happen to the boats...

No one is worried about the dam failing. The emergency spillway is the problem. It's the hillside you see in the video with water flowing down it, to the left of the main spillway.

If the emergency spillway failed, it could lower the water level in the reservoir by 30 feet. That would be catastrophic downstream, of course, but probably wouldn't have much effect on the boats.

> No one is worried about the dam failing

My wife, a Water Resources Engineer, is. If the damage to the spillway starts to move sideways, or uphill too far and too deep, you could see the main dam get damaged. It's not as likely as the emergency spillway dropping things 30 feet, as you say, but it's still on the table as a worst-case scenario here.

Oh my. I stand corrected.

Well, let's hope for everyone's sake that this worst-case scenario does not happen!

Insurance claims.

Might insurance exclude this type of event?

If they deem it an "act of God" (whatever that means), it could be excluded.

Thanks for putting this together. Keep me in your thoughts, guys. There's a lot of misinformation about what's going on up there. I'm in Yuba City, 2 miles away from the Feather and 30 ft up.

They reduced the mandatory evacuation order but more and more emergency personnel are arriving. There are (unfounded at the moment) local reports of water seeping underneath the emergency spillway weir.

As always, the quirky units Americans insist on using never cease to amaze.

Acre-feet. Of course.

The only thing missing here is nautical miles and football fields.

Football fields are area though, not volume. Maybe washington-monument-football-fields for an easy and relatable unit?

Then to make it more authentic, have other things like Liberty Statue College Football Fields and Washington Monument NFL Fields.
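For anyone who wants the quirky unit in more familiar terms, the conversions are straightforward (one acre-foot is exactly one acre covered one foot deep):

```python
# One acre-foot, expressed in other common volume units.
CUBIC_FT = 43_560                  # exact: 1 acre (43,560 sq ft) * 1 ft
CUBIC_M = CUBIC_FT * 0.3048 ** 3   # ~1,233 cubic meters
GALLONS = CUBIC_FT * 7.48052       # ~325,851 US gallons

print(f"1 acre-foot = {CUBIC_FT:,} ft^3 "
      f"= {CUBIC_M:,.0f} m^3 = {GALLONS:,.0f} gal")
```

Handy rule of thumb: an acre-foot is about what one or two suburban households use in a year.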

Oroville Dam feeds into the Feather River, Folsom Dam feeds into the American River. Here is the kicker: both the American and the Feather feed the Sacramento River - a major bottleneck.

Folsom Dam has been spilling at insane rates the past few days. I wish people living in Sacramento County (my parents) would take things more seriously.


Sacramento will inevitably be destroyed without warning when an earthquake strikes Folsom. Folsom dam is basically a trade where Sacramento gains temporary flood relief in exchange for a certainty of being wiped out by the instrument of that relief. There is a strong economic argument that this makes good sense.

Well, first of all, Folsom has little to no seismic activity, nor does Sacramento. If an earthquake hit Folsom hard enough to impact the dam, I'm pretty sure the Bay Area would be eviscerated. But if the dam were to falter, the water would actually just go into Lake Natoma and sit behind Nimbus Dam rather than turning Sacramento into a lake. Even if both dams failed fully or partially, the Sacramento levees would experience an increase in flow and volume. You could presumably have a breach in the levees in one or more places, but one thing to know is that much of the levees surrounding downtown Sacramento are rated for a 500-year flood, partly due to the capitol being in Sacramento.

There are no major faults in that area, however. Folsom is on very seismically-stable ground.

I realise this is done to show off the ATSD tool, but it would also make a really good example using a Jupyter Notebook. GitHub will process and render those itself.

Example: https://github.com/benlaken/Comment_BadruddinAslam2014/blob/...

It's great that you're building on this dataset but I'm not sure you're well-placed to pronounce on it being a disaster. Database expertise doesn't make you an expert on civil engineering.

I really enjoy the graphs and analysis produced by this project, but does it irk anyone else that the Y-axis does not start at 0 in all the graphs? There are some graphs further down the page that start at 0, and they seem to show the data so much better (water is increasing, but not 50x or whatever).

What ever happened to the Mosul dam? Stories from about a year ago made it sound like collapse was imminent, but it hasn't happened yet. Did they do repairs, or are they just unsure when it will happen?

The New Yorker did a Reporter at Large on it last month: http://www.newyorker.com/magazine/2017/01/02/a-bigger-proble...

Sounds like many are still worried about it, but there's really no consensus.

It looks like someone spent quite some time doing this during working hours. Is it to showcase something ("where is the ad?"), or did someone ask and pay for this analysis?

The ad is in this sentence: "Additionally, this article illustrates how publicly available data from the California DWR can be easily loaded into the non-relational Axibase Time Series Database (ATSD) for interactive analysis with graphical representation of open data published by government organizations."

The ad is also in the URL:

https://github.com/axibase/atsd-use-cases ...

Still, it's a TSDB that I wasn't aware of, so it worked on me!

Actually a really practical way of showcasing a product without markitecture brochureware. A GitHub repo and links to Docker images.

This is a very educated estimate. Let's hope it rains less than 2 inches per day.

Will a lot of infrastructure survive the new weather patterns?

No, we've done a very poor job of even keeping up with maintenance in this country. Plus we've pushed the boundaries of building where we shouldn't. It won't end well for a lot of people.

"Disaster" ? not yet (and I hope not ever)

Please tell me Betteridge's Law of Headlines does not apply here...

Does Betteridge's Law of Headlines Apply Here?

No. At least not strictly speaking.

The dam is in no danger. The spillways are the problem. The main spillway is badly damaged, and the emergency spillway was in some danger of collapse on Sunday.

The main spillway is a long straight concrete chute with a gate at the top to regulate the flow. If the erosion in this spillway worked its way up to the top, it could damage that gate and operators would lose the ability to control the outflow.

The emergency spillway is a low concrete wall with a hillside below it. Its top is lower than the dam top, so water will go over this wall (and did on Sunday) instead of overtopping the dam itself. The worry here is that erosion of this hillside could cause the wall to collapse.

Both spillways are separated from the dam by a large hill. Even if they failed, the dam would survive. So there wouldn't be a flood of all 700 vertical feet of the reservoir, but there could be a flood of 30 feet from the emergency spillway.

Of course if you were in the path of that flood, the distinction between the "dam" and the "spillway" would be small consolation.

The Metabunk page that seanp2k2 linked has some good information and discussion:


And the YouTube video that clamprecht linked has a good view of the dam and spillways on Sunday:


What is the point of all this data and analytics if the author does not even speculate on an answer to the question that is being asked?

I never considered it, but maybe climate change will help clear some of our waterways of harmful dams.

Bad news, significant flood control+reservoir dams will certainly be rebuilt.

So if you are rooting for big dams to be destroyed by weather, you are just rooting for short term harm and suffering, there won't be any long term changes to the river systems.

Small dams are great. Per dollar they can store a lot more water than large dams (partly due to compacted earth vs concrete construction), they don't incur large water transport costs and evaporation/salination, and they reduce regional flooding (again, because they store more water per dollar) and improve on-farm water security. In terms of liters stored per hectare, there are more opportunities for small-scale dams than large dams. Like in Australia, every farm should have its own system of dams.

Scale matters. Like all things, there's a "right size" for everything. We've been focusing almost exclusively on huge dams, but neglecting small [property scale] dams. At a smaller scale there are many advantages: significantly more 'edge' for beneficial biotic interactions (edges between biomes are more productive than either biome on its own), the topsoil can be economically removed during dam construction to avoid methane release and wasted biomass (impossible in a huge scale dam), runoff and on-farm erosion is reduced by strategic dam placement within the landscape, and the water can be directly used on the farm with no transaction overhead or canal network (just a simple system of gravity fed pipes).


>significant flood control+reservoir dams will certainly be rebuilt

I wouldn't necessarily adopt such a pessimistic attitude. Your analysis is predicated on the idea that

A) the society will have the resources to rebuild thousands of [often economically futile] dams, and

B) the society will neglect to account for the ecological and indirect economic costs of building a dam (reduced forestry yield due to phosphorus deprivation from salmon runs, increased insurance cost due to delta erosion, etc) versus the water resources and flood control abilities, which are constantly degrading due to silt buildup.


Depends how fast the changes come, and how they compound.

This certainty is unwarranted, in the recent political context.

Being utterly indifferent to thousands of lives being massively disrupted is unwarranted in pretty much any context.

I'm not indifferent. I just made an observation. Such changes would come with pros and cons. Many lives would be saved by the removal of dams. Perhaps you are indifferent to the lives that were disrupted by the creation of the dams?

That's pretty fuckin' rich. With the possible exceptions of wars and nuclear-plant meltdowns, nothing disrupts lives on the scale of large dams. I didn't vote for the Orange One, but this is why he won.

If it helps, I'm not coastal or elite.

It's certainly a fair criticism of my phrasing, but having your home washed away due to weather is quite different than being compensated to uproot.
