
Or the normal usage of normal, or the mathematical definition of normal: https://en.m.wikipedia.org/wiki/Normal_distribution



The normal distribution does not define a mathematical meaning of the term "normal". Any distribution can define a normal range relative to that distribution.


Given an expected normal distribution of soil dryness, you can compare a given year's distribution against it.

> Any distribution can define a normal range relative to that distribution.

Any particular sample will always have a corresponding probability under a normal distribution. So what you're saying is kind of right: values always fall within the range of a normal distribution. That's not what it means to be outside the normal distribution, though. If another year's distribution only partly overlaps with the expected one, you'll have an area that does not overlap despite being within the same range. That area is the fraction of values outside the normal range.
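
A quick numeric sketch of that, with made-up parameters (a unit-variance expected distribution and a hypothetical drier year shifted by 0.8): the non-overlapping area of the two PDFs is the fraction of values outside the normal range.

    # Estimate the fraction of a shifted year's distribution that falls
    # outside the expected one, via the non-overlapping area of the PDFs.
    # All parameters here are invented for illustration.
    import numpy as np
    from scipy.stats import norm

    expected = norm(loc=0.0, scale=1.0)   # expected (climatological) distribution
    this_year = norm(loc=0.8, scale=1.0)  # hypothetical drier year, mean shifted

    x = np.linspace(-6, 8, 10_000)
    overlap = np.trapz(np.minimum(expected.pdf(x), this_year.pdf(x)), x)
    print(f"overlapping area: {overlap:.3f}")      # ~0.689
    print(f"fraction outside: {1 - overlap:.3f}")  # ~0.311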


Precisely speaking, "dryness" cannot be normally distributed to begin with, as it is a value on a finite interval.


Normal distribution of logarithmic dryness, then.
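
As a sketch of that idea, assuming log(dryness) ~ N(mu, sigma) with invented numbers: a "one standard deviation drier" threshold is set in log space and back-transformed to the original scale.

    # Hypothetical log-normal model of dryness; mu and sigma are made up.
    import numpy as np

    mu, sigma = np.log(0.3), 0.4  # assumed mean and sd of log(dryness)
    cutoff = np.exp(mu + sigma)   # one sd drier than normal, on the dryness scale
    print(f"drier-than-normal cutoff: {cutoff:.2f}")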


Sure, as an approximation. But using a more appropriate distribution matching the statistical model would be preferable.


No, it's literally done with a log-normal distribution: https://www.hydrol-earth-syst-sci.net/12/1339/2008/hess-12-1...

Log-normal is extremely common in hydrology. It turns out an anomaly in this case is defined as falling outside one standard deviation[1], so in a perfectly normal year you would expect ~15.9% of soil to be drier than normal.

[1]: http://edo.jrc.ec.europa.eu/documents/factsheets/factsheet_s...
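
The 15.9% figure is just the one-sided tail of a normal distribution beyond one standard deviation, which is quick to check:

    # P(Z > 1) for a standard normal: the expected "drier than normal"
    # fraction in a perfectly normal year, given the one-sd anomaly
    # definition cited above.
    from scipy.stats import norm

    print(f"{1 - norm.cdf(1):.1%}")  # 15.9%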



