Hacker News

2000 years for events like this to make a “record” significant. Particularly when the records are of the sort “11th consecutive day with at least 8”.

Ever listen to a baseball game with an announcer who loves stats? They can find something about nearly every game that breaks a record of some sort.



It's true; that's the thing about lies, damned lies, and statistics: the more creative you are, the easier it is to invent new record-breaking stats.

2000 years seems somewhat arbitrary - would you apply the same threshold to "500 tornadoes within a 30 day span", what about 1,000, what about 10,000?

There probably ought to be a threshold where extraordinary evidence is sufficient to make extraordinary claims; otherwise, if your requirement is 2000 years, I'm sorry to report that nothing could ever persuade you about climate science, or really most science.


I was thinking more about it, feeling a bit bad about my somewhat glib comment, and I came back to add: I think as long as you can come up with equally impressive-sounding statistics pushing the other side of the narrative, it's no use.

But to your point, annual statistics showing a multiple-sigma deviation for several years out of a 10-year period; that, to me, would tell us something has changed.

Then it occurred to me that I don’t know anything about the number of tornadoes per year, how much variance is in the data, or what the trend looks like, but a quick Google search brought me here:

https://www.ustornadoes.com/2012/04/10/violent-f4ef-4-and-f5...

And it sure looks like the “Violent US Tornadoes by Year” has massive variance and little to no discernible trend line.

What I learned is that almost all violent tornadoes happen in April and May and some years have huge clusters and some years have very few.

This kind of phenomenon requires a very long historical series coupled with a long stretch of deviation from the norm to be able to say anything meaningful about recent data, other than you are most likely seeing new random output from the same underlying/unchanged function.
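A toy simulation makes this concrete (all numbers here are illustrative assumptions, not real tornado data): even a completely trend-free series of annual counts will set its all-time “record” during the most recent decade about 10% of the time, simply because the maximum of identically distributed years is equally likely to land anywhere.

```python
import random

random.seed(42)

def record_in_last_decade(years=100, mean=50, trials=2000):
    """Fraction of simulated trend-free series whose all-time annual
    record falls in the final 10 years.  Yearly counts are approximated
    as Gaussian noise around a fixed mean (an illustrative assumption,
    not a fitted tornado model)."""
    hits = 0
    for _ in range(trials):
        # Same mean every year: no trend, no climate signal.
        counts = [random.gauss(mean, mean ** 0.5) for _ in range(years)]
        # Did the final decade beat every earlier year?
        if max(counts[-10:]) > max(counts[:-10]):
            hits += 1
    return hits / trials

print(record_in_last_decade())
```

The printed fraction comes out near 0.10, which is just 10 record-eligible years out of 100; “new record in the last decade” is the expected outcome of pure noise, not evidence of change.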

In short, I hate news stories about the weather. They need to make something inherent about the world seem new.

EDIT: Here’s another good one from the same site;

“I decided to take a look at the number of tornadoes that have been reported in the Plains between May 16-31 over the past 25 years, from 1993 to 2017. ... The most was 204 in 2004, while the least was just nine in both 2006 and 2009.”

In that roughly two-week window (which coincides with the current burst of activity), the last 25 years saw as few as 9 tornadoes in some years and as many as 204 in one.

What’s so discernible about this year in that context?

[1] - https://www.ustornadoes.com/2019/05/14/how-peak-tornado-seas...


Careful, that sort of logical thinking might inspire questions about the whole tottering edifice...



