Very true. The twisted irony is that the long-term risks of academia are devastating. We have a lot more short-term job volatility in software, but unless we work for some outlier asshole who torches our reputations, we can usually get new jobs. Academics almost never get fired except at legible points (e.g. tenure review), but when they do, it ruins their lives in a way that most of us have never experienced.
The only thing controversial about this article should be whether it applies to quantum phenomena. The author touches on Bell's theorem etc., but that might distract from the main part of the argument: chance doesn't cause anything, at least on an ordinary human scale. Chance is the catch-all word we use for causes we don't understand.
Probability doesn't describe reality but rather our understanding of reality. If you flip a coin and look at it without showing me, its state is certain for you and random for me because you know something I don't know. If I assign it a 50-50 chance of being heads, I'm making a statement about my uncertainty regarding the coin, not the coin.
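The coin example can be sketched in a few lines of Python (variable names are mine, purely for illustration): the coin's state is fixed, and the two people assign different probabilities only because they hold different information.

```python
import random

# The coin has landed; its state is a fact, not a distribution.
outcome = random.choice(["heads", "tails"])

# The person who looked at the coin knows the outcome,
# so their probability of heads is 0 or 1.
p_looker = 1.0 if outcome == "heads" else 0.0

# The person who hasn't seen it assigns 0.5 -- a statement
# about their own uncertainty, not about the coin.
p_blind = 0.5
```

Same coin, two different probabilities: the difference lives in the observers' information, not in the object.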
I've seen leaders who don't delegate but abdicate. They hand over a project completely, refusing to make decisions that only they are authorized to make. It's as if they're saying "Guess! And if it turns out poorly, I can say I never told you to do whichever course you take."
I've always found M-D-Y apt for many use-cases, because I often know exactly which year is being referred to - I just need to know the month and day. With M-D-Y, that information is immediately presented. It seems like a decent compromise.
My favorite format is YYYY-MMM-DD: year, three-letter (English) month abbreviation, day (e.g. 2013-Nov-29). It only works for people who know the English month names, but it eliminates ambiguity about which part is the day and which is the month.
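For what it's worth, this format is easy to produce with standard date-formatting directives; in Python's `strftime`, `%b` is the abbreviated month name (note it is locale-dependent, so it only yields "Nov" in an English locale, which matches the caveat above):

```python
from datetime import date

d = date(2013, 11, 29)
# %Y = 4-digit year, %b = abbreviated month name, %d = zero-padded day
print(d.strftime("%Y-%b-%d"))  # 2013-Nov-29 (in an English locale)
```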
In my opinion, Python is good enough at a lot of different tasks. I don't think it's the best possible language for everything, but it's close enough that it beats having to keep more languages in your head.
I do a mix of math programming and general programming and I'd like to stay in one language if feasible. I would far rather use a well-designed, general-purpose language with math libraries than try to use a mathematical language for general-purpose programming.
I don't think that human nature changes over history, but culture does. I agree that attention spans have become shorter, but I don't see how that relates to whether programming languages will expand their palette of symbols.