Enigma was broken by Polish mathematicians in 1938, then improved by the Germans, then broken again. The Poles gave all their discoveries to France and Great Britain; they also invented the first "bomba", which worked mechanically.
It's a complicated story, and these people deserve credit along with Turing. But until the 1970s it was classified information, and to this day everybody talks only about Turing.
Maybe it's my nationalist bias at work, but I'd prefer that the whole story were better known.
Also, if the author wanted to talk about Bayes' Theorem and the WWII cryptanalytic effort, there's a much nicer example from the Japanese codes: http://blog.jgc.org/2010/07/bayes-bletchley-jn-25-and-modern...
(Of course, being a physics major, I don't mind having to hurt my brain a bit to get through it, so long as it's good.)
To be fair, though, I think he discovered it before probability theory was formalized, and I can see how it may have been non-obvious at the time. Sometimes the best progress makes challenging insights appear obvious in hindsight - nice work, Laplace!
Even in this article, stuff like:
Finally, in 1983 the US Air Force sponsored a review of NASA's estimates of the probability of shuttle failure. NASA's estimate was 1 in 100,000. The contractor used Bayes and estimated the odds of rocket booster failure at 1 in 35. In 1986, Challenger exploded.
makes me uneasy. The rhetorical framing is too heavy-handed.
(Not trolling, I'm honestly curious. My mental association for Bayesians is people who want to tag documents, diagnose illnesses, or similar AI tasks.)
However, for some unknown reason, Bayesian analysis has also become a trendy buzzword among a huge number of crazy internet trolls who seem to think it's a magic formula that can solve all problems, and who have some paranoid delusion that "they" are trying to suppress the knowledge of Bayesian analysis.
The costs of teaching and learning Bayesian Analysis are low (it's just not as hard as, say, the method of moments), and it does have benefits.
My old stats 201 book (Wackerly, Mendenhall and Scheaffer) covers Probability, Discrete Random Variables, Continuous Random Variables, Multivariate Distributions, Functions of RVs, The Central Limit Theorem, Estimation, Properties of Point Estimators and Methods of Estimation, Hypothesis Testing, Linear Models / Least Squares, Designing Experiments, Categorical Data, and Nonparametric Statistics. 15 topics (including the introduction), and Bayesian analysis isn't mentioned. Bayes' Law is (of course), but only as a theoretical tool, and for solving toy problems about pirates, beads, and rats in the second chapter.
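For anyone who hasn't seen one, a toy problem of the beads-in-jars sort might look like this (the specific numbers here are my own invention, not from the textbook):

```python
# Toy Bayes' Law problem: jar A holds 3 red and 1 blue bead, jar B holds
# 1 red and 3 blue. A jar is picked at random and a red bead is drawn.
# What's the probability it came from jar A?

p_jar_a = 0.5           # prior: each jar equally likely to be picked
p_jar_b = 0.5
p_red_given_a = 3 / 4   # likelihood of drawing red from each jar
p_red_given_b = 1 / 4

# Total probability of drawing a red bead at all
p_red = p_red_given_a * p_jar_a + p_red_given_b * p_jar_b

# Bayes' Law: P(A | red) = P(red | A) * P(A) / P(red)
p_a_given_red = p_red_given_a * p_jar_a / p_red

print(p_a_given_red)  # 0.75
```

That's the whole trick: invert the conditional by reweighting likelihoods with priors. The gripe above is that courses stop here instead of building real inference on top of it.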
You wouldn't take an engineering analysis book seriously if it didn't mention FEA, but statistics courses can hold their heads up while completely ignoring a useful and easy-to-teach tool.
Of course, good statisticians and mathematicians will learn about it (later, or on their own), but there are legions of economists and engineers coming out who will never bother to wrap their heads around it.
Of course, it's entirely possible that it's not so much a conspiracy spearheaded by old-guard frequentists as introductory stats courses being focused on teaching a core of theory (LS and MoM) rather than teaching practical tools to people who will use them. You could also accuse introductory math courses of ignoring useful, fun, and easy stuff (scaling?) while focusing on an old, predefined, widely accepted body of theory.
In fact, most (at least in my circle) use frequentist methods too, along with methods that don't fall comfortably into either camp. In ML it's particularly common for the same researcher to use any/all of kernel density estimation, Bayesian graphical models, SVMs, etc., depending on the problem.