Thanks for that. I've been thinking about giving Lisp a try lately. This tutorial/presentation was very insightful, and I'll try out kons-9 to pick up some Lisp.
Hold your horses. I think the title overstates the impact of those old wheat varieties. Yes, their genetic diversity is high. But today's wheat is less diverse largely as a side effect of one of the most astonishing feats ever pulled off by a single human: Norman Borlaug. His breeding program in the 1940s was the start of the “Green Revolution”. He's the guy credited with saving billions from starvation.
For further reading about his story, I can highly recommend “The Wizard and the Prophet” by Charles C. Mann [2].
Also made this exact association. My takeaway: when estimating unknown quantities in software development, you can treat the means as infinite, making any project unestimatable (management hates this reasoning - a friend told me, pinky promise); otherwise, assign some non-zero probability that the true mean will be 10-100x the estimate.
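The heavy-tail intuition behind "treat means as infinite" can be checked numerically: for a Pareto-tailed distribution with tail index alpha <= 1, the theoretical mean is infinite, so running sample means never settle down. A minimal sketch (the alpha value and sample counts are illustrative assumptions, not from the comment):

```python
import random

random.seed(0)

def pareto_sample(alpha):
    # Inverse-CDF sampling: U**(-1/alpha) has a Pareto tail with index alpha.
    return random.random() ** (-1.0 / alpha)

# With alpha = 0.9 the theoretical mean is infinite: running sample
# means keep jumping upward as ever-larger outliers arrive.
total = 0.0
means = []
for n in range(1, 100_001):
    total += pareto_sample(0.9)
    if n % 20_000 == 0:
        means.append(total / n)

print(means)  # no convergence: occasional huge jumps dominate
```

With alpha > 1 the same loop would stabilize near the finite mean, which is the usual justification for trusting averages at all.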
Current world-wide oil production is c. 83 million barrels per day [1]. With the estimated 511 billion barrels, about 2x Saudi Arabia’s reserves (2005, [2]), that would last for about 17 years.
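The back-of-the-envelope arithmetic, spelled out:

```python
daily_production = 83e6   # barrels per day, world-wide (figure from [1])
reserves = 511e9          # barrels, the estimate quoted above

years = reserves / (daily_production * 365)
print(round(years, 1))  # → 16.9
```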
The graph named “non-linear growth” is actually showing linear growth. I know it’s confusing, but as long as the factor is constant (10), growth is linear.
A quick way to check whether something grows linearly is to put it on a log scale and see whether it’s a straight line.
Nice explanation, though. We should talk about logs more often.
I think you and the author are using terminology differently. That graph is absolutely a logarithmic axis to me. The five ticks on the axis are equidistant, but each represents a number 10x the previous. That's nonlinear to me. My definition of linear growth is that it is bounded above by a linear function; its first derivative would therefore be bounded above by a constant.
If something is a straight line when you plot it in log scale, you are plotting exponential growth.
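This can be seen without a plot: taking logs of an exponentially growing sequence gives constant increments (a straight line on a log scale), while a linearly growing sequence does not. A minimal sketch:

```python
import math

exponential = [10 ** k for k in range(1, 6)]  # 10, 100, ..., 100000
linear = [10 * k for k in range(1, 6)]        # 10, 20, ..., 50

def log_increments(seq):
    # Differences between consecutive log10 values: constant iff the
    # original sequence is a straight line on a log scale.
    logs = [math.log10(x) for x in seq]
    return [round(b - a, 6) for a, b in zip(logs, logs[1:])]

print(log_increments(exponential))  # → [1.0, 1.0, 1.0, 1.0]
print(log_increments(linear))       # shrinking increments, not constant
```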
Good point. Also, "linear convergence" means the residual reduces "linearly": |r_{k+1}| = \lambda |r_k| with \lambda \in (0, 1). So it actually converges geometrically, and an algorithm with linear convergence is neat and fast.
Reminds me of some of the remarks Joe Armstrong made a while ago [1], and which I came across via another submission a couple of weeks ago (which escapes me). It’s a great talk about the physical limits of computers and computation.
Mathematics is beautiful!
[0] https://www.mrob.com/pub/comp/xmorphia/ogl/index.html