> Theoretical advances are constant (just check the literature)
* for some arbitrary definition of “constant”
Some people throw the word “constant” around quite freely.
Scientific and mathematical advances are notoriously hard to predict. “Constant” is the last word I would use.
One simple model: advances arrive in time according to a memoryless stochastic process (i.e., a Poisson process), with “impact” scores sampled from an exponential distribution.
Of course, one could add more realistic dynamics to the model, such as network effects and clumping (e.g., from funding priorities and “ideas in the air”).
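The simple version of that model is easy to simulate. Here is a minimal sketch (parameter names and values are illustrative, not from the comment): Poisson arrivals via exponential inter-arrival times, with an independent exponential “impact” draw per advance.

```python
import random

def simulate_advances(horizon=100.0, rate=0.5, mean_impact=2.0, seed=42):
    """Simulate the simple model: advances arrive as a Poisson process
    (exponential, memoryless inter-arrival times), and each advance gets
    an "impact" score drawn from an exponential distribution.
    All parameter values here are illustrative assumptions."""
    rng = random.Random(seed)
    t, advances = 0.0, []
    while True:
        t += rng.expovariate(rate)  # memoryless waiting time to next advance
        if t > horizon:
            break
        impact = rng.expovariate(1.0 / mean_impact)  # mean = mean_impact
        advances.append((t, impact))
    return advances

advances = simulate_advances()
```

The memorylessness is the point: knowing how long it has been since the last breakthrough tells you nothing about when the next one comes, which is about as far from “constant” as a simple model can get.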
I feel like some areas of computer science attract a deeper appreciation of statistics than others.
On the topic of stochastic behavior more generally, it seems to me that (a) randomized algorithms and (b) better statistical simulations have been steadily gaining traction in algorithms and data structures.
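As one small example of the kind of randomized algorithm that has become standard in data-structure work, here is reservoir sampling (Vitter's Algorithm R), which keeps a uniform sample of k items from a stream of unknown length in O(k) memory. This is my own illustration, not something from the quoted claim:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Uniformly sample k items from an iterable of unknown length.
    Classic randomized algorithm (Algorithm R); shown purely as an
    illustration of randomness in data structures."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)      # fill the reservoir first
        else:
            j = rng.randrange(i + 1)    # uniform index in [0, i]
            if j < k:
                reservoir[j] = item     # replace with probability k/(i+1)
    return reservoir

sample = reservoir_sample(range(10_000), 5, random.Random(0))
```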