[dupe] A Zero-Math Introduction to Markov Chain Monte Carlo Methods | 38 points by tosh 3 months ago | 8 comments

 Previous discussion: https://news.ycombinator.com/item?id=15986687
 Random-play Monte Carlo was the first algorithm that led to good computer Go software, before neural networks. It was around 2008, I think. Before that, pattern-based algorithms were really, really bad (like, barely above human beginner level). I'm not a mathematician, but the paper itself was a real beauty. I remember vividly the parameter that balanced "exploitation" of apparently-good paths against "exploration" of unknown or apparently-bad paths. I've used it in many analogies when discussing innovation programs within large companies.
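 The exploration/exploitation parameter the parent describes is, in UCT-style Monte Carlo tree search, the constant in the UCB1 selection rule. A minimal sketch (the data layout and function name are illustrative, not from any particular paper):

```python
import math

def ucb1_select(children, c=1.4):
    """Pick the child node maximizing exploitation + exploration.

    children: list of dicts with 'wins' and 'visits' counts.
    c is the exploration constant: larger c favours rarely-tried
    moves, smaller c favours moves with good win rates so far.
    """
    total = sum(ch["visits"] for ch in children)

    def score(ch):
        if ch["visits"] == 0:
            return float("inf")  # always try unvisited moves first
        exploit = ch["wins"] / ch["visits"]          # observed win rate
        explore = c * math.sqrt(math.log(total) / ch["visits"])
        return exploit + explore

    return max(children, key=score)
```

 With c = 0 this greedily replays the best-looking line; as c grows it spends more playouts probing apparently-bad moves, which is exactly the trade-off the comment is alluding to.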
 Is this [1] the paper you are referring to? Thanks for the heads up, this work looks interesting.
 Are there any Lot-Of-Math Introduction to Monte Carlo Methods?
 I'm not sure, since it doesn't introduce the notion of detailed balance, whether this article really deals meaningfully with the use of Markov chains at all. It doesn't bring out the fact that the Markov chain transition probabilities have to be tuned to explore the parameter space. The relative efficiency of MCMC versus a naive random sampling approach depends on this leveraging of detailed balance, so that the correlations of the Markov chain work in favour of the experiment. Given that the article introduces this notion of a random walk, and so seems like it's going to discuss the Metropolis algorithm, it's not great that it ducks the main issue: why a correlated Markov chain random walk is a useful approach at all. The key is that it's a "conditioned" random walk, and the method by which it is conditioned is the real trick of MCMC (at least of Metropolis, which is the cool kind).
 [flagged]
 Out of interest, have you read the article? If so, do you have specific objections to its content? Did you find the article uninformative? Wrong?Personally, I upvoted it because I think it gives an excellent overview of the topic. I'd be interested to hear your specific objections.
 I don't feel too strongly, but I can sympathize with the criticisms after reading the article. Basically, there are a couple of one-sentence summaries of what Monte Carlo simulation is and what a Markov chain is, and those are fine. But then there are some hokey graphs that absolutely do require you to come to the article already understanding, e.g., likelihood functions, priors, sequential random sampling, and approximating a distribution, which basically means the article is for people who already have a reasonable exposure to probability math and want some soft verbal introductions to MCMC on top of their existing experience. It's certainly not useful for someone without a strong internal mental model of probability distributions, for example. I think that's why the parent comment expresses frustration at the "advanced math for people who don't know intermediate math" type of article. Basically, the OP article here would be fine as the 1-2 page introductory note of a chapter in an intermediate math textbook prepping a student for a deeper dive into, e.g., the Metropolis algorithm, Gibbs sampling, etc. But it seems misplaced to even be trying to give a "pop" introduction to something like that. Inevitably, no matter how much hand-waving or illustration there is, it falls back on appeals to other math topics you're expected to know, so the "zero math" promise is neither something anyone needs nor something the author of such an article can deliver. That part feels a bit like disingenuous marketing or click-bait.
 Are you objecting to the article because you feel the title is misleading (i.e. it actually does contain math), or because you don't think MCMC can be explained without math?I actually felt it was a halfway decent description of MCMC, but I could see criticisms from both perspectives above.
