
AlphaGo under a Magnifying Glass - amelius
http://deeplearningskysthelimit.blogspot.com/2016/04/part-2-alphago-under-magnifying-glass.html
======
cpeterso
The article says AlphaGo has a 13-layer neural network. How was the number of
layers chosen? What are the advantages of wider versus deeper networks?

~~~
argonaut
I can't speak to AlphaGo specifically, but these hyperparameters (depth and
width) are typically chosen through trial and error. People joke about
"graduate student descent," but it's true. There are empirical rules of thumb
that come from knowing what has worked in the past: in practice, more depth
tends to help, and layers generally shouldn't be made very wide.
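One concrete way to see the depth-versus-width trade-off is to compare parameter counts of a deep-narrow versus a shallow-wide fully connected net. This is just an illustrative sketch, not anything from AlphaGo; the layer sizes below are made up:

```python
def mlp_param_count(layer_sizes):
    """Total weights + biases for a fully connected net with the
    given layer sizes (input, hidden layers..., output)."""
    return sum((n_in + 1) * n_out  # +1 accounts for the bias term
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical configurations with the same input/output dimensions:
deep_narrow  = [100, 50, 50, 50, 10]   # 4 weight layers, narrow
shallow_wide = [100, 150, 10]          # 2 weight layers, wide

print(mlp_param_count(deep_narrow))    # 10660
print(mlp_param_count(shallow_wide))   # 16660
```

Despite having twice as many layers, the deep-narrow net here uses fewer parameters than the shallow-wide one; whether the extra depth actually buys better accuracy for a given budget is exactly what the trial-and-error tuning decides.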

