
12 Atomic Experiments in Deep Learning - abidlabs
https://abidlabs.github.io/Atomic-Experiments/
======
abidlabs
Deep learning remains something of a mysterious art even for frequent
practitioners, because we usually run complex experiments on large datasets,
which obscures the basic relationships between dataset, hyperparameters, and
performance.

The goal of this notebook is to build basic intuition about deep neural
networks by running very simple experiments on small datasets, experiments
that help us understand trends that occur more generally on larger datasets.
The experiments are designed to be "atomic" in that each tries to test one
fundamental aspect of deep learning in a controlled way. Furthermore, the
experiments do not require specialized hardware: they all run in a few
minutes without GPUs (see the elapsed times, measured on a CPU-only machine).

------
alfozan
Very interesting experiments. I have long thought that atomic tests like
these are a good idea. Regarding experiment 8, it is actually well known that
if you want better accuracy per epoch, you should decrease the batch size --
though this comes at the cost of longer wall-clock training time. And I found
experiment 7 quite surprising!
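The tradeoff mentioned above can be sketched in a few lines: at a fixed number of epochs, a smaller batch size means many more gradient updates over the same data, which typically drives the loss down faster per epoch but requires more sequential steps. Below is a minimal, hypothetical illustration (not code from the notebook) using mini-batch SGD on logistic regression over a toy synthetic dataset:

```python
import numpy as np

def make_data(n=256, seed=0):
    # Toy linearly separable 2-D dataset (a stand-in for the
    # notebook's small datasets; not the actual data used there).
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    return X, y

def train(X, y, batch_size, epochs=20, lr=0.5, seed=0):
    # Mini-batch SGD on logistic regression. Returns the final
    # training loss and the total number of gradient updates.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    updates = 0
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            z = X[batch] @ w + b
            p = 1.0 / (1.0 + np.exp(-z))       # sigmoid
            grad = p - y[batch]                # dL/dz for cross-entropy
            w -= lr * X[batch].T @ grad / len(batch)
            b -= lr * grad.mean()
            updates += 1
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return loss, updates

X, y = make_data()
loss_small, updates_small = train(X, y, batch_size=4)   # 64 updates/epoch
loss_large, updates_large = train(X, y, batch_size=64)  # 4 updates/epoch
print(updates_small, updates_large, loss_small, loss_large)
```

With the same 20 epochs, batch size 4 performs 1280 updates versus 80 for batch size 64, so the small-batch run reaches a lower loss per epoch, but those updates are sequential, which is the time inefficiency noted above.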

