
Show HN: AdaBound, an optimizer that trains as fast as Adam and as good as SGD - Luolc
https://github.com/Luolc/AdaBound
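
The core idea behind AdaBound is to clip Adam's per-parameter step size between bounds that tighten toward a fixed final rate, so training starts Adam-like and gradually behaves like SGD. Below is a minimal NumPy sketch of that idea; the function name, `state` layout, and default hyperparameters are illustrative, not the repository's actual API.

```python
import numpy as np

def adabound_step(theta, grad, state, lr=1e-3, betas=(0.9, 0.999),
                  final_lr=0.1, gamma=1e-3, eps=1e-8):
    """One update in the AdaBound style: Adam's adaptive step size is
    clipped to bounds that converge toward final_lr (the SGD-like rate)."""
    b1, b2 = betas
    state['t'] += 1
    t = state['t']
    # Adam's exponential moving averages of the gradient and its square
    state['m'] = b1 * state['m'] + (1 - b1) * grad
    state['v'] = b2 * state['v'] + (1 - b2) * grad ** 2
    # bias correction, as in Adam
    bias_corr = np.sqrt(1 - b2 ** t) / (1 - b1 ** t)
    step = lr * bias_corr / (np.sqrt(state['v']) + eps)
    # bounds shrink toward final_lr as t grows, so late training
    # uses an (almost) constant step size like SGD
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step = np.clip(step, lower, upper)
    return theta - step * state['m']
```

With `gamma * t` small the bounds are very loose (Adam-like); as `t` grows both bounds approach `final_lr`, recovering plain momentum SGD.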
======
FrankDixon
Are algos like Adam used anywhere besides deep learning? I always come across
Levenberg-Marquardt and Gauss-Newton, but never Adam.

------
harias
Reddit thread by the author:
[https://www.reddit.com/r/MachineLearning/comments/auvj3q/r_a...](https://www.reddit.com/r/MachineLearning/comments/auvj3q/r_adabound_an_optimizer_that_trains_as_fast_as/)

------
1024core
Is there a nice resource where I can read about all these different
optimizers, their details, as well as a comparison between them?

~~~
blackstache
[http://ruder.io/optimizing-gradient-descent/](http://ruder.io/optimizing-gradient-descent/)

~~~
1024core
Sebastian does not disappoint! Thanks!

------
parkjon
Looks like a nice incremental improvement. Still waiting on TensorFlow though.

