Show HN: AdaBound, an optimizer that trains as fast as Adam and as good as SGD (github.com/luolc)
60 points by Luolc on Feb 27, 2019 | 6 comments
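For context on the title's claim: the AdaBound paper (Luo et al., ICLR 2019) pairs Adam's adaptive per-parameter step with bounds that tighten toward a fixed SGD-like rate as training progresses, so the optimizer behaves like Adam early on and like SGD late. Below is a minimal NumPy sketch of that dynamic-bound update; the hyperparameter names and defaults (final_lr, gamma) follow the authors' released PyTorch implementation, and the sketch is an illustration of the idea, not the library code itself.

    # Sketch of one AdaBound step, assuming the paper's dynamic bounds:
    # Adam's step size is clipped to a band that converges to final_lr.
    import numpy as np

    def adabound_step(param, grad, m, v, t, lr=1e-3, betas=(0.9, 0.999),
                      final_lr=0.1, gamma=1e-3, eps=1e-8):
        b1, b2 = betas
        m = b1 * m + (1 - b1) * grad        # first moment, as in Adam
        v = b2 * v + (1 - b2) * grad ** 2   # second moment, as in Adam
        bc1, bc2 = 1 - b1 ** t, 1 - b2 ** t # bias corrections
        step = lr * np.sqrt(bc2) / bc1      # bias-corrected base step size
        # Bounds tighten around final_lr as t grows, morphing Adam into SGD.
        lower = final_lr * (1 - 1 / (gamma * t + 1))
        upper = final_lr * (1 + 1 / (gamma * t))
        eta = np.clip(step / (np.sqrt(v) + eps), lower, upper)
        return param - eta * m, m, v

At t = 1 the band is wide, so the clipped rate is essentially Adam's; as t grows, both bounds approach final_lr and the update reduces to SGD with momentum at that rate.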
FrankDixon on Feb 27, 2019
Are algorithms like Adam used outside of deep learning? I always come across Levenberg-Marquardt and Gauss-Newton, but never Adam.
harias on Feb 27, 2019
Reddit thread by the author: https://www.reddit.com/r/MachineLearning/comments/auvj3q/r_a...
1024core on Feb 27, 2019
Is there a nice resource where I can read about all these different optimizers, their details, as well as a comparison between them?
blackstache on Feb 27, 2019
http://ruder.io/optimizing-gradient-descent/
1024core on Feb 27, 2019
Sebastian does not disappoint! Thanks!
parkjon on Feb 27, 2019
Looks like a nice incremental improvement. Still waiting on a TensorFlow implementation, though.