
Deep Learning Hyperparameter Optimization with Competing Objectives - Zephyr314
https://devblogs.nvidia.com/parallelforall/sigopt-deep-learning-hyperparameter-optimization/
======
Zephyr314
Hi, I'm one of the authors of this post and co-founder of SigOpt (YC W15). I'm
happy to answer any questions about the post or SigOpt.

More info on the research behind SigOpt can be found here [1].

[1]: [https://sigopt.com/research](https://sigopt.com/research)

~~~
stonesixone
Can you provide a tl;dr?

~~~
Zephyr314
Post tl;dr: Sometimes when you are optimizing an ML / AI pipeline you care
about multiple things (speed, accuracy, etc.). It is often difficult to make
these tradeoffs without knowing what is possible, but it is expensive to try
different configurations. Using Bayesian optimization to optimize multiple
metrics simultaneously can solve this problem more than an order of magnitude
faster than standard techniques like randomized search. We provide examples
tuning TensorFlow and MXNet CNNs with code [1] [2].

SigOpt tl;dr: Parameter optimization-as-a-service. An ensemble of Bayesian
optimization techniques behind a simple REST API. Spend more time building
your models while we tune them, exponentially more efficiently than something
like a brute-force grid search. Free for academic use [3].

[1]: [https://github.com/sigopt/sigopt-examples/tree/master/multim...](https://github.com/sigopt/sigopt-examples/tree/master/multimetric-timeseries)

[2]: [https://github.com/sigopt/sigopt-examples/tree/master/dnn-tu...](https://github.com/sigopt/sigopt-examples/tree/master/dnn-tuning-nvidia-mxnet)

[3]: [https://sigopt.com/edu](https://sigopt.com/edu)
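To make the "competing objectives" idea concrete: with two metrics there is usually no single best configuration, only a frontier of tradeoffs (the Pareto front), and an optimizer like SigOpt searches for points on that frontier. Below is a minimal pure-Python sketch of identifying that frontier from a set of finished runs; the (accuracy, throughput) numbers are hypothetical, and this is not SigOpt's API, just the underlying concept.

```python
# Toy sketch: the Pareto frontier of two competing metrics.
# Each candidate run is scored as (accuracy, throughput); higher is
# better for both. All numbers below are made up for illustration.

def dominates(a, b):
    """True if run `a` is at least as good as `b` on every metric
    and strictly better on at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(candidates):
    """Return the non-dominated (Pareto-optimal) runs: the set of
    achievable tradeoffs no other run strictly improves on."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates)]

if __name__ == "__main__":
    # Hypothetical (accuracy, throughput) results from tuning runs.
    runs = [(0.91, 120), (0.95, 80), (0.89, 200), (0.95, 60), (0.90, 110)]
    print(sorted(pareto_front(runs)))
    # → [(0.89, 200), (0.91, 120), (0.95, 80)]
```

Random or grid search only finds frontier points by accident; a Bayesian optimizer models the metric surfaces and deliberately proposes configurations likely to push the frontier outward, which is where the order-of-magnitude savings in the post comes from.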

------
tartakovsky
Hello, I'm a coauthor of this post and would be happy to answer any questions.

