
Hyperopt: Optimize your machine learning parameters automagically - chewxy
https://github.com/hyperopt/hyperopt
======
ajtulloch
Practical Bayesian Optimization of Machine Learning Algorithms [1] was a
great recent paper on this topic that looked to have very nice results in
practice across a range of models. Is there any interest in implementing that
algorithm in hyperopt?

[1]: [http://arxiv.org/pdf/1206.2944.pdf](http://arxiv.org/pdf/1206.2944.pdf)
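
For readers unfamiliar with the paper: the core loop fits a Gaussian process
to the evaluations so far, then picks the next point by maximizing an
acquisition function such as expected improvement. A rough sketch of that
loop, using scikit-learn's GaussianProcessRegressor rather than the paper's
own implementation, with a toy objective standing in for an expensive
training run:

```python
# Minimal Gaussian-process Bayesian optimization sketch (minimization).
# Illustrative only -- not the paper's code.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Placeholder objective; stands in for an expensive model-training run.
    return np.sin(3 * x) + 0.1 * x ** 2

def expected_improvement(mu, sigma, best, xi=0.01):
    # Closed-form expected improvement for minimization.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu - xi) / sigma
    return (best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

candidates = np.linspace(-3, 3, 500).reshape(-1, 1)
X = np.array([[-2.0], [0.0], [2.0]])  # initial design points
y = objective(X).ravel()

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Evaluate the candidate with the highest expected improvement.
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, [x_next]])
    y = np.append(y, objective(x_next))

print("best x:", X[np.argmin(y)].item(), "best value:", y.min())
```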

~~~
davmre
From the documentation: "Hyperopt has been designed to accommodate Bayesian
optimization algorithms based on Gaussian processes and regression trees, but
these are not currently implemented." [1] So it sounds like they've at least
thought about it.

[1]:
[http://hyperopt.github.io/hyperopt/](http://hyperopt.github.io/hyperopt/)

~~~
thisisdave
For those that don't know: a Bayesian optimization approach based on Gaussian
processes has been implemented in Python by the author of that paper [0].

Here's a nice blog post with information on some of the problems it can run
into in practice [1].

[0]:
[https://github.com/JasperSnoek/spearmint](https://github.com/JasperSnoek/spearmint)
[1]: [http://fastml.com/tuning-hyperparams-automatically-with-spearmint/](http://fastml.com/tuning-hyperparams-automatically-with-spearmint/)

------
nullc
Because your 400-gazillion-degrees-of-freedom machine learning core was not
already ill-conditioned enough for you?

~~~
jaberg99
The purpose of a hyperparameter optimization algorithm is definitely to have
fewer degrees of freedom than the thing it is configuring, and definitely for
it to be faster to get a good model by just running the default search policy
than by trying to rig up your own meta-algorithm on top of it.
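
For concreteness, "just running the default search policy" looks like this
with hyperopt's fmin and TPE; the quadratic objective here is a placeholder
for a real train-and-score run:

```python
# Running hyperopt's default TPE search over a simple search space.
from hyperopt import fmin, tpe, hp, Trials

def objective(x):
    # Stand-in for an expensive model-fitting-and-scoring run.
    return (x - 0.5) ** 2

trials = Trials()
best = fmin(fn=objective,
            space=hp.uniform('x', -5, 5),
            algo=tpe.suggest,
            max_evals=100,
            trials=trials)
print(best)  # dict with the best value of 'x' found
```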

------
yetanotherphd
There seem to be a lot of hyperparameter optimization software packages,
presumably implementing different algorithms.

Is there some software that will run all of them, and pick the one that gives
the best results?

~~~
chewxy
So you want an optimizer for hyperparameter optimizers? This is getting quite
meta.
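
For what it's worth, the naive version of that meta-optimizer is just a loop
over back ends. A toy sketch using hyperopt's own random and TPE search
algorithms, with a placeholder objective:

```python
# Toy "optimizer of optimizers": run each search algorithm on the same
# budget and keep whichever found the lowest loss. Placeholder objective.
from hyperopt import fmin, hp, tpe, rand, Trials

def objective(x):
    return (x - 0.5) ** 2

space = hp.uniform('x', -5, 5)
results = {}
for name, algo in [('random', rand.suggest), ('tpe', tpe.suggest)]:
    trials = Trials()
    fmin(fn=objective, space=space, algo=algo,
         max_evals=50, trials=trials)
    results[name] = min(trials.losses())

print(min(results, key=results.get), results)
```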

------
elyase
It would be nice if something like the Efficient Global Optimization (EGO)
algorithm [1] were implemented. In my opinion this would be the "automagical"
way of optimizing hyperparameters.

[1]: [http://www.ressources-actuarielles.net/EXT/ISFA/1226.nsf/0/f84f7ac703bf5862c12576d8002f5259/$FILE/Jones98.pdf](http://www.ressources-actuarielles.net/EXT/ISFA/1226.nsf/0/f84f7ac703bf5862c12576d8002f5259/$FILE/Jones98.pdf)

~~~
jaberg99
I filed this as a hyperopt ticket [1].

My hunch is that this algorithm, like GP-based strategies, will excel in
small numbers of dimensions but struggle in high numbers, especially with
conditional parameters. But it's a classic algorithm and a great fit for some
problems, so it would be a nice addition to the lib.

[1]: [https://github.com/hyperopt/hyperopt/issues/177](https://github.com/hyperopt/hyperopt/issues/177)
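
"Conditional parameters" here means parameters that only exist under one
branch of the search space; hyperopt expresses these with hp.choice. A small
illustrative space (not taken from the ticket):

```python
# A conditional search space: the SVM's C and kernel exist only when the
# 'svm' branch is chosen; n_neighbors only under the 'knn' branch.
from hyperopt import hp

space = hp.choice('classifier_type', [
    {
        'type': 'svm',
        'C': hp.lognormal('svm_C', 0, 1),
        'kernel': hp.choice('svm_kernel', ['linear', 'rbf']),
    },
    {
        'type': 'knn',
        'n_neighbors': hp.quniform('knn_k', 1, 50, 1),
    },
])
```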

------
jaberg99
Thanks for the HN post @chewxy. I just wanted to mention a bit of progress
and status: the version on PyPI has been a workhorse.

We're currently working on a not-so-minor update that will include a major
upgrade in code quality, better documentation, probably two new optimization
algorithms, and a wrapper around scikit-learn. All of that should be
happening by February.

Sign up for [https://groups.google.com/forum/#!forum/hyperopt-announce](https://groups.google.com/forum/#!forum/hyperopt-announce) for project updates.
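
Until that wrapper lands, tuning a scikit-learn model with hyperopt directly
is already short. A rough sketch (dataset and search space chosen
arbitrarily for illustration):

```python
# Tuning a scikit-learn classifier with plain hyperopt fmin.
from hyperopt import fmin, tpe, hp
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

def objective(params):
    clf = SVC(C=params['C'], gamma=params['gamma'])
    # fmin minimizes, so negate the cross-validated accuracy.
    return -cross_val_score(clf, X, y, cv=5).mean()

space = {
    'C': hp.loguniform('C', -5, 5),
    'gamma': hp.loguniform('gamma', -5, 5),
}
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50)
print(best)  # the best 'C' and 'gamma' found
```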

------
dimatura
This looks nice, though I feel like this is one of those cases where it's
easier to code it yourself than to learn and use the library API. At least
until the BOA stuff is implemented, and perhaps even then.
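
The "code it yourself" baseline is usually plain random search, which is
indeed only a few lines (toy objective standing in for a real run):

```python
# The do-it-yourself alternative: plain random search.
import random

def objective(x):
    # Placeholder for a real train-and-score run.
    return (x - 0.5) ** 2

best_x, best_loss = None, float('inf')
for _ in range(100):
    x = random.uniform(-5, 5)
    loss = objective(x)
    if loss < best_loss:
        best_x, best_loss = x, loss

print(best_x, best_loss)
```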

------
chewxy
An introduction is here:
[http://www.youtube.com/watch?v=Mp1xnPfE4PY](http://www.youtube.com/watch?v=Mp1xnPfE4PY)

