Google Optimization Tools (developers.google.com)
247 points by danso on July 7, 2015 | 40 comments

Coursera has a great course on discrete optimization [1] where I learned and used or-tools. The tools are rather nice, but the documentation is half done (basically the code is the best documentation) and some interfaces are not compatible with others. I ended up forking or-tools for my own use and tweaking many unexposed internals. I guess it's extremely difficult to implement a generic optimization solver, so I won't complain, but be prepared: it's not an out-of-the-box thing (I doubt there is any).

[1] https://www.coursera.org/course/optimization

> the documentation is half done

The literature and documentation in operations research optimization is enormous, going back to the 1950s. The best of that literature is quite well written.

For combinatorial optimization, there is

George L. Nemhauser and Laurence A. Wolsey, 'Integer and Combinatorial Optimization', ISBN 0-471-35943-2, John Wiley & Sons, Inc., New York, 1999.

The details of optimization are not always trivial, and several fairly challenging graduate applied math courses, complete with non-trivial theorems and proofs, can be needed for much depth in the subject.

I've had four such courses and taught one.

> I guess it's extremely difficult to implement generic optimization solver,

A "solver"? It would be nice to have a "solver". Lots of people talk about giving a problem to a "solver". For linear programming, maybe you usually can do that.

Otherwise my experience is that asking for a general purpose solver is, for now, hopeless.

Instead, for the real problems I've had success with, have to look carefully at the problem and exploit special structure particular to that problem.

Typically part of the work involves doing some derivations using the math of optimization.

> Otherwise my experience is that asking for a general purpose solver is, for now, hopeless.

> Instead, for the real problems I've had success with, have to look carefully at the problem and exploit special structure particular to that problem.

I am in complete agreement.

One way I have thought about this is that, rather informally, "the mapping from problems to solutions is highly discontinuous". That is, suppose you have a problem statement, and you figure out a method to attack the problem and solve it in practice. Now, vary the problem statement slightly. There is a reasonable chance that the new problem is dramatically harder to solve, and you need a completely different body of theory to even think about it, let alone solve it.

For example, suppose you have a polynomial equation in a bunch of complex variables. This is easily solved numerically in practice to any degree of precision you care for. Let's reword the problem a little and require that you are only willing to accept integer-valued solutions. Now you're talking about Diophantine equations! Have fun!
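A toy illustration of that contrast (my own example equation, not the poster's): the real root of x^3 = 2 can be pushed to any precision with plain bisection, yet the "same" equation over the integers has no solution at all, and by Matiyasevich's theorem general Diophantine solvability is undecidable.

```python
# Numerically, finding the real root of x^3 - 2 = 0 is trivial by bisection.

def bisect_root(f, lo, hi, tol=1e-12):
    """Find a root of f in [lo, hi], assuming f(lo) and f(hi) differ in sign."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

root = bisect_root(lambda x: x**3 - 2, 1, 2)
print(root)  # ~1.25992..., i.e. 2**(1/3)

# Over the integers, a direct check finds nothing, and in general no
# algorithm can decide whether a Diophantine equation has a solution.
print(any(n**3 == 2 for n in range(-1000, 1001)))  # False
```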

In practice, if you're participating in some project to solve business / industrial problems using optimisation, it is pretty common to have the definition of the problem change slightly every few weeks or months, as you learn more about the domain, or as people change their minds, or as the business actually changes what it does. Often you need to figure out clever ways of exploiting the mathematical structure in a particular problem statement to figure out a tractable way of solving the problem. Best to hope that the new version of your problem doesn't suddenly break that structure while you're "sprinting" towards the next release!


IMHO it has been, since the 1950s, mostly this "sprinting" that was in practice just too difficult, that is, too costly in time and money, for far too large a fraction of the organizations, budgets, and managers that could have used optimization.

Now with current computing and the Internet, a lot in applied math, in principle powerful and valuable for important real problems, seems to be growing significantly in popularity. The "sprinting" is much easier but, sadly, too often still too challenging.

Of course part of the "sprinting" challenge was solved by, say, A Mathematical Programming Language (AMPL), where one can just type in something like

        max z = f(x)
     subject to g(x) >= 0
                  x in C 
where for large positive integers m and n and the real numbers R, x in R^m and g: R^m --> R^n.

So, then, with the problem thusly typed in, just need a solver! Right: Maybe from the back side of one of the moons of Pluto?
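For the purely linear special case, though, the "solver" really does exist, and it is worth seeing why: a linear objective over a polygon attains its maximum at a vertex. The tiny 2-variable instance below is my own illustration; it brute-forces vertices by intersecting constraint boundaries, which real solvers (simplex, interior point) replace with far smarter machinery on the same geometry.

```python
# Toy LP by vertex enumeration: maximize x + y subject to
#   2x + y <= 8,  x + 3y <= 9,  x >= 0,  y >= 0.
# Each constraint is stored as (a, b, c) meaning a*x + b*y <= c.
from itertools import combinations

cons = [(2, 1, 8), (1, 3, 9), (-1, 0, 0), (0, -1, 0)]
objective = lambda x, y: x + y

best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:          # parallel boundaries: no vertex here
        continue
    x = (c1 * b2 - c2 * b1) / det  # Cramer's rule for the 2x2 system
    y = (a1 * c2 - a2 * c1) / det
    if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):  # feasible vertex?
        if best is None or objective(x, y) > best[0]:
            best = (objective(x, y), x, y)

print(best)  # (5.0, 3.0, 2.0): the maximum of x + y is 5, at vertex (3, 2)
```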

Gee, we know how to sort, know a lot about sorting, in O( n log(n) ), etc., so maybe we should have solvers that would do as well on optimization as we can do now on sorting? Then quickly we encounter the monster of the question P versus NP.

So, without an algorithm that shows that P = NP, we're back to column generation here, Lagrangian relaxation there, branch and bound too often, some group theory sometimes, etc. and "sprinting".
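To make one of those techniques concrete, here is a sketch of branch and bound on the 0/1 knapsack problem (my example, not from the thread): branch on taking or skipping each item, and prune any subtree whose LP-relaxation bound, the fractional knapsack value, cannot beat the best solution found so far.

```python
# Branch and bound for 0/1 knapsack, pruning with the fractional-knapsack
# (LP relaxation) upper bound.

def knapsack_bnb(values, weights, capacity):
    n = len(values)
    # Sort items by value density so the fractional bound is cheap to compute.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]

    def bound(i, cap):
        """Optimistic value achievable from items i.. with capacity cap."""
        total = 0.0
        while i < n and w[i] <= cap:
            total += v[i]
            cap -= w[i]
            i += 1
        if i < n:
            total += v[i] * cap / w[i]   # take a fraction of the next item
        return total

    best = 0

    def branch(i, cap, value):
        nonlocal best
        if value > best:
            best = value
        if i == n or value + bound(i, cap) <= best:
            return                       # prune: bound can't beat incumbent
        if w[i] <= cap:
            branch(i + 1, cap - w[i], value + v[i])  # take item i
        branch(i + 1, cap, value)                    # skip item i

    branch(0, capacity, 0)
    return best

print(knapsack_bnb([60, 100, 120], [10, 20, 30], 50))  # 220
```

Real solvers layer this same idea with cutting planes, presolve, and much better bounds, but the take/skip tree plus pruning is the core of it.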

Yes, it was interesting philosophically how central and challenging P versus NP is, but I'd rather have just had a way to solve optimization problems routinely, gotten home in time for dinner, and spent Sunday at a BBQ!

Oh I absolutely agree, the algorithms and math are described very well. It's complex but clear, and the course I mentioned explains those concepts very well (and is fun, bonus points for that).

I of course meant the or-tools API docs. Overall I think it's nicely done and structured, but could be better.

I think we're talking about documenting the interface that or-tools provides. Even someone with an excellent understanding of the underlying concepts still needs to know exactly how or-tools exposes those concepts.

Yes. I was not clear: My impression of the OP is that most of the users or intended users will need good documentation on both the applied math and the APIs, details of input files, etc. My post was to say that for the applied math part, good documentation is and long has been available.

A bit tangential, but since you seem to be posting a lot of _good_ math book advice: what would be a good route to reading Neveu's probability book? Halmos' Finite Dimensional Vector Spaces and baby Rudin as pre-reqs? Those are within my comfort area, but I will need to go through them again.

Or perhaps you'd recommend some other probability books before Neveu?

I wanted to love that course, but found that despite his entertaining style the lecturer could be quite hard to understand in places. What I mean is he'd be giving a description of a problem and I actually wouldn't understand what he was saying. No biggie if you were in a real lecture, as you'd just put your hand up, but on Coursera you're left with looking at the transcript (which didn't help) or asking in the forums (which in the case of this course were a ghost town).

Pity as it seemed like an excellent course.

The course is great, and some of the people behind MiniZinc are about to go live with another one (https://www.coursera.org/course/modelingoptimization) about Modeling.

This course looks amazing. Thanks for sharing it

Just wanted to mention that the or-tools constraint solver is absolutely top notch. It has been winning top-3 places in the MiniZinc competition ever since it entered a few years ago, and placed first in 3 out of 4 categories last year. I've used it quite a bit, and with a few exceptions I've found its completeness relative to the Global Constraint Catalog to be excellent, especially for open source software. Now if only they had a functional 3-dimensional Geost constraint :)
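For anyone wondering what kind of problem a constraint solver eats for breakfast, here is the classic SEND + MORE = MONEY cryptarithm done the naive way (my illustration): brute force over all digit assignments. A CP solver like or-tools instead prunes the search with constraint propagation rather than trying all ~1.8 million permutations.

```python
# SEND + MORE = MONEY: assign distinct digits to the letters so the sum holds.
from itertools import permutations

def solve_send_more_money():
    for s, e, n, d, m, o, r, y in permutations(range(10), 8):
        if s == 0 or m == 0:          # leading digits can't be zero
            continue
        send = 1000 * s + 100 * e + 10 * n + d
        more = 1000 * m + 100 * o + 10 * r + e
        money = 10000 * m + 1000 * o + 100 * n + 10 * e + y
        if send + more == money:
            return send, more, money
    return None

print(solve_send_more_money())  # (9567, 1085, 10652)
```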

I can also recommend the excellent Gecode (www.gecode.org). Great documentation (basically a CP book), well designed, and well maintained.

If you're interested in optimization with these same back-end solvers, check out the JuliaOpt tools from MIT.


If there are any Googlers reading this, the first link takes you to google code, but the source has moved to GitHub.

Fixed! Though you might have a cached copy in your browser.

You may be interested in a previous discussion of Optimisation / Operations Research in the context of FedEx. graycat's comments were particularly interesting:


Old discussion of glop: https://news.ycombinator.com/item?id=8393200 (with python demo)

While the documentation is a bit sparse (on the web; there are better docs buried in the source code), I found this library to be much easier to work with than the alternatives, which are mostly from academia and have a heavy emphasis on matrix operations. There are still some rough edges, but I was able to get the Python bindings installed and used them to write an optimizer for fantasy basketball[1].

I think the domain of solving problems by defining the constraints is super interesting. In my fantasy basketball example, I define the constraints for a valid roster (simple), then define how to score a roster, and less than a second later, I've got the optimal picks.

One other neat feature of this library is that you can use it directly from Google Sheets - you can read inputs from your spreadsheets, run the optimizer code (javascript) on Google's boxes, and then write output back to your spreadsheet.

[1]: https://github.com/swanson/degenerate
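The "valid roster plus scoring function" pattern described above can be sketched as a toy. The player names and numbers here are entirely made up, and the real project uses or-tools; this is just brute force over combinations to show the shape of the problem.

```python
# Pick 3 players under a salary cap to maximize projected points.
from itertools import combinations

players = {               # name: (salary, projected points) -- fabricated data
    "Ann": (8000, 42.0),
    "Bob": (6500, 35.5),
    "Cam": (5000, 28.0),
    "Dee": (4000, 24.5),
    "Eli": (3000, 15.0),
}
SALARY_CAP = 15000
ROSTER_SIZE = 3

def is_valid(roster):                    # the constraints
    return sum(players[p][0] for p in roster) <= SALARY_CAP

def score(roster):                       # the objective
    return sum(players[p][1] for p in roster)

best = max((r for r in combinations(players, ROSTER_SIZE) if is_valid(r)),
           key=score)
print(best, score(best))  # ('Ann', 'Dee', 'Eli') scores 81.5
```

With a real roster size and player pool the combinations explode, which is exactly where handing the constraints to a solver pays off.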

Kinda confused what this helps me do. Can someone explain?

Basically it solves NP-complete problems that are tractable in practice because of small instance sizes and some clever constant-factor optimizations. There are industries where it's a useful thing for scheduling, for example. Or you may want to solve travelling salesman when planning a trip.
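The trip-planning case really is this small in many practical instances (made-up coordinates below): for a handful of stops you can brute-force every ordering, though n! blows up fast, which is why or-tools' routing library switches to heuristics for real sizes.

```python
# Brute-force travelling salesman over a few stops: try every ordering of the
# intermediate stops and keep the shortest round trip from "home".
from itertools import permutations
from math import dist

stops = {"home": (0, 0), "museum": (0, 1), "cafe": (1, 1), "park": (1, 0)}

def tour_length(order):
    route = ["home", *order, "home"]     # round trip starting and ending home
    return sum(dist(stops[a], stops[b]) for a, b in zip(route, route[1:]))

others = [s for s in stops if s != "home"]
best = min(permutations(others), key=tour_length)
print(best, tour_length(best))  # an optimal tour has length 4.0
```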

Some real-world examples with a little more flavour:

* designing utility networks (e.g. energy / telecommunications) to be cheap / robust

* routing vehicles / ships / parcels to serve all customers cheaply

* dispatching and arranging emergency services vehicles (e.g. ambulances) to best cover anticipated future demand

* managing inventory levels in blood banks. it goes off after a while! there are different types, some are substitutable. which types are better to keep in stock?

* figuring out the most profitable way to chop animals into different cuts of meat

It's for operations research stuff (like logistics, scheduling etc.), not software optimization.

Lots of crazy image processing and computational photography techniques use linear programming and variants on all of these graph algorithms.

Numerical optimization of discrete/combinatorial problems.

Anybody familiar with this know if it has a good implementation of the Levenberg–Marquardt algorithm? Or know of one somewhere else? I can't find anything in their docs about non-linear function solving, which seems like something Google must do a lot of.

Is there a reason why you're looking for Levenberg-Marquardt over a trust-region Newton solver? In general, properly preconditioned matrix-free trust-region Newton solvers will give superior performance to Levenberg-Marquardt. Something like Optizelle can do that and is BSD licensed.

Thanks! That looks very interesting, will definitely try it out!

Looks like this is more LP oriented.

There are numerous free implementations listed here: https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_al...

Was there something special you needed?

Ideally something C/C++ with a liberal license. I'll take a look at ceres-solver to see how it works. I like that it uses Eigen under the hood to get the benefit of SSE instructions.
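Since several libraries have come up, it's worth noting the Levenberg-Marquardt update itself is compact. Here is a one-parameter sketch (fitting y = exp(a*x) to synthetic data) just to show the damping mechanism; it has nothing to do with ceres internals, and real libraries (ceres, MINPACK) handle the multivariate linear algebra, scaling, and trust-region logic properly.

```python
# One-parameter Levenberg-Marquardt: minimize sum_i (exp(a*x_i) - y_i)^2.
from math import exp

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [exp(0.5 * x) for x in xs]          # synthetic data, true a = 0.5

def lm_fit(a=0.0, lam=1e-3, iters=100):
    def sq_err(a):
        return sum((exp(a * x) - y) ** 2 for x, y in zip(xs, ys))
    for _ in range(iters):
        r = [exp(a * x) - y for x, y in zip(xs, ys)]   # residuals
        J = [x * exp(a * x) for x in xs]               # d(residual)/da
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        step = -Jtr / (JtJ + lam)        # damped Gauss-Newton step
        if sq_err(a + step) < sq_err(a):
            a += step                    # accept: shrink damping (more Newton-like)
            lam *= 0.5
        else:
            lam *= 10.0                  # reject: grow damping (more gradient-like)
    return a

print(lm_fit())  # converges to ~0.5
```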

Are there plans for Go bindings anytime soon?

Nice! Just a question - are there distributed versions of these somewhere as well?

Can these tools be run in the web-browser as well?

The most surprising thing to me is how fast the page was able to load. Not really much faster than any regular web page, but a lot faster than most google pages with the material design. When they first started rolling out the new designs/code on google drive and trends I was lucky if the page even loaded. I'm impressed.

That's the new skin on developers.google.com, glad you like it! I think you'll find the whole thing is pretty snappy; it's rendered 99% on the server side, aside from a few presentational JS pieces.

My sources are all on GitHub.

> If you just want to play Sudoku, fire up Google Sheets and install our Sudoku add-on.

Translated: Hey guys! We might do something that might get shut down next year, but we're still cool!

Logged in just to upvote (y)

The docs and interface look great, and it's nice that each one comes with a sort of mini-walkthrough.

If any Google devs read this, I do have one request: it would be awesome to see runtime analysis based on the theory / design of the provided code, with perhaps some explanation of why certain design decisions were made in the solution.

It's already an excellent educational tool, and this seems like an easy way to make it even better.
