
Linear Optimization with Glop - mzl
https://developers.google.com/optimization/docs/lp/glop
======
gok
Super cool demo.

Linear programming in C++ with a Python wrapper has been implemented by many
other packages too. Does anyone know how Glop compares with scipy.optimize,
PuLP, lp_solve, COIN-OR's many projects, or CPLEX and its many Python
wrappers?

~~~
hartror
Generally the wrapper isn't the part you worry about; the speed and
capabilities of the underlying solver implementation are what matter most. We
use PuLP for our projects because it is compatible with most LP/MIP solvers,
which lets us code the formulation once and then run it on a variety of
solvers.

I would expect Glop support will be added to PuLP given some time.
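
To illustrate the "formulate once, swap solvers" idea, here's a minimal PuLP
sketch with toy numbers (assuming the CBC backend that PuLP bundles by
default):

```python
# A tiny LP in PuLP; the same formulation can be handed to different
# solver backends (CBC, GLPK, CPLEX, ...) at solve time.
from pulp import LpMaximize, LpProblem, LpVariable, value

prob = LpProblem("example", LpMaximize)
x = LpVariable("x", lowBound=0)
y = LpVariable("y", lowBound=0)

prob += 3 * x + 2 * y   # first expression added is the objective
prob += x + y <= 4      # constraints
prob += x + 3 * y <= 6

prob.solve()            # default backend; swapping solvers is a
                        # one-argument change, e.g. prob.solve(GLPK_CMD())
```

The optimum here sits at the vertex x=4, y=0 with objective value 12.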

------
hartror
They've put together a blog post with a bunch of features, including the fact
that you can solve LPs in Google Docs spreadsheets:
[http://googledevelopers.blogspot.com.au/2014/09/sudoku-linea...](http://googledevelopers.blogspot.com.au/2014/09/sudoku-linear-optimization-and-ten-cent.html)

------
repsilat
Any idea why Google made their own library instead of working on (or forking)
another library like GLPK or CLP? And is there any benchmarking data? I
couldn't see any on the site.

If I had to guess I'd say this is meant for Google Docs to fill in one of the
Excel feature gaps. It probably focuses more on stability/accuracy than speed,
and probably won't provide performance comparable to existing packages. Might
have grown out of someone's hobby project or a university assignment.

~~~
mzl
I'm not an expert on LP packages, but in my limited experience most LP
packages are not that great in areas such as API design, modularity, and
other similar non-mathematical qualities that matter to programmers.

My guess is that they wanted something small, simple, and easy to build into
other products. I've heard from some of the or_tools developers that the
internal uses are most often rather simple problems. In such cases, ease of
use is quite important.

That being said, or_tools is a very competent optimization package. For
example see the results in the latest MiniZinc Challenge:
[http://www.minizinc.org/challenge2014/results2014.html](http://www.minizinc.org/challenge2014/results2014.html).

------
mzl
Personally, I'm hoping that this is a first step towards Google implementing
their own MIP solver.

I think that, given the general quality of or_tools, it could be a very
practical choice when the full power of something like CPLEX or Gurobi is not
needed.

------
superflit
I'd really like to know what other fellow HNers are using LP for. If you can
share real cases, that would be awesome. Mostly I am using it for product
mixing and licensing cost optimization.
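
As a concrete toy version of the product-mix use case (made-up coefficients,
using scipy.optimize.linprog from the packages mentioned upthread):

```python
import numpy as np
from scipy.optimize import linprog

# Toy product-mix LP: choose quantities of two products to maximize
# profit under machine-hour and labor-hour limits (all numbers invented).
profit = np.array([20.0, 30.0])    # profit per unit of each product
A = np.array([[1.0, 2.0],          # machine hours consumed per unit
              [3.0, 1.0]])         # labor hours consumed per unit
limits = np.array([40.0, 30.0])    # hours available

# linprog minimizes, so negate the profit vector to maximize it
res = linprog(-profit, A_ub=A, b_ub=limits, bounds=[(0, None)] * 2)
qty = res.x                        # optimal production quantities
best_profit = -res.fun
```

Here the optimum is 4 units of the first product and 18 of the second, for a
profit of 620.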

~~~
dxbydt
I'm using LP in production as a solver for
LAD ([http://en.wikipedia.org/wiki/Least_absolute_deviations](http://en.wikipedia.org/wiki/Least_absolute_deviations)).
We have thousands of jobs running on Mesos (see
[https://news.ycombinator.com/item?id=8394718](https://news.ycombinator.com/item?id=8394718)
), and I forecast and provision their CPU and memory consumption over time.
So essentially, I fit an LAD line through thousands of historical time series
and predict where usage goes in the future, so we can optimally provision
clusters. My LP code is essentially this:
[http://bit.ly/1vxknUi](http://bit.ly/1vxknUi) , with performance
optimizations. So it uses the OSS Apache Commons Math libs to do LP, not
Glop. But I'd imagine we'd get the same results with Glop, maybe much faster.
Google GSON, for example, is a very fast JSON parser for huge JSON payloads,
which we also use in the same project :)
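
The LAD-as-LP trick can be sketched like so; this is a toy version with
synthetic data (using scipy.optimize.linprog rather than the Apache libs):
introduce one non-negative slack t_i per point and minimize the sum of
slacks, with each slack bounding the absolute residual from both sides.

```python
import numpy as np
from scipy.optimize import linprog

# LAD regression y ~ a*x + b as an LP:
#   minimize sum_i t_i
#   s.t.  -t_i <= y_i - (a*x_i + b) <= t_i,  t_i >= 0
x = np.arange(10.0)
y = 2.0 * x + 1.0                  # exact line, so LAD should recover it

n = len(x)
# decision vector: [a, b, t_1, ..., t_n]; only the slacks cost anything
c = np.concatenate([[0.0, 0.0], np.ones(n)])

#  a*x_i + b - t_i <=  y_i    (residual below t_i from above)
# -a*x_i - b - t_i <= -y_i    (residual below t_i from below)
A_ub = np.zeros((2 * n, n + 2))
A_ub[:n, 0] = x;  A_ub[:n, 1] = 1.0
A_ub[n:, 0] = -x; A_ub[n:, 1] = -1.0
A_ub[:n, 2:] = -np.eye(n)
A_ub[n:, 2:] = -np.eye(n)
b_ub = np.concatenate([y, -y])

bounds = [(None, None), (None, None)] + [(0, None)] * n  # a, b free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
a_hat, b_hat = res.x[0], res.x[1]
```

With noise-free data all residuals can be driven to zero, so the fit recovers
a=2, b=1 exactly; with outliers in y, the same LP gives the robust LAD fit.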

~~~
superflit
Used it for a similar project where I had to balance provisioning server
resources against limiting bandwidth (game servers on EC2).

