
Minimize your cloud costs with GLPK and Haskell - yummyfajitas
http://www.chrisstucchio.com/blog/2012/linear_programming.html
======
ozataman
This is great, but the Github literate Haskell code looks so much better
(cleaner, syntax highlighting, etc.) than your blog formatting - I found the
code fragments impossible to read.

The literate Haskell version on Github:

[https://github.com/stucchio/Cloud-capacity-
planning/blob/mas...](https://github.com/stucchio/Cloud-capacity-
planning/blob/master/cloud-planning.lhs)

------
lrem
Is this considered nice Haskell style for this kind of program? If so, I
prefer how LP looks in Python. If someone is interested, my presentation
slides:

[http://www-sop.inria.fr/members/Remigiusz.Modrzejewski/prese...](http://www-
sop.inria.fr/members/Remigiusz.Modrzejewski/presentations/python_2012.pdf)

Note this presentation was given to an audience that mostly knew ILP and had
used it before for combinatorial optimization, so I'm not gentle with the
theory. It's just to show that Python is a little nicer than Java for this.

~~~
tome
Your Python DSL looks very nice. There's no reason a Haskell one couldn't look
equally nice.

~~~
dons
Indeed, with type information you can do quite sophisticated embedded DSLs
that heavily reuse the host language's infrastructure for variables, control
and types.

So no need for the LpVariable stuff, for example, and you get type safety.

An example of an SMT solver DSL, which:

* reuses Haskell's type checker to check the types of the embedded language,

* uses Haskell's polymorphism to add polymorphism to the underlying language,

* and uses the host language's variables as target-language variables.

[http://code.haskell.org/~dons/docs/yices-painless/Yices-
Pain...](http://code.haskell.org/~dons/docs/yices-painless/Yices-Painless-
Language.html)
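The embedded-DSL idea can be sketched in a few lines of Python via operator
overloading (a hypothetical illustration, not dons's Yices binding): ordinary
host-language variables and arithmetic build the embedded language's terms.

```python
# Hypothetical sketch of an embedded LP-expression DSL: Python variables
# and operators stand in for the target language's variables and syntax.

class Var:
    def __init__(self, name):
        self.name = name
    def __rmul__(self, coeff):          # 3 * x builds a one-term expression
        return Expr({self.name: coeff})
    def __add__(self, other):           # x + <expr>
        return Expr({self.name: 1}) + other

class Expr:
    def __init__(self, coeffs):
        self.coeffs = coeffs            # variable name -> coefficient
    def __add__(self, other):
        if isinstance(other, Var):
            other = Expr({other.name: 1})
        merged = dict(self.coeffs)
        for name, c in other.coeffs.items():
            merged[name] = merged.get(name, 0) + c
        return Expr(merged)
    def __le__(self, bound):            # <expr> <= 10 builds a constraint
        return Constraint(self.coeffs, bound)

class Constraint:
    def __init__(self, coeffs, bound):
        self.coeffs = coeffs
        self.bound = bound

x, y = Var("x"), Var("y")
c = 2 * x + 3 * y <= 10                 # an ordinary Python expression
print(c.coeffs, c.bound)
```

What Python cannot give you here is the static part: in a typed host like
Haskell, the type checker additionally rejects ill-typed embedded terms
before anything runs.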

------
raverbashing
Very insightful

Linear Programming is one of those things that's usually underrated (and
sometimes overrated)

But I would probably not use Haskell and just write the problem as a text
file in whatever format GLPK accepts - probably a derivative of 'the standard
format' (which is still bad for very complex problems)
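For reference, a hand-written model in GMPL (the MathProg language GLPK
reads) looks roughly like this - a made-up toy example, not the problem from
the post:

```
var x1 >= 0;
var x2 >= 0, <= 50;

maximize obj: 3*x1 + 5*x2;

s.t. capacity: x1 + x2 <= 100;

solve;
end;
```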

~~~
yummyfajitas
I didn't know glpk takes text input.

I'd write the program anyway, however - the text file would actually be quite
large. This problem has 15 variables and 27 constraints (if I counted
correctly). Your plain text input to glpk would be at least 43 lines long
(counting the objective function), compared to about 30 lines of haskell.

If your usage pattern varied daily (e.g., one pattern for monday, a different
one for tuesday), your input would become even bigger. The only modification
to the haskell code would be:

    
    
        main = do
            printLPSolution (Map.fromList [ ("mon_0", 12.2), ("mon_1", 25.1), ("mon_2", 53.5),
                                            ("tues_0", 15.8), ("tues_1", 33.9), ("tues_2", 76.2),
                                            ...
                                           ])

~~~
raverbashing
Yes

Here's the docs for the language glpk uses

<http://www.cs.unb.ca/~bremner/docs/glpk/gmpl.pdf>

Yeah, with 15 (effective) variables it's better to write it in some
higher-level way.

And there's always the canonical way, but that's awful to write by hand, like


    k00*v00 + k01*v01 + k02*v02 + k03*v03 < C0
    k10*v10 + k11*v11 + k12*v12 + k13*v13 < C1

~~~
DennisP
I used it for a problem recently, and just wrote code to generate the text
file for me.
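A generator like that can be only a few lines - for instance, in Python,
emitting GMPL-style constraint lines from a table of coefficients (a
hypothetical sketch, not DennisP's actual code):

```python
# Hypothetical sketch: render GMPL-style constraint text from coefficient
# data instead of writing the model file by hand.

def gmpl_constraint(name, coeffs, bound):
    """Render one constraint, e.g. 's.t. c0: 2*x1 + 3*x2 <= 10;'."""
    terms = " + ".join(f"{c}*{v}" for v, c in coeffs)
    return f"s.t. {name}: {terms} <= {bound};"

def gmpl_model(constraints):
    """Render a list of (coeffs, bound) pairs plus the closing statements."""
    lines = [gmpl_constraint(f"c{i}", coeffs, bound)
             for i, (coeffs, bound) in enumerate(constraints)]
    return "\n".join(lines + ["solve;", "end;"])

model = gmpl_model([
    ([("x1", 2), ("x2", 3)], 10),
    ([("x1", 1), ("x3", 5)], 20),
])
print(model)
```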

------
larrydag
If others are interested in Linear Programming or more specifically the
science of Operations Research you can look to Michael Trick's OR Blogroll.
[http://www.google.com/reader/shared/user/1312140850587335682...](http://www.google.com/reader/shared/user/13121408505873356820/label/ORBlogRoll)

------
spitfire
Cool that they're using Haskell for something. But this would be much better
done in Mathematica, Octave, R or Macsyma. It's an operations research
optimization problem and all of those environments have packages for OR.

~~~
yummyfajitas
Why would it be better in one of those languages?

I'd be very surprised if it was as easy to set up the problem in R or Macsyma.
In those languages, you'd probably need to write loops to generate your
constraints (but maybe I'm wrong - I'm no R expert).

But I'd love to see code proving me wrong.

~~~
spitfire
Well, my experience is with Mathematica, which has operations research
functions built right in. The point being, there are systems where this is
already a solved problem, and you can get your answer in a minute or two.

So you'd simply do Minimize[function, {variables/constraints}] and then get on
with your day.

~~~
yummyfajitas
It's true, glpk is not part of the haskell standard library. But solving a
simple LP problem is easily done in a minute or two:

    
    
        lp = execLPM $ do
            setDirection Max
            setObjective objFun
            leqTo (varSum ["x1", "x2", "x3"]) 100
            leqTo (10 *^ var "x1" ^+^ 4 *^ var "x2" ^+^ 5 *^ var "x3") 600
            leqTo (linCombination [(2, "x1"), (2, "x2"), (6, "x3")]) 300
            varGeq "x1" 0
            varBds "x2" 0 50
            varGeq "x3" 0
            setVarKind "x1" IntVar
            setVarKind "x2" ContVar

        main = print =<< glpSolveVars mipDefaults lp
    

The code in the blog post is just longer and more complicated because it's
generating multiple constraints from code.

The only reason my code doesn't look as simple as the above is that it's
doing a lot more. For example, this line generates 9 constraints:

    
    
        reservationConstraints loadPattern =
            [ (linCombination [ (1.0, "reservation_" ++ k)
                              , (-1.0, "reserved_" ++ k ++ "_" ++ p) ], 0.0)
            | p <- Map.keys loadPattern
            , k <- reservationTypes ]
    

E.g.:

    
    
        reservation_heavy - reserved_heavy_morning <= 0
        reservation_heavy - reserved_heavy_afternoon <= 0
        reservation_heavy - reserved_heavy_evening <= 0
        reservation_medium - reserved_medium_evening <= 0
        ....
    

If Mathematica is so easy, I'd love to see a significantly simpler version _of
the same problem_ written in it. But I have absolutely no idea how I'd write
that simpler version.

~~~
nukedukem
Maximize[...], Minimize[...], LinearProgramming[...] - that's it. But
Mathematica costs money.

Example:

    Minimize[{x + 2 y, -5 x + y == 7 && x + y >= 26 && x >= 3 && y >= 4}, {x, y}]
    
    LinearProgramming[{1, 2}, {{-5, 1}, {1, 1}}, {{7, 0}, {26, 1}},
                      {{3, Infinity}, {4, Infinity}}]

~~~
yummyfajitas
The vast majority of the code I wrote is setting up the problem (which has 15
vars and 27 constraints), not solving it. I.e., those list comprehensions are
a way of writing 9 constraints on a single line.
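The same trick works in any language with comprehensions. In Python, a single
comprehension likewise expands to all 9 constraints (names taken from the
constraint listing above; the third reservation type, "light", is an assumed
name):

```python
# Illustrative sketch: one comprehension generates all 9 reservation
# constraints (3 load patterns x 3 reservation types), mirroring the
# Haskell list comprehension quoted above.

load_patterns = ["morning", "afternoon", "evening"]
reservation_types = ["light", "medium", "heavy"]   # "light" is assumed

constraints = [
    f"reservation_{k} - reserved_{k}_{p} <= 0"
    for p in load_patterns
    for k in reservation_types
]
print(len(constraints))  # prints 9
```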

To demonstrate mathematica is easier, you need to demonstrate that it's
similarly easy to translate the problem from a verbal specification to an LP
problem. Solving LP is the easy part.

