
Which Simplex Method Do You Like? - luu
http://bob4er.blogspot.com/2015/01/which-simplex-method-do-you-like.html
======
larrydag
When I learned the simplex method in my Engineering classes I had to perform
it by hand for exams (only a couple of steps, or producing an updated matrix).
The tableau method was easy to write out and learn. Yet for computing
solutions in the practical world it is very computationally expensive. I would
solve in matrix form as the author suggests, since linear algebra has come a
long way since then.
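
For a sense of what "solve in matrix form" looks like today, here is a minimal sketch using SciPy's `linprog` (the problem data is made up for illustration); modern backends like HiGHS work directly on the matrix representation rather than a tableau:

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP: maximize x + 2y subject to x + y <= 4, x <= 3, x, y >= 0,
# passed to the solver as: minimize c @ x with A_ub @ x <= b_ub.
c = np.array([-1.0, -2.0])          # negate to turn max into min
A_ub = np.array([[1.0, 1.0],
                 [1.0, 0.0]])
b_ub = np.array([4.0, 3.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # default bounds are x >= 0
print(res.x, res.fun)  # optimum at (0, 4) with objective value -8
```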

Another note is that the simplex method is just one aspect of solving
optimization problems. Most computational solvers now use a series of
heuristics to get a "close enough" solution, then perform either a simplex or
gradient descent approach to get a final solution. For those who want to know
more about this research I highly recommend the field of Operations Research.
[https://www.or-exchange.org/](https://www.or-exchange.org/) is a good Q&A
site for details.

------
raverbashing
Interesting. I don't remember the exact steps of the simplex method (and to
be fair I remember only one matrix; but yes, if you want the dual you solve
it again).

I suspect there are still several methods that are implemented in an "old
way" and not optimized with modern knowledge of algorithmic and computing
efficiency, like optimizing for cache.

------
1971genocide
I am revising for my exam in two weeks and have been practising the tableau
method for a month now! Not sure if I should read this and get confused,
since I want to make sure to get good grades.

~~~
repsilat
Maybe wait until after your exam :)

The tableau method is in some sense the "bubble sort" of simplex algorithms.
It's simple, it works and it's easy to explain, but it's not something anyone
would use in practice.

That said, if you're not going to actually be implementing these things (and
few people are), it's more important early on to get practice reducing
problems to linear and integer programs than to know exactly how they're being
solved.
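
As a small example of that kind of reduction (the data here is made up): minimizing the maximum absolute deviation, min over c of max |a_i - c|, is not linear as stated, but introducing an auxiliary variable t turns it into an LP. A sketch with SciPy:

```python
import numpy as np
from scipy.optimize import linprog

a = np.array([1.0, 3.0, 7.0])  # hypothetical data points

# Variables: x = (c, t).  Minimize t subject to
#   c - a_i <= t  and  a_i - c <= t  for every data point,
# rewritten in A_ub @ x <= b_ub form.
obj = [0.0, 1.0]
ones = np.ones_like(a)
A_ub = np.vstack([np.column_stack([ones, -ones]),    # c - t <= a_i
                  np.column_stack([-ones, -ones])])  # -c - t <= -a_i
b_ub = np.concatenate([a, -a])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (0, None)])  # c is free, t >= 0
print(res.x)  # c = (min + max) / 2 = 4, t = (max - min) / 2 = 3
```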

~~~
1971genocide
Makes sense !

The tableau method when it was first introduced didn't seem like conventional
linear algebra.

I found it odd, though, that we learnt machine learning topics BEFORE we
learnt optimization theory. You would think that optimization theory would be
taught before anything in machine learning. I found many topics in machine
learning to be really opaque; it seemed to be just "try this model" without a
deep understanding of the mathematics or the reasoning behind them.

I think it's because of all the media hype surrounding ML and my peers
thinking it's "cool".

Linear, integer, non-linear, and dynamic programming really do fill the gap
between conventional linear algebra and many of the topics in ML. I wish
optimization theory were taught and emphasized more widely, since it's such
an amazing topic.

~~~
repsilat
A lot of people are suspicious of integer programming because it's NP-hard,
and there can be a bit of wizardry involved in formulating a problem well. It
doesn't help that the mainstream solvers are black boxes packed to the brim
with their own peculiar heuristics.
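
For what the formulation side looks like in practice, here is a toy 0/1 knapsack sketched with SciPy's `milp` front end to HiGHS (values, weights, and capacity are made up); the "wizardry" usually goes into choosing variables and constraints like these:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

values = np.array([6.0, 5.0, 4.0])   # hypothetical item values
weights = np.array([5.0, 4.0, 3.0])  # hypothetical item weights
capacity = 7.0

# One binary variable per item; maximize value = minimize -value.
res = milp(c=-values,
           constraints=LinearConstraint(weights.reshape(1, -1),
                                        ub=capacity),
           integrality=np.ones(3),   # all variables integer...
           bounds=Bounds(0, 1))      # ...and restricted to 0/1
print(res.x, -res.fun)  # picks items 2 and 3: value 9, weight 7
```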

I definitely agree that it's undersold, though.

Nonlinear programming I'm a bit more suspicious of, but mostly because I have
a lot less experience with it.

Dynamic programming, on the other hand, I don't think should be taught in the
same way at all -- to me it's something in the algorithmic toolbox, slightly
less vague than "divide and conquer", mostly just a way of thinking and
approaching problems. It's incredibly useful once you "get it", but I don't
trust anyone who tries to systematize it by trying to reduce it all down to
Markov decision processes and the Bellman equation. Maybe I can get behind
someone saying "Dynamic programming is upside-down memoization," though.
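
The "upside-down memoization" view can be made concrete with a toy coin-change problem (denominations made up): the top-down version is just recursion plus a cache, and the bottom-up version fills the same table starting from the base case:

```python
from functools import lru_cache

COINS = (1, 4, 5)  # hypothetical denominations

# Memoization: recurse from the goal, cache subproblem answers.
@lru_cache(maxsize=None)
def min_coins_td(amount):
    if amount == 0:
        return 0
    return 1 + min(min_coins_td(amount - c) for c in COINS if c <= amount)

# Dynamic programming: the same recurrence, filled "upside down"
# from the base case toward the goal.
def min_coins_bu(amount):
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        best[a] = 1 + min(best[a - c] for c in COINS if c <= a)
    return best[amount]

print(min_coins_td(8), min_coins_bu(8))  # both give 2 (8 = 4 + 4)
```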

In some sense all of these should "come before" machine learning because
they're often used in particular machine learning algorithms. On the other
hand, most ML pedagogy is at a higher level than all that, so maybe it's not
needed so much.

~~~
jules
Convex optimization is very robust and the algorithms are easy. In fact it is
far easier to implement a good interior point method for linear programming
than a good simplex method. Sadly, most schools only have students solve tons
of linear programming problems by hand with the tableau method. It's
incredibly boring and frustrating, and students don't learn anything.
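
To give a sense of what "easy to implement" means here, a bare-bones infeasible-start primal-dual interior point method for min c.x subject to Ax = b, x >= 0 fits in a few dozen lines. This is an illustrative sketch, not production code: fixed centering, no Mehrotra corrector, dense normal equations, toy data.

```python
import numpy as np

def interior_point_lp(A, b, c, tol=1e-8, max_iter=100):
    """Minimize c @ x s.t. A @ x = b, x >= 0 with a basic
    infeasible-start primal-dual interior point method."""
    m, n = A.shape
    x, s, y = np.ones(n), np.ones(n), np.zeros(m)  # x, s > 0; y free
    for _ in range(max_iter):
        r_p = A @ x - b              # primal infeasibility
        r_d = A.T @ y + s - c        # dual infeasibility
        mu = x @ s / n               # duality measure
        if max(np.linalg.norm(r_p), np.linalg.norm(r_d), mu) < tol:
            break
        sigma = 0.1                  # centering parameter
        # Newton direction via the normal equations A D A^T dy = rhs,
        # where D = diag(x / s).
        d = x / s
        dy = np.linalg.solve(A @ (d[:, None] * A.T),
                             -r_p - A @ (sigma * mu / s - x + d * r_d))
        ds = -r_d - A.T @ dy
        dx = sigma * mu / s - x - d * ds
        # Damped step: stay strictly inside the positive orthant.
        alpha = 1.0
        for v, dv in ((x, dx), (s, ds)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.99 * np.min(-v[neg] / dv[neg]))
        x, y, s = x + alpha * dx, y + alpha * dy, s + alpha * ds
    return x

# Toy problem: maximize x1 + 2*x2 s.t. x1 <= 4, x2 <= 3,
# in standard form with slack variables x3, x4.
A = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
b = np.array([4.0, 3.0])
c = np.array([-1.0, -2.0, 0.0, 0.0])
x = interior_point_lp(A, b, c)
print(x[:2], c @ x)  # approaches (4, 3) with objective -10
```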

~~~
wolfgke
Interior point methods become ugly if the feasible region is not full-
dimensional. Then you have to find the affine hull, find a basis of the
affine hull, project down, etc. For all these steps you have to pay attention
to the numerics. Not nice. So I think implementing a good interior point
method is only considered easy because you usually don't care about these
ugly parts. The simplex algorithm is less problematic, since the only ugly
situation that (you could imagine) occurs here is that the polyhedron has a
non-trivial lineality space (and thus no vertices). But this can't happen,
since at the beginning the simplex algorithm converts the polyhedron into
A x = b, x >= 0^n form, and this class of polyhedra is pointed, as you can
prove easily.
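
For completeness, the pointedness argument really is one line: a polyhedron is pointed iff its lineality space is trivial, and for standard form it is:

```latex
\operatorname{lineal}\{x : Ax = b,\ x \ge 0\}
  \;=\; \{d : Ad = 0,\ d \ge 0,\ -d \ge 0\}
  \;=\; \{0\}
```

so every nonempty polyhedron of this form has at least one vertex.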

Even in the nice case, interior point methods are only nice as long as all
you want is the optimum value. As soon as you also want properties from
polyhedral combinatorics for your solution (say, a feasible basis), interior
point methods become really ugly, especially if the face where the optimum
is attained is not just a vertex.

Disclaimer: I'm working in mathematical optimization, but not on interior
point methods.

~~~
jules
What do you mean by a not-full-dimensional feasible region? Interior point
methods can handle equality constraints just fine, so do you mean two
inequality constraints that conspire to form an equality? I think any method
that uses inexact arithmetic will have trouble with that.

The very nice feature of interior point methods is that they also work for
non-linear optimization, which has far broader applicability. That's why in my
opinion courses that teach simplex but not interior point are outdated. With
interior point you simply get far more usefulness in fewer lectures, so if you
have to make a choice, make it interior point.

~~~
wolfgke
> so do you mean two inequality constraints that conspire to be an equality?

Exactly. This can be solved by the described method (but there are probably
better ones).

> The very nice feature of interior point methods is that they also work for
> non-linear optimization, which has far broader applicability.

I would dispute the broader applicability if adding a constraint after
finding an optimal solution requires a completely fresh start (the dual
simplex algorithm doesn't have this problem). This is very important for
cutting-plane algorithms for solving (mixed-)integer linear programs.

~~~
jules
Interior point methods can benefit greatly from a warm start too. You simply
initialize with the previous optimal point and start from there.

For integer linear programming, sure, you should probably use simplex for
solving the relaxation. I'm talking about continuous problems in engineering,
statistics, machine learning, etcetera. Very few of those problems are
linear, and you don't care at all about the corner case you described. That
is a completely different world from combinatorial optimization. In
continuous optimization, if you have two or more inequality constraints that
make the feasible set lower-dimensional, that simply means your problem
statement is wrong: the parameters are not exact, and with a small variation
in the parameters your feasible set will become empty.

------
bjornsing
I was taught the variant with two linear systems at each step. I don't
remember the exact details, but the general concept of a "dual to the problem"
made an impression on me.

