
Kotlin and linear programming - andrewtorkbaker
https://tomstechnicalblog.blogspot.com/2018/01/kotlinforoperationalplanningandoptimiza.html
======
graycat
> “Linear” means continuous

No. As in G. Simmons, the two pillars of the field of 'analysis' in math are
linearity and continuity. The two are quite different.

Linearity usually has to do with numbers and vectors. The numbers are usually
in the set R of real numbers or the set C of complex numbers. Commonly the
numbers are called scalars. Then, a function f is 'linear' provided for
scalars a and b and vectors x and y we have

f(ax + by) = af(x) + bf(y)

The role, the utility? It's an enormous, powerful simplification. In particular, it is crucial to (high school style) systems of linear equations; to 'superposition' in quantum mechanics, which is essentially linearity; and to linear operators, as in the classic Dunford and Schwartz, e.g., in the results of S. Banach. In calculus, differentiation and integration are both linear operators.

And, in particular, linearity is crucial in both linear programming and
integer linear programming. Local linear approximations are crucial in non-
linear optimization, e.g., the Kuhn-Tucker conditions and their associated
constraint qualifications.

Continuity is also a biggie: e.g., a function continuous on a compact set, e.g., [0,1], is also uniformly continuous and bounded, and it achieves its greatest lower bound and least upper bound. Continuity, compactness, and uniform continuity are the standard assumptions that guarantee that the Riemann integral of freshman calculus exists.

Maybe what was meant was: usually when we mention linear programming we have, say, find x to solve Ax = b, x >= 0, where x, for some positive integer n, is n x 1 and, thus, lies in R^n for the set of real numbers R. Here A is a matrix, say m x n for some positive integer m. Then, from the very start of matrix theory, for real a and b and n x 1 vectors x and y we have

A(ax + by) = aAx + bAy

which says that A is a linear function (operator, transformation, etc.).
That's the 'linearity' in linear programming. And it holds in linear integer
programming where we ask that some or all of the components of x be integers.
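
Since the article's language is Kotlin, that identity is easy to check numerically with plain arrays. A toy sketch (the matrix and vectors below are arbitrary examples, not from the article):

```kotlin
// Multiply an m x n matrix (array of rows) by an n x 1 vector.
fun matVec(A: Array<DoubleArray>, v: DoubleArray): DoubleArray =
    DoubleArray(A.size) { i -> A[i].indices.sumOf { j -> A[i][j] * v[j] } }

fun main() {
    val A = arrayOf(doubleArrayOf(1.0, 2.0), doubleArrayOf(3.0, 4.0))
    val x = doubleArrayOf(1.0, -1.0)
    val y = doubleArrayOf(0.5, 2.0)
    val a = 3.0
    val b = -2.0

    // Left side: A(ax + by)
    val lhs = matVec(A, DoubleArray(x.size) { i -> a * x[i] + b * y[i] })

    // Right side: a(Ax) + b(Ay)
    val ax = matVec(A, x)
    val ay = matVec(A, y)
    val rhs = DoubleArray(ax.size) { i -> a * ax[i] + b * ay[i] }

    println(lhs.contentEquals(rhs))  // prints "true" (exact for these values)
}
```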

~~~
sevensor
> Maybe what was meant was usually when we mention linear programming we have,
> say, find x to solve Ax = b, x >= 0

Close. Linear programming is: minimize z = cx subject to Ax <= b, with the customary x >= 0 constraints folded into A and b. Linear programs can be solved by the simplex method or by interior point methods. Integer linear programming constrains x to the integers, and mixed integer linear programming constrains only some of the components of x to the integers. (x is a vector.) Integer linear programs are often solved by branch and bound or branch and cut. The example in the article isn't great for these methods, for a couple of reasons.

1. There's nothing to optimize. This is an assignment problem that's only
about feasibility.

2. The constraints are pretty restrictive. Tree-based search can really spin
its wheels trying to find feasible solutions.

Edit to add: Ax <= b is a system of linear inequalities, and z=cx is a linear
equation. That's where the linearity comes in.

~~~
graycat
No. I'm correct: not only can we do linear programming with equality
constraints Ax = b; for the simplex algorithm, that is what we must do. To
convert a linear inequality to an equivalent linear equality, we use a non-
negative slack or surplus variable. To get an initial feasible solution, we
append artificial variables and then use the simplex algorithm to drive them
to zero and out of the problem. Then all we have left are the original
variables and the slack and surplus variables. In that case, we know the
problem is feasible, and as the OP mentioned, sometimes that is enough.
Actually, in principle, finding a feasible solution is no easier than finding
an optimal solution starting from a feasible one. That is, feasibility alone
is not trivial. The field of constraint programming is basically looking for
feasible solutions and, so, is not really much easier, or different in the
work to be done, from optimization.
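
The mechanical part of that conversion is simple enough to sketch in Kotlin (an invented helper, not from any LP library): appending one non-negative slack variable per row turns Ax <= b into the equality form [A | I][x; s] = b.

```kotlin
// Append one slack column per row: row i of Ax <= b becomes
// (Ax)_i + s_i = b_i with s_i >= 0, i.e. the augmented matrix [A | I].
fun withSlacks(A: Array<DoubleArray>): Array<DoubleArray> =
    Array(A.size) { i ->
        DoubleArray(A[i].size + A.size) { j ->
            when {
                j < A[i].size      -> A[i][j]  // original coefficients
                j - A[i].size == i -> 1.0      // this row's slack variable
                else               -> 0.0
            }
        }
    }

fun main() {
    // x1 + 2*x2 <= 8 and 3*x1 + x2 <= 6 become equalities with s1, s2:
    val A = arrayOf(doubleArrayOf(1.0, 2.0), doubleArrayOf(3.0, 1.0))
    withSlacks(A).forEach { row -> println(row.joinToString()) }
    // prints:
    // 1.0, 2.0, 1.0, 0.0
    // 3.0, 1.0, 0.0, 1.0
}
```

A surplus variable for a >= row is the same idea with -1.0 in place of 1.0.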

No. Sure, the objective function z = cx is linear. But far and away, the real
power that makes linear programming work is the linearity of the constraints,
that is, of the map x -> Ax. Then the feasible region is a finite intersection
of closed half-spaces and is convex with flat sides and extreme points. To
find optimal solutions, it is sufficient to look only at the extreme points,
and there are only finitely many of those.

We can do a lot of relaxing of the objective function and still do well;
relaxing the linearity of the constraints promises to give us much more
trouble.

------
sevensor
Adding to my other comment, my biggest dissatisfaction with this article is
that all it did was show the quite excessively verbose problem setup, without
actually showing an algorithm for integer linear programming. An interesting
post would have explained how to _implement_ branch and bound in Kotlin, not
how to call somebody else's library. Needless to say, I'm not sold on Kotlin
from reading this.
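
For what it's worth, the skeleton of branch and bound fits comfortably in Kotlin. Here is a toy sketch on 0/1 knapsack rather than a full integer LP (the fractional relaxation plays the role the LP relaxation plays in integer linear programming; all data and names below are made up):

```kotlin
data class Item(val value: Double, val weight: Double)

// Optimistic bound from the fractional (continuous) relaxation: fill the
// remaining capacity greedily by value density, allowing one partial item.
fun fractionalBound(items: List<Item>, from: Int, cap: Double, valueSoFar: Double): Double {
    var bound = valueSoFar
    var remaining = cap
    for (k in from until items.size) {
        if (items[k].weight <= remaining) {
            bound += items[k].value
            remaining -= items[k].weight
        } else {
            bound += items[k].value * remaining / items[k].weight
            break
        }
    }
    return bound
}

fun branchAndBound(itemsIn: List<Item>, capacity: Double): Double {
    // Sort by value density so the greedy relaxation bound is valid.
    val items = itemsIn.sortedByDescending { it.value / it.weight }
    var best = 0.0
    fun go(i: Int, cap: Double, value: Double) {
        if (value > best) best = value
        if (i == items.size) return
        if (fractionalBound(items, i, cap, value) <= best) return  // prune this subtree
        if (items[i].weight <= cap)                                // branch 1: take item i
            go(i + 1, cap - items[i].weight, value + items[i].value)
        go(i + 1, cap, value)                                      // branch 2: skip item i
    }
    go(0, capacity, 0.0)
    return best
}

fun main() {
    val items = listOf(Item(60.0, 10.0), Item(100.0, 20.0), Item(120.0, 30.0))
    println(branchAndBound(items, 50.0))  // prints 220.0
}
```

The structure carries over to integer LP: replace the greedy bound with an LP relaxation solve and branch on a fractional variable instead of on take/skip.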

~~~
deepsun
Also, I believe no one is doing linear programming without vectorized
operations (SIMD) nowadays.

I know the JVM optimizes small methods, so maybe its JIT optimizer does that
automatically, but I'm not sure that optimizer is better than manually
optimized code like in numpy.

~~~
MaxBarraclough
I suspect you're right. Java doesn't have the best track record when it comes
to SIMD.

Unlike .Net with System.Numerics.Vectors, they're not even interested in
making a platform-independent SIMD library for manual optimisation. The JVM
holds contempt for such real-world optimisations. I see that Intel have made
one though - [https://software.intel.com/en-us/articles/vector-api-developer-program-for-java](https://software.intel.com/en-us/articles/vector-api-developer-program-for-java)

At a glance, it looks like Oracle have dabbled with AVX in HotSpot, but aren't
taking it very seriously.
[https://www.google.com/search?q="-XX%3AUseAVX"](https://www.google.com/search?q="-XX%3AUseAVX")

------
cschmidt
If anyone is interested in solving LP's, I'd suggest taking a look at the JuMP
library in Julia. It has a lovely interface for specifying math programming
problems.

[http://www.juliaopt.org/JuMP.jl/0.17/index.html](http://www.juliaopt.org/JuMP.jl/0.17/index.html)

------
zmmmmm
Without wishing to disparage the article at all (I really love this kind of
article, including this specific one) ...

I feel like Kotlin is almost exactly the wrong language in this space. You
either go all the way to a more powerfully typed language where Scala has
grabbed mindshare, or you go fully towards dynamic languages like Python or
(my favorite, even though it has almost no mindshare in this space), Groovy.
It really feels like all Kotlin did here was add verbosity and cloud the
actual question being answered with more syntax.

~~~
RhodesianHunter
In what way is Scala "more powerfully typed"?

~~~
zmmmmm
This kind of thing:

[https://www.atlassian.com/blog/archives/scala-types-of-a-higher-kind](https://www.atlassian.com/blog/archives/scala-types-of-a-higher-kind)

------
bitL
Not sure why Kotlin would be picking this fight - Python and to some extent R
already dominate in data science with a little bit of Scala added in Spark -
where does author see an opening for Kotlin? For high performance libraries
nobody would pick Python or any JVM-based language either, and that's what
most of the wrappers end up calling anyway (C++, CUDA, Fortran, OpenCL).

~~~
SmirkingRevenge
Python is often a second-class citizen in these areas, even if much is made
of its widespread support in the data science realm. I don't want to say its
hype is overblown... but a lot of the pain points and cracks in the seams get
glossed over.

Take Spark, for instance. You run into extreme performance issues the second
your data has to be serialized to cross the py4j gap. And many essential parts
of its API require Scala/Java (presumably Kotlin ought to work as well).

Similar situations occur all across the big data and cloud realm with Python.
And even today you'll still run into situations where the whizz-bang data
science ML library that solves your exact problem is, for some reason, Python
2.7 only. Thankfully that situation is getting rarer (there really isn't an
excuse for it today) - but it's still there.

In any case, a "not-Java" language that can talk to Java is freakin' amazing,
in my book (Scala doesn't scratch my itch there - it's far too clever - had
enough of that with Perl back in the day).

~~~
lmm
Scala is a lot simpler than Kotlin once you get into the details - it uses a
few simple but very general features rather than a lot of ad-hoc language-
level functionality. A lot of the time what looks like some complex construct
in Scala is actually two or three separate features combining in a way that
makes perfect sense once you look at the pieces, and you can click through to
see how the thing is implemented in plain Scala. (Indeed I'd say Kotlin is far
more perl-like - its design deliberately emphasises immediate developer
convenience at the expense of having a coherent underlying model)

~~~
SmirkingRevenge
Interesting perspective, thanks. I'm not proficient with Scala, and most of my
attempts to learn more of it were to be able to read some bit of code in a
tool or library that was giving me trouble. So... grains of salt and all that.

That being said, having gone through at least one book and having read lots of
code, I still find it very difficult to parse in my brain.

It _feels_ more complex to me, because it seems like there are so many
implicit things going on, where something like Kotlin makes an effort to be
explicit (yet still strikes a nice level of conciseness). But that could also
be my inexperience with Scala.

~~~
lmm
Definitely read using an IDE rather than a text editor if you weren't already.
Scala is the first language I've seen to really make intelligent use of the
GUI - not in a dragging-boxes-and-arrows way, but with implicits that are
visible subtly (green underline, expandable by mousing over) but not there by
default. The way I see it, it's a novel middle ground for things that you
don't want to be completely invisible, but don't want taking up a lot of space
either, e.g. error handling (exceptions are completely invisible, go-style
explicit error handling is too verbose) or access to mutable state, or async
operations, or... (the for/yield sugar is similar, with a lightweight "<-"/"="
distinction between effectful operations and not, and then you mouseover to
see what the specific effect is). Working with inferred types is also easier
when you can mouseover or hit a hotkey to get the type of any expression.

Having different syntax for different kinds of effect as Kotlin does could be
seen as more explicit I guess, but I find it actually makes it harder to work
with - you have a lot more syntax to keep in your head, and you can't write
generic code that works with multiple different effects. Other than that I
don't think there's anything more explicit in Kotlin, and some things - error
handling in particular - are definitely less explicit.

------
aisofteng
> Linear programming (also called linear optimization) is an applied field of
mathematics often used in operations research and planning. It attempts to
find an optimal solution to a planning problem when a set of business
constraints exist.

No, linear programming is an applied field of mathematics that attempts to
find an optimal solution to a goal within a system which is under some sort of
constraint(s). Whether those constraints are business constraints is
irrelevant; the theory exists outside of any particular application.

------
stuaxo
I kind of glazed over when I saw all the algebraic notation. I'm sure there's
a simple concept buried in here somewhere.

------
RacerRex9727
Part II of this article series does a real-world application.

[http://tomstechnicalblog.blogspot.com/2018/01/kotlin-for-linear-programming-part-ii.html](http://tomstechnicalblog.blogspot.com/2018/01/kotlin-for-linear-programming-part-ii.html)

------
santialbo
Just adding that there are tricks that allow adding (apparently) non-linear
conditions to the problem, such as variables that activate or not based on
other conditions, or even if-then-else statements.

Without these tricks the technique is pretty limited for solving real-world
problems.
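
One standard example of such a trick (textbook modeling folklore, not anything specific to the article): to allow x > 0 only when a binary switch y is on, given a known upper bound U on x, couple them with a single linear constraint:

```
x <= U * y,   with y in {0, 1} and 0 <= x <= U

y = 0  forces  x = 0
y = 1  relaxes the constraint to x <= U
```

If-then-else between linear constraints works similarly with a "big-M" constant: a constraint of the form cx <= d + M * (1 - y) is enforced when y = 1 and made vacuous when y = 0, provided M is chosen large enough. The model stays linear because y only ever enters linearly.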

------
tw1010
Good article, but man, no love was given to the math typesetting.

