
Show HN: Python 3 TSP solver based on LKH (cross platform) - pyentropy
https://github.com/dimitrovskif/elkai
======
rav
Looks like _elkai.c implements a CPython-API function "ElkSolve" that parses
the Python lists into a C array, calls a pure-C function "InvokeSolver", and
puts the result of "InvokeSolver" into a Python list.

It should be easy to implement "ElkSolve" in pure Python using ctypes to call
a pure-C library that exposes the "InvokeSolver" function. That would make it
possible to distribute a single Python wheel that works on both Python 2 and
3, across all minor versions of Python 3.

Currently, elkai works on 3.5, 3.6 and 3.7, but the build system requires a
separate build artifact on each, and when Python 3.8 comes out, elkai won't be
available until the author builds it for 3.8. With a pure-C library, elkai
would be immediately available in new Python versions.
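For what it's worth, the ctypes pattern is roughly the following sketch. Loading a hypothetical libelkai and calling its "InvokeSolver" would look the same; since that library isn't built here, the runnable demo calls abs() from the C standard library instead.

```python
# Sketch of the ctypes approach. A real binding would do
# ctypes.CDLL("libelkai.so") and declare InvokeSolver's actual signature;
# here we load the process's own libc symbols so the example runs as-is.
import ctypes

libc = ctypes.CDLL(None)  # on Linux/macOS, exposes the C standard library
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

# The list -> C array -> list round trip that _elkai.c's "ElkSolve" does in C:
values = [3, -1, 4, -1, 5]
c_array = (ctypes.c_int * len(values))(*values)
results = [libc.abs(v) for v in c_array]
print(results)  # [3, 1, 4, 1, 5]
```

The ctypes layer runs unchanged on any CPython version (and on PyPy), which is what makes the single-wheel distribution possible.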

~~~
pyentropy
Correct, but you will have to find/build the shared library. Also, as soon as
the build systems support 3.8, I will add 3.8 wheels.

I tried to support all the 3.x versions that numpy supports:
[https://pypi.org/project/numpy/#files](https://pypi.org/project/numpy/#files)

~~~
rav
I've implemented my suggestion and sent you a pull request:
[https://github.com/pyEntropy/elkai/pull/3](https://github.com/pyEntropy/elkai/pull/3)

The resulting wheel contains the shared library (using milksnake), so there's
no need for the user to install that separately.

Let me know what you think. If you're unhappy about adding Python 2 support to
your library, you can make 'runs' a keyword-only argument to solve_int_matrix
;-)
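(For reference, the keyword-only trick is just a bare * in the signature, which is Python-3-only syntax. The body below is a placeholder for illustration, not elkai's actual solver.)

```python
# A bare "*" makes 'runs' keyword-only; this is a SyntaxError on Python 2,
# so shipping it makes the package Python-3-only.
def solve_int_matrix(matrix, *, runs=10):
    # placeholder body; in elkai the real work happens in the C solver
    return list(range(len(matrix)))

solve_int_matrix([[0, 4], [4, 0]], runs=2)   # OK
# solve_int_matrix([[0, 4], [4, 0]], 2)      # TypeError: 'runs' is keyword-only
```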

------
enriquto
I love the "without dependencies" idea. Notice, however, that the algorithm is
implemented in C, and it (somewhat strangely) requires cmake to build. I say
strangely because, being C source with no dependencies itself, it could be
compiled just as easily without cmake.

~~~
derefr
cmake isn't just a make tool; it also serves as a replacement for autotools in
the projects that use it. I would presume that that's its purpose
here—detecting system libraries and syscall availability and such.

~~~
enriquto
my point is that in this case there are no libraries to be "detected" at all.
This is just vanilla ansi C without any requirement besides the standard C
library. This can be compiled anywhere without need to "detect" anything.

EDIT: I put "detect" in quotes, because I find the whole concept of detecting
libraries ridiculous in a context of a C program. Either a library XYZ is
installed on my system, in which case the compiler finds it with the -lXYZ
flag, or it isn't. If it is not installed, I expect the compilation to fail. I
do not expect the build system to look in my hard disk by itself and try to
find some files that look like the XYZ library; that would be extremely
untoward.

~~~
derefr
> I find the whole concept of detecting libraries ridiculous in a context of a
> C program. Either a library XYZ is installed on my system, in which case the
> compiler finds it with the -lXYZ flag, or it isn't. If it is not installed,
> I expect the compilation to fail.

Things autoconf/cmake does:

• decides whether the system version of a lib is "good enough" to use in place
of a vendored copy (e.g. the vendored libxml in Ruby's Nokogiri)

• detects libraries (and system calls) that nominally obey some standard, but
use nonconformant names/types for their implementation of the standard; and
generates shim source files which wrap these functions to give them the
correct name/type to allow your code to blindly link to the standard symbol
and have it "just work" on the system in question.

• generates header files full of defines saying which _optional_
(system/library) features were discovered as being available or not in the
linked version of the (system/library). You can then use these defines to
decide which implementation to use in your code, in such a way that you won't
ever be attempting to reference a (static) symbol that doesn't exist in the
library you ended up linking to.

• for entirely nonstandard APIs (e.g. kernel semaphores), lets you just use
the API of the version you know, and then shims it into the totally different
API used by this particular system. Or, on systems that sometimes have both an
advanced implementation and a basic one, and sometimes only have the basic
version (such as Linux pre/post introduction of io_uring), generates a shim to
translate the advanced stuff to the basic stuff on systems where the advanced
stuff isn't available.

The goal of projects like autotools and cmake is to _make code portable to
every platform they can_, even if the platform is utter bollocks. They do a
bunch of seemingly-silly stuff so that e.g. you can write code that calls into
POSIX-standardized system facilities, developing solely on Linux; and then
someone can download your project on macOS and it'll turn out to compile and
run perfectly, without you ever having tested on macOS—even though macOS
doesn't even _have_ the system facilities you called into.

Or, to put that another way: these build tools are to C as "ECMA$n-shim.js"
is to JavaScript.
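The HAVE_* defines in the third bullet even have a rough Python analogue: probe once for an optional facility, then branch on the result. (os.posix_fadvise is just a stand-in example here; it exists on Linux but not on every platform.)

```python
import os

# Analogue of an autoconf-generated "#define HAVE_POSIX_FADVISE 1":
HAVE_POSIX_FADVISE = hasattr(os, "posix_fadvise")

def prefetch(fd, offset, length):
    """Hint the kernel to read ahead, or no-op where the call doesn't exist."""
    if HAVE_POSIX_FADVISE:
        os.posix_fadvise(fd, offset, length, os.POSIX_FADV_WILLNEED)
    # else: fall back silently, exactly like a generated shim would
```

Callers use prefetch() everywhere and never reference a symbol that might not exist on the target system, which is the same guarantee the generated C header gives.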

~~~
enriquto
> Things autoconf/cmake does: (...)

All these things are horrifying! The problem does not seem to be "bollocks"
platforms, but bollocks programs. If your code really depends on a specific
version of a standard library, shame on you.

Anyhow, the code in question here does not need any more libraries than
"hello.c", which is naturally portable to any platform without the humongous
help of cmake.

------
WoodenChair
I would think "without dependencies" would mean that this was an original
implementation of an algorithm. In fact, this is a wrapper around a C library
that someone else wrote, so I think the title is misleading.

------
j88439h84
Use cffi for calling C.
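A minimal ABI-mode cffi sketch, using libc's abs as a stand-in. A real binding would cdef InvokeSolver's signature and dlopen the solver's shared library instead (this requires the cffi package to be installed).

```python
from cffi import FFI  # third-party: pip install cffi

ffi = FFI()
ffi.cdef("int abs(int);")   # declare the C signature we want to call
lib = ffi.dlopen(None)      # None loads the C standard library's symbols

print(lib.abs(-42))  # 42
```

Compared to ctypes, cffi parses real C declarations, so the InvokeSolver prototype could be pasted in verbatim rather than rebuilt from argtypes/restype.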

