
There's no way to depend on specific versions of C or C++ libraries. The actual binaries vary depending on architecture, compilation toolchain, compiler flags, link options, and all sorts of other things. There's really no way to address those build artifacts even if they were available on pypi or something.


There are a couple of different ways.

1. Build the artifact that you depend on, publish it on "pypi or something" referenced by both the version of the library and a hash that uniquely identifies the build. This is the approach taken by conda, for better or worse.

2. Only allow one canonical build of a library you depend on so that the "version" becomes a unique identifier for the build. This is the approach taken by distribution package managers.

3. Create metapackages that describe ABI constraints that are required by packages and must be satisfied by the underlying system. For example, a "cxxabi" package could be provided by the underlying system, and the packaging tools could automatically add dependencies on "cxxabi" at build time, based on either an exact pin to the library built against, or in some cases, a relaxed dependency by inspection of versioned symbols used by the binary.

4. Statically link all your dependencies and/or vendor all your dependencies. This approach is used by quite a few pypi packages that depend on standalone C libraries, and it avoids most of the issues altogether.
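To make option 1 concrete, here's a minimal sketch of how a version-plus-build-hash identifier could work. The naming scheme and `build_id` function are hypothetical (loosely modeled on conda's `name-version-hash_buildnum` file names, not its actual implementation): the key idea is that the identifier covers the artifact's bytes, so two builds of the same source version with different flags get distinct, addressable names.

```python
import hashlib

def build_id(name: str, version: str, artifact: bytes, build_number: int = 0) -> str:
    """Derive a hypothetical conda-style identifier:
    <name>-<version>-h<short hash>_<build number>.
    Hashing the built artifact means the 'version' alone no longer
    has to uniquely identify the build."""
    digest = hashlib.sha256(artifact).hexdigest()[:7]
    return f"{name}-{version}-h{digest}_{build_number}"

# Same library version, different compiler flags -> different identifiers:
a = build_id("libfoo", "1.2.3", b"\x7fELF...built with -O2")
b = build_id("libfoo", "1.2.3", b"\x7fELF...built with -O3")
```

A resolver can then pin `libfoo-1.2.3-h<hash>_0` exactly, rather than hoping "1.2.3" means the same binary everywhere.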


Of course, all of these either have flaws [1] or are so detailed that they're distributed build caches with more steps. You can hash project source files, all build commands, all textually included headers, precise versions of toolchains, etc., into a Merkle tree, but this is not generally how Python applications pin "versions" of dependencies.
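The Merkle-tree idea above can be sketched in a few lines. This is an illustrative toy, not any real build system's hashing scheme (the leaf contents and the odd-level duplication rule are assumptions for the example); the point is that any change to any build input, a flag, a header, the toolchain, changes the root hash.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a list of build inputs into a single root hash by
    pairwise-hashing up the tree."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical build inputs: sources, headers, commands, toolchain version.
inputs = [
    b"src/foo.c: int foo(void) { return 42; }",
    b"include/foo.h: int foo(void);",
    b"cc -O2 -c src/foo.c",
    b"toolchain: gcc 13.2.0",
]
root = merkle_root(inputs)
```

Flipping a single flag (`-O2` to `-O3`) in `inputs` yields a different root, which is exactly why a cache keyed on this root can safely identify "the same build".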

[1] For instance, you cannot version a C or C++ library build independent from the versions of all of the transitive dependencies of that library (more or less). Of the options listed here, only distribution package managers can really account for this problem, and not every distribution package manager cares to.


My view is that treating binary package distribution/archival as really just a distributed build cache is the only real way to go, and that's exactly why pypi and its associated tooling have so many flaws.


Well, there is: It's Nix's buildPythonApplication.

Trying to use Python in Nix is frequently a bother, but at least once it works, it works.



