
Does C++ need a universal package manager? - signa11
http://pfultz2.com/blog/2017/10/27/universal-package-manager/
======
rdtsc
Wonder if the Nix package manager would work here. It is not strictly tied to
the OS, unlike deb, rpm, etc. It should work in most Linux distributions and on
macOS. And it is not tied to a single language. Mono-language package managers
work for a while, then all of a sudden you need to deploy other languages /
frameworks and it becomes awkward.

Moreover, it emphasizes reproducible builds and seems to handle dependencies
in a sane way.

~~~
arximboldi
I came here to say the same. I have recently started using Nix as a universal
package manager; every project I start begins with a `shell.nix` file with a
pinned <nixpkgs> commit and a few lines to bring in gcc or make. I always build
in `nix-shell --pure` to make sure I didn't miss anything. The system is so
great and works so beautifully it's hard to explain.

Nix has very sensible defaults, and defining a Nix expression to install a
cmake or autotools package is a breeze. But even projects with just Make or no
build system at all are easy to install: you just need to copy the output to
the /nix/store (to $out) in the installPhase. And I can easily define Python or
any other language dependencies in the same consistent and concise syntax. I
no longer fear that I won't be able to install dependencies in the future.
Any other dev just runs `nix-shell` in the project root and they know they are
ready to go. I even used to hate the Nix syntax (as opposed to Guix) but at
some point it "clicked" and now it makes a lot of sense.
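
For illustration, a minimal `shell.nix` along those lines might look like the
sketch below. The nixpkgs archive URL contains a placeholder revision and the
tool list is just an example, not from any particular project:

```nix
let
  # Pin nixpkgs to a specific commit for reproducibility.
  # "<rev>" is a placeholder; substitute a real commit hash.
  pkgs = import (builtins.fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/<rev>.tar.gz") {};
in
pkgs.stdenv.mkDerivation {
  name = "dev-env";
  # Tools made available inside the nix-shell
  buildInputs = [ pkgs.gcc pkgs.gnumake pkgs.cmake ];
}
```

Running `nix-shell --pure` in the project root then gives every developer the
same compilers and tools, independent of what the host system has installed.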

The only drawback is the lack of Windows support. Maybe someone at Microsoft
could help with that? It would be so sad for the community to converge on a
suboptimal solution just because of Windows... :/

~~~
twic
This sounds like it could be useful for me. I've never so much as looked at
Nix; if i go and read the standard documentation, will i learn how to do this?
Or is it a creative application of the tools?

Does Nix make it easy, or even possible, to bring in dependencies from legacy
sources on the network? For example, my company has an FTP server full of
carefully-packaged, versioned libraries in tarballs, and it would be great to
be able to use those, rather than having to repackage everything.

~~~
arximboldi
To your last question: yes! In a "Nix expression" defining how to build the
package (i.e. a "derivation"), you can specify where the source is fetched
from via any mechanism (e.g. http, ftp, git, etc.)
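
As a sketch (the URL, package name, version and hash below are placeholders,
not a real package), a derivation that pulls a tarball from an FTP server
could look like:

```nix
pkgs.stdenv.mkDerivation {
  name = "widget-1.2.3";
  # fetchurl understands http, https and ftp; the hash pins the
  # exact contents of the tarball
  src = pkgs.fetchurl {
    url = "ftp://ftp.example.com/libs/widget-1.2.3.tar.gz";
    sha256 = "<sha256 of the tarball>";
  };
  # stdenv's generic builder then runs ./configure, make and
  # make install (into $out) by default
}
```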

The documentation is good but can sometimes be a bit rough. There are some
good blog posts on how to achieve specific things that you may find when
searching around. Also, the IRC community seems quite active, responsive and
friendly :-)

------
emsy
I'm surprised the author forgot to consider platform architectures, because
otherwise he has the most important aspects of a package manager down.

Unfortunately, it is still terribly easy to write non-portable code in C++.
I think this is the main reason we still don't have a widely used package
manager for C++, but your mileage may vary.

~~~
pfultz2
> I'm surprised the author forgot to consider platform architectures

That is mainly a concern for binary distribution. Most C++ package managers
are currently source-based, because binary compatibility is not an easy
problem (there are some package managers trying to solve it, though).

> Unfortunately, it is still terribly easy to write non-portable code with C++

Yes, it is easy to call into a platform-specific API, so cross-platform
portability is limited by the testing resources of the package maintainer or
library author.

~~~
emsy
I was talking about source code, not binaries. And I didn't mean platform
specific APIs either (though that is a problem as well). I was talking about
differences between processor architectures. But while we're at it, try using
wstrings or snprintf across Windows/Linux.

------
gmueckl
C++ cannot be covered by package management in the same way interpreted
languages are. The problem space is vastly more complex. It starts with
optional language features that often get disabled in projects (most often
RTTI and exceptions), continues with incompatible ABIs between compilers and
even compiler revisions, and does not stop at the sometimes countless compiler
flags for optional features and dependencies.

For all of these reasons it is not possible in the general case (the one that
package management needs to target) to simply distribute binaries.

Linux distributions are not a working example here because the libraries are
built for the use of the other packages in the same repository. Needs of third
parties are never considered because finding library configurations that work
for the packages in the distro is hard enough.

A platform independent package manager for C++ must effectively degrade into a
build system that rebuilds all dependencies from source. And it must allow
build parameters for each step to be defined by the dependent libraries or
programs, so that the result is something that can be linked together
successfully.

~~~
cjwoodall
To be fair, some modern compiled languages are covered by package managers:
Rust and Cargo, for example. Admittedly, there are constructs built into the
language and its design that have made this plausible since early on.

~~~
kibwen
Even above any technical aspect, the most important thing about Cargo's
success is that it was the official default from day one. There's hardly any
question about how packages should be built or how dependencies should
operate. And indeed, Cargo does encourage (though not require) everything to
be built from source, which is probably intentional given Mozilla's stance on
OSS.

~~~
steveklabnik
It has nothing to do with Mozilla, specifically. Binary packages are very
tricky. I'm sure Cargo will figure it out and eventually grow support, but
this isn't some sort of top-down policy.

~~~
crzwdjk
It would be kind of difficult to have binary packages unless you have a stable
ABI, and Rust doesn't have that yet. If and when it does get a stable ABI,
binary packages would at least become a possibility. There are good reasons
why Rust didn't want to commit to a stable ABI initially, but perhaps at some
point the benefits will outweigh the costs.

~~~
steveklabnik
You can do it, it just adds another axis to the matrix of things you have to
support. It's a significant drawback.

Regardless, the point is that all of this is technical and has nothing to do
with Mozilla's love of open source.

------
Jgrubb
I think every programming language would benefit from a universal package
manager.

~~~
exDM69
... which doesn't exist yet. Maybe Nix or Guix could work, but their adoption
is rather low at the moment.

~~~
pjmlp
Only if it existed on every kind of OS, including those without any kind of
UNIX-like layer.

------
MichaelMoser123
What about the C and C++ runtime library (together with its quirks, platform
incompatibilities and different versions for every build of some OS)? This
little dependency alone will require you to recompile every nontrivial piece
of software from source, or to have something like Debian, which needs the
source for every component.

~~~
gumby
I don’t see why a recompile should be required — that’s why we have platform
ABIs. I routinely interlink g++-compiled and clang++-compiled code without
issue.

~~~
MichaelMoser123
Try using a library compiled on a slightly different version of Linux.

------
singularity2001
In a way, apt and yum are package managers for C++.

~~~
exDM69
Nope, they solve different problems. The OS package manager (apt, etc) is
there to provide you the OS install with a set of _applications_ and a set of
libraries _to support those applications_.

Your development, on the other hand, will depend on a _different set_ of
dependency libraries (and this must be reproducible!). This may include stuff
like having several different versions of a library (for different projects or
different branches or build configs of the same project). This set of
libraries will sometimes conflict with what you need for the apps the OS
provides.

Because you never want to break your day-to-day applications supported by the
OS package manager, you'll need a different way of installing your dependency
libs. This is what language package managers do.

Right now, the alternatives for C and C++ are 1) bundling the sources with
your own, 2) using version control (e.g. git submodules), 3) make install
--prefix=$HOME/foo and configuring your build for that, and 4) single-header
libraries. None of these work particularly well compared to a proper
development package manager like Cargo.

~~~
Shorel
In practice, there are libraries and also dev-libraries which add the headers
and link objects needed to use those same libraries.

IMO you use submodules only for libraries not already included in your package
manager.

~~~
exDM69
> In practice, there are libraries and also dev-libraries which actually add
> headers and link-objects to use these same libraries.

When developing software, you need both installed. Not every distro has -dev
packages to install headers.

> IMO you use submodules only for libraries not already included in your
> package manager.

This approach is full of problems, most importantly that this isn't reliably
reproducible (without using docker images or something for a reproducible
environment). _Your_ package manager may have different libs or different
versions of libs than another developer's package manager.

But the bigger issue is version requirements mismatches. Say your web browser
(or other must-have app) requires libwidget-1.0, but you're writing an app
that needs libwidget-2.0 or libwidget-0.8 or both (e.g. your latest supported
release was 0.8 and next release is 2.0). You can't install 2.0 or 0.8 because
it would break an app you need for daily work.

Using your distro package manager for development is a partial solution at
best.

~~~
twic
> Not every distro has -dev packages to install headers.

Really? I would be surprised if every distro didn't have a mechanism to
install the headers and static libraries for all of its dynamic library
packages. How would the distro's own build process work otherwise?

> But the bigger issue is version requirements mismatches.

This is the killer. Distros are based on the idea that there should be only
one version of any package installed at once. A development package manager
needs to be able to handle many.

This feels like it should be relatively straightforward to add to distro
package managers, though. You would need a package-space concept analogous to
a container - a consistent set of packages separate from the host's, and some
way to run a build process within it. So i can say:

      dnf create-pkgcontainer ~/dev/myproject
      dnf --pkgcontainer ~/dev/myproject install zlib-devel-1.2.8

Then inside ~/dev/myproject i have some sort of metadata listing what is
installed in that container - say, a bunch of symlinks to pkgconfig files, a
ld.so.conf and a ld.so.cache, etc. Then i can say:

      dnf pkgcontainer-run ~/dev/myproject make

And make will somehow pick up the environment inside the container, rather
than the global environment. That's easier said than done, of course, given
how many different independent parts of the toolchain there are, and how
intertwined but uncoordinated they are. You could use the sledgehammer of
constructing an actual container, or at least a chroot, but ideally you
wouldn't have to.

~~~
exDM69
Containers are a workaround and an overkill solution to this problem.

It's a pain in the ass when you need to enter a "blessed" environment to build
your project. By "blessed", I mean a container environment, a chroot or just
some environment variables that need to be set.

> dnf pkgcontainer-run ~/dev/myproject make

The issue with this kind of build is that it integrates poorly with your
$EDITOR (or IDE). When I run a build, I want the results piped back to my
editor so I can comfortably jump to the files with errors and warnings in
them.

And it's not only the build, it's also running your application, running the
debugger, static analysis, linters, etc. Requiring a special environment will
introduce friction to all of these.

Additionally, your suggestion of using "dnf" would require every developer in
the project to use an OS where "dnf" runs. Builds should be reproducible as
much as possible without depending on the OS.

Development package managers solve lots of these problems, although having a
package manager for every language is an issue.

------
entelechy
I'm working on [https://buckaroo.pm](https://buckaroo.pm), a package manager
that builds from source and focuses on reproducible builds.

To accelerate builds, we support caching and sharing artefacts.

Would love to hear your opinion on that.

~~~
pfultz2
Or rather, I would like to hear your opinion instead. Would buckaroo consider
using a standard package metadata similar to what I outlined?

I believe buckaroo uses integrated builds, so it's likely you would need to
supplement it with other information such as source files, but hopefully the
core information could be reused.

~~~
entelechy
Yes definitely!

The biggest time sink when porting a package was researching a project's
requirements and dependencies. To our surprise, only a handful of projects do
this correctly.

If we could agree on a standard that captures the dependencies and
requirements of a C++ project correctly, it would be a huge win for the
community.

Currently there is no build system that enables users to describe the
requirements of a project correctly.

We chose to support Buck as a build system mainly because it allows us to
describe the structure of a project nicely. However, like any other build
system, it does not capture the full essence of a C++ project.

For instance, there is no way to describe the set of compatible compiler and
linker flags.

Many build systems evolved to be Turing complete so we can write logic to
handle various combinations of options (e.g. no-rtti), but this makes it
really hard to reason about projects.

It turned out to be quite challenging to determine if two projects are
compatible with each other - in fact it turned out to be as challenging as
solving the halting problem. Is this really necessary?

I have now analyzed over 300 C++ projects and I still haven't seen one that
truly needed a Turing-complete build system to be built (not even LLVM &
Clang).

So yes - I'm convinced a metadata file as you suggested is the way to go.
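
Purely as a strawman (every field name below is invented for illustration; it
is not taken from the article or from Buckaroo), such a metadata file might
look like:

```yaml
# Hypothetical package metadata; all field names here are made up.
name: widget
version: 1.2.3
dependencies:
  zlib: ">=1.2"
  fmt: "^4.0"
requirements:
  cxx-standard: ">=14"
  rtti: optional        # can be built with or without -fno-rtti
  exceptions: required
```

The point is that a declarative file like this can be checked for
compatibility mechanically, which a Turing-complete build script cannot.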

~~~
pfultz2
> Currently there is no buildsystem that enables users to describe the
> requirements of an project correctly.

We want to steer away from standardizing a build system for now; the use
cases are too numerous to tackle. Instead, it would be simpler to standardize
the description of the build environment so the package manager can pass that
to the build system.

Hopefully, build systems will be updated to read the package metadata so they
can consume the dependencies correctly.

> So yes - I'm convinced a metadata file as you suggested is the way to go.

Good to know we are on the right track.

------
alexnewman
For source packages, yes. Deb/RPM are binary packages, so they are not
equivalent for devs. Rust demonstrated how valuable this is for native code.
I think they should just adopt Cargo.

------
gigatexal
Yes it does.

------
peter-m80
Yes

------
nurettin
C++ currently has a package manager: it is called git. Git has submodules for
dependency management, tags for release management and binary downloads, GPG
signing for authenticity, and online services for hosting.

~~~
adrianN
A package manager is not very useful if your build system doesn't integrate
with it.

~~~
stochastic_monk
I agree both that nurettin has a great point and that git is insufficient.

A recent failure: I use some projects which fail to compile with clang or more
recent gccs but happily compile with gcc4. Homebrew/linuxbrew don't let me
have multiple gccs. I use a server which supports the module system and covers
gccs from 4 through 7, so I don't have problems there, but I can't compile
some things at all on my laptop.

Similarly, CMake can work well, but it often breaks for me when I try to
build other people's projects, and is much harder to reverse engineer and fix
than a Makefile.

------
rootlocus
XKCD comes to mind: [https://xkcd.com/927/](https://xkcd.com/927/)

------
oytis
No, please don't. Let's not bring dependency mess to proper programming
languages. Not to mention that C++ binaries normally link to their
dependencies dynamically and can't control how those are built (and that's a
good thing).

~~~
cjwoodall
A "dependency mess" exists both with and without package management. A
package management system gives you a playbook to follow with tool support,
while "winging it" lets you suffer and clean up the mess in your own clever
way. The advantage, of course, is that that clever way CAN work better for
your team (for now).

~~~
oytis
Of course you _can_ create a dependency mess in a C++ project. Package
managers just make it a lot easier. My personal statistics (aka anecdotal
evidence) say that a typical Rust project has several dozen direct
dependencies, while a typical C/C++ project can make do with half a dozen.
And those will most probably be dynamic libraries, which encourages both
sides to care about API stability. Some libraries, openssl to name one, are
notorious for failing to do so, but at least it is expected.

~~~
cjwoodall
I also find that in C/C++ I tend to maintain my own library of useful bits
and pieces, whereas in another language I would use 3rd-party libraries.
There are pros and cons both ways, tbh.

Some of my pain is from the embedded side where most libraries I use need a
special set of flags and are compiled to static libraries or directly into the
binary.

~~~
oytis
I was also keeping the embedded case in mind. Yes, there you want to compile
everything statically and with compile flags of your choice. But you'll also
want to keep the number of dependencies as low as possible so that porting to
a new platform is feasible. In my opinion, git submodules / cmake subprojects
are more than enough for that.

~~~
cjwoodall
I will agree with that. I have investigated other options and always come
back to the same place. I am currently trying to unwedge us from
vendor-specific IDEs to cmake subprojects and submodules. At the moment we
have a monorepo, but it makes releases of subcomponents a bit sloppy.

------
revelation
A package manager for C++ isn't a thing that just downloads files and puts
them in predetermined places, maybe with a hip website. It needs to understand
ABIs, architectures, standard library runtime versions, symbols and all the
other things that make libraries in C++ such a pain.

Nobody needs another tool like CMake that just goes "whatever bro GL, here is
the compiler/linker command line variable" once you need to do something
beyond specifying input files.

(This is a big area where lots of improvement is possible. I want to kill
someone whenever a linker just spazzes out with "undefined reference". YOU ARE
THE MACHINE, GO THROUGH THE LIBRARY SEARCH PATH AND TELL ME WHICH ONE HAS THE
SYMBOL.)

