
The problem with these build tools is never the tool itself (though admittedly I never get far enough to test them). It's the lack of packages in the repositories behind them. For build2, the central repository is https://cppget.org/ which has 24 packages. 24!! For comparison, npm has more than 600,000 and PyPI has more than 140,000. Of course not all of those are great quality, but it does illustrate the scale of the problem.

More concretely: as a go-to example for these repos I always look for Google's protobuf library. Most projects I work on depend on three or four high-level libraries which, in turn, depend on protobuf; if even protobuf isn't there (as is the case at cppget.org), there's little hope the higher-level libraries will be.

So far, the best native package manager I've come across is vcpkg [1]. I hate that it's made by Microsoft, and I hate that it uses CMake, but there is no easier way to build and use e.g. gRPC on Windows. They recently added Linux and Mac support (static linking only). Edit: for a fair comparison, it has just over 700 packages.

[1] https://github.com/Microsoft/vcpkg
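
To give a flavor of why I rate it highest: getting gRPC on Windows is roughly this (from memory, so treat it as a sketch rather than exact commands):

    git clone https://github.com/Microsoft/vcpkg
    cd vcpkg
    .\bootstrap-vcpkg.bat
    .\vcpkg install grpc:x64-windows
    .\vcpkg integrate install

After that, pointing CMake at scripts/buildsystems/vcpkg.cmake as the toolchain file makes find_package() resolve against what vcpkg built.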



I don't believe we can get to thousands of packages by packaging a substantial part ourselves -- we will just get bogged down in this instead of building a decent toolchain. Instead, the "secret" I believe is to make creating, testing, and publishing C/C++ packages as easy and frictionless as possible. This is how you get to thousands of packages and this is what we are trying to do.


This is exactly what Linux distributions do. Big ones with lots of contributors package around 10k libraries themselves. Small one-man shows package at least a few hundred common C/C++ libraries.

This is also why I'm not terribly interested in these language-ecosystem package managers. Good C libraries, such as I use in my own C projects, keep a stable backwards-compatible interface for years.


Any sufficiently large software project can't rely on the distribution to provide anything for them. The macOS and Windows application distribution models are much saner in that regard. The goal should be a self-contained, easily reproducible build with minimal external system dependencies. This is why build systems like Bazel are so great: they force you to be very explicit about what you depend on and give you a stable and reliable way to reproduce your build once you've specified that.
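
To make "explicit" concrete: in Bazel nothing comes from the system unless you declare it, and every target lists its dependencies in a BUILD file. A minimal sketch (target names made up):

    # BUILD -- every dependency is spelled out; nothing is picked up from the system
    cc_library(
        name = "net",
        srcs = ["net.cc"],
        hdrs = ["net.h"],
        deps = ["@com_google_protobuf//:protobuf"],  # external dep, pinned in the WORKSPACE
    )

    cc_binary(
        name = "server",
        srcs = ["main.cc"],
        deps = [":net"],  # only what's declared here is visible to this target's build
    )

That's also what makes the builds reproducible: the WORKSPACE pins exactly where @com_google_protobuf comes from.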


> Any sufficiently large software project can't rely on the distribution to provide anything for them.

This is factually wrong. I'm able to compile things as complicated as clang/llvm, firefox, and LibreOffice relying mostly on dependencies installed through the distribution's package manager.


It is not. Just take a look at Chromium's third_party directory, where it tracks all of its third-party dependencies:

https://github.com/chromium/chromium/tree/master/third_party

The main point of their build system, gn (similar to Bazel), is to have explicit, distribution-independent control of all dependencies (typically pinned to specific versions).

LLVM likewise has basically no system dependencies; instead it implements a lot of platform abstractions within the repository (https://github.com/llvm-mirror/llvm/tree/master/lib/Support). The remaining dependencies are mostly build-time dependencies such as Python 2.7 and CMake.

Same story for Cryengine (https://github.com/CRYTEK/CRYENGINE/tree/release/Code/Libs) and Unreal Engine (https://github.com/EpicGames/UnrealEngine/tree/release/Engin...).

Linux distributions simply duplicate a lot of work because they are based on a flawed and outdated distribution model -- outdated because the main argument for shared libraries disappears with > 1 TB commodity hard drives. The approach taken by the projects above is also the only way to do cross-platform development, simply because Windows, macOS, and consoles have no concept of an operating-system package manager. (If you are using Homebrew for development on a Mac, you will run into a lot of trouble sooner or later.)


Clang, Firefox and LibreOffice are all likely to be in distro packages themselves, so their build dependencies have to be there as well. Building something of similar complexity that isn't already packaged would be a fairer test.


The distro package manager is there to provide your day-to-day apps and the libraries to support them.

Sooner or later you will run into an issue where your development work depends on a version of a library that breaks an app you rely on.

Or you will need two versions of a library for your previous and next releases, both of which need maintenance.

Nor do they provide reproducible builds and other features that are needed for serious work.

Distro package managers are not suited for development and dependency management. They might work most of the time, but that is not good enough.


The distro package manager is there to offer a menu of compatible known-good dependencies that you can readily adopt. It's a ton of work you don't need to do. If instead you bundle random stuff committed last month, you should have a compelling reason for beginning to maintain a half-baked distro all by yourself.


Which is all fine and dandy if one only cares about Linux distributions, but the world is a bit bigger than that.


I leave this feedback so that hopefully it may be useful to you: when I was investigating rewriting my build infra a while ago, I looked at build2 and didn't use it for this very reason. I know that I'm one person, but the fact that the GP is the #1 comment tells me that other people had similar thoughts as well. Something that might help kickstart this is getting a few of the big packages up there: OpenSSL, libprotobuf, OpenCV, Google Test, etc. If those really big ones were there, that would draw people to add in the other packages that go with them, thereby expanding the available packages. But right now, since there's nothing, you would have to do everything by yourself in a new system, which is daunting for anyone. (Also, by maintaining those packages, you learn what the pains of maintaining packages are, so as you "get bogged down" you can solve those problems, making it easier for others to maintain their own packages.)


Yes, all valid/good points, thanks. We are packaging external libraries that we need ourselves (libmysqlclient was "fun" -- it is actually a C library with C++ implementation details). And getting some "critical mass" of packages done by us is also something we are considering.


Is it at least able to consume CMake projects so we can mix and match until more packages are available?



Not only are there only 24 packages, but 8 of them are related to build2 itself. cppget has been around since 2016, and at this point I feel it's safe to say it's dead in the water.

The problem is that there isn't a nice way in C++ to separate modules into discrete, importable components the same way that you can in Python, JS, and others. This leads to fractured imports and strange build processes.


Single-file header-only libs? https://en.wikipedia.org/wiki/Header-only

Maybe Clang or GCC could be modified to output these header-only libs. The other option is to use the package-specific compilation tooling but generate WASM, then use WASM as the module system.
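
The header-only pattern already works without compiler changes, by the way -- the stb-style convention is one header that is declarations by default and becomes the implementation in exactly one translation unit. A minimal sketch (names made up):

    /* mini.h -- single-file, header-only library, stb style */
    #ifndef MINI_H
    #define MINI_H

    int mini_add(int a, int b);   /* declaration only; safe to include anywhere */

    #endif /* MINI_H */

    #ifdef MINI_IMPLEMENTATION
    int mini_add(int a, int b) { return a + b; }   /* emitted in exactly one TU */
    #endif

One .c/.cpp file in the consuming project does #define MINI_IMPLEMENTATION before the #include; every other file just includes it. Zero build-system integration, which is exactly why the pattern caught on.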


Having a large body of existing packages would be nice, but if this can be used with a private repository, it'd be hugely valuable to me even if it had no open source packages available at all. I work in a corporate context where projects frequently have multiple complex internal proprietary dependencies. The hacks and workarounds I've seen just to get a sane build up and running in this context are unbelievable. After having used Cargo for some personal projects, I yearn for a similar tool in the C/C++ landscape.


> if this can be used with a private repository

Sure, you can run both archive-based and version control-based (git) private repositories without any restrictions.
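
For example, a private git repository can be listed as a prerequisite in a project's repositories.manifest, along these lines (URL and branch are hypothetical placeholders):

    : 1
    role: prerequisite
    location: https://git.example.com/private/libhello.git#master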


Right, I see that in the docs, which is awesome!

My comment was more responding to the claim that this wouldn't be a useful tool without a large body of open-source packages, which I disagree with.


Actually, vcpkg is designed for this. It is a package manager for your own packages. You have to maintain vcpkg as a submodule.
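
Concretely, you pin a vcpkg revision inside the project and build against it; something like this (paths are just an example):

    git submodule add https://github.com/Microsoft/vcpkg external/vcpkg
    ./external/vcpkg/bootstrap-vcpkg.sh
    ./external/vcpkg/vcpkg install protobuf

Everyone who clones the project gets the same package-manager revision, and your own ports can live in the same tree.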


My best experience with a build tool has been FB's Buck (buckbuild.com), half for it not limiting you to just C/C++, and half for buckaroo (buckaroo.pm) making installs of common packages--like protobuf--easy.

I find that the second limiting factor I hit is being able to easily interop with other languages and to cross-compile for other platforms. Buck makes that super easy, too.
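
For anyone who hasn't seen it, a Buck C++ target looks something like this (names made up; buckaroo-fetched packages end up referenced the same way through deps):

    # BUCK
    cxx_library(
        name = "mylib",
        srcs = ["mylib.cpp"],
        exported_headers = ["mylib.h"],
        visibility = ["PUBLIC"],
    )

    cxx_binary(
        name = "app",
        srcs = ["main.cpp"],
        deps = [":mylib"],
    )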


Interesting; I haven't used this tool myself. I want to say that I've heard of it before, but I'm not sure if I'm confused on that point (or if I just really want to have heard of it before).

Looking over the docs, it seems like a rather all-encompassing build tool. I might have to try it out for the one project that I'm still paying attention to, which is C++-based and uses Hunter[0]. I see a lot of references/examples talking about using this for Java projects, but I haven't come across anything related to C/C++ (other than the cxx_* calls, which appear centered around bringing in a native dependency). Any suggestions for a good quickstart project out in the wild or a larger open-source codebase (C/C++) that uses it?

[0] Or its less official, but more commonly used name, "f!cking hunter", followed by loud, anger-ish sounding noises.


npm had 24 packages at some point too :) Rome wasn't built in a day.


Not to mention that in the world of npm, a package is often just a single function (like leftpad).


Yeah, I can't see many C/C++ types swallowing that. The cost of writing a single function tends to be much lower than the cost of maintaining an external dep, as leftpad proved.


That's probably true for JS too (the leftpad debacle is a good example), but so many people just glue together the magic library code without realising how simple some of the functions they are pulling in are.
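
For comparison, a C++ left-pad is about this much code (a rough sketch):

    #include <cstddef>
    #include <string>

    // Pad s on the left with `fill` until it is at least `width` characters long.
    std::string left_pad(const std::string& s, std::size_t width, char fill = ' ') {
        return s.size() >= width ? s : std::string(width - s.size(), fill) + s;
    }

Hard to justify an external dependency, with all its supply-chain risk, for that.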


I upvoted you. But what is wrong with it being made by Microsoft when the license is great?

I agree about the CMake part, though. CMake is an abomination. CMake is diabolical. But it is not vcpkg's problem; it is a C++ community problem. They chose CMake because it was the de facto build system for most packages.


--- While your point is completely fair in that 24 packages won't make a very useful service, it's also fair to say this is a new tool and it's OK to give it some time. --- (EDIT: for the sake of transparency, I'm crossing out this initial paragraph since, after reading some other comments, I've discovered that cppget isn't all that new; the rest of my comment, however, is relevant without that statement)

I'll poke a little at your assertion that it's "never the build tool itself". Unfortunately, in my experience, it often is. Have a nice fight with Hunter in a project or two. It's got a decent, though still somewhat small, number of packages[0], but that doesn't make it useful. Every time I reload my PC and clone the two repos that rely on it, I invariably spend a day sorting out something or another (last time it was signature related). It doesn't help that the tool is tied to CMake; these are the only two repos I use that rely on CMake, and I have actively avoided learning enough about it to get by without several Google searches when I need to do somewhat simple things. But that's the rub: Cargo is simple, and even npm (by comparison) is simple, even for projects with great complexity[1]. Hunter, by contrast, has me asking every single time I have to update a dependency or install fresh from a clone ... what problem is this solving for me? How is this less manual than managing dependencies with a clever zsh script[2]?

As far as vcpkg is concerned, I agree, though I'm a Microsoft fan[3]. It generally works -- and unlike Hunter, which also uses CMake, I haven't run into the kind of hair-pulling, four-letter-word-spewing fits using it ... at least not to the same extent.

To the authors: Really, though, please please keep investing in making better dependency management tooling. If it's good enough, there are enough of us that will make "unofficial packages" because it takes less time to figure that out than it does to figure out how to fix the broken dependency management system we're stuck with.

[0] https://docs.hunter.sh/en/latest/packages/all.html

[1] I'm certainly not saying it's without its pointy corners, but I'll take those sharp edges any day over some of the C++ build tools I've played with.

[2] Which is the only "build dependency management" system that's worked for me -- I literally "lynx --dump" the releases page, parse for URLs, download and script out each build dependency manually ... I even have a set of functions that manage to do some of this in a relatively well abstracted manner given the circumstances. Granted, these are single platform target apps, but so are the handful I have that use vcpkg.

[3] Well... to be fair, my one Windows 10 install is in a virtual machine on an openSUSE Tumbleweed installation, so maybe I'm not a fanboy, but with their movement towards open-sourcing a lot of their tooling and frameworks, there's not a lot I can complain about, personally.


I don't quite get what issue you had with Hunter, though if you actively try not to learn the standard tool, CMake, then you're gonna have problems... It works flawlessly for me. I literally clone a repo, give it the corresponding toolchain file, and it builds everything from the ground up on any system, instantly. I really feel the C++ dependency issue has been resolved conclusively.
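
For reference, the whole flow inside CMakeLists.txt is roughly this (URL/SHA1 are placeholders for whichever Hunter release you pin; GTest is just an example package):

    cmake_minimum_required(VERSION 3.10)
    include("cmake/HunterGate.cmake")
    HunterGate(
        URL  "https://github.com/cpp-pm/hunter/archive/vX.Y.Z.tar.gz"  # pinned release (placeholder)
        SHA1 "<sha1 of that archive>"                                  # placeholder
    )
    project(demo)

    hunter_add_package(GTest)            # Hunter fetches and builds it from source
    find_package(GTest CONFIG REQUIRED)  # from here on it's an ordinary imported target

    add_executable(demo_tests test.cpp)
    target_link_libraries(demo_tests GTest::gtest)  # exact target name can vary by version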

I just wish Hunter were part of CMake by default, b/c CMake should be able to load other CMake projects and resolve dependencies. It has all the information internally to make this really easy -- i.e. targets, interfaces, and dependencies. Handling this from outside of CMake is just a much harder problem, and you end up having to restate a lot of what's already in the CMakeLists.txt.

ExternalProject_Add() is a giant footgun that should have never been part of CMake - I wrote a little description of why here: https://geokon-gh.github.io/hunterintro.html

There are two outstanding issues:

- if you need to link to proprietary libs, then the rebuild-everything-from-scratch strategy is deficient - but then things are going to get messy anyway...

- if libA links to libBv1 and libC links to libBv2, then there is no clean way to link libA, libB, and libC into one application. Usually libraries aren't too picky about which version you link against - but there is always a chance things will blow up in your face. This is probably going to be solved with Flatpak runtimes or something like that.


...And your point is fair - a lot of the difficulty for me comes from fighting with CMake. Because I don't support a large number of applications that rely on it, I never retain enough knowledge of it, between the times I end up having to use it, to be particularly strong at working with it.

I remember one of the most frustrating incidents I had with Hunter was, indeed, a CMake issue of my own (well, kind of) causing. On a one-off machine I was running a build on, I was getting signature failures every time Hunter went to download code. I checked everything (and, of course, there were signature-related things that looked like they might be a problem but probably weren't). It turned out that the cmake binary on this host was not compiled with SSL support. I guess the folks who wrote error messages for earlier C++ compilers also wrote the error messages for Hunter/CMake, because at no point did any error offer a hint that the binary lacked SSL, and I didn't find it until I decided to download the source code for the latest CMake (because the packaging system I was using didn't have the latest). I can't remember the complete specifics (i.e. whether Hunter/CMake relied on cURL and it was libcurl/curl itself lacking SSL support), but I happened upon the solution by accident while reviewing the compiler flags of the installed version to figure out what to use to build the latest one: I realized that the URL to the dependency was https, and there was no way that was going to work without the --enable-whatever flag required to support TLS :).

So yes, it's possible I am or have been blaming Hunter when I should be blaming CMake -- or, more accurately, my lack of knowledge in using both tools. At the same time -- and I know this is a really unfair comparison -- I support very few things in Python or Node; however, the grief I encounter when I return to that tooling is minimal. It requires a lot less of me to get things working, even in complex applications.



