
Bazel For Open-Source C/C++ Libraries Distribution - todsacerdoti
https://liuliu.me/eyes/bazel-for-libraries-distribution-an-open-source-library-author-perspective/
======
jopsen
> ccv has a simple autoconf based feature detection / configuration system.

Hahaha, there is nothing simple about autoconf :D

I suppose if you don't feel like you need to understand all the moving parts,
it's not horrific.

To be fair, autotools evolved in a different era, and to solve different
problems than Bazel.

~~~
liuliu
I think that I understand how most of the AC_* macros I use internally work.
Can't claim the same for the AX_* ones. I do write every line of configure.ac
myself, though :D

------
fenollp
If you want to upgrade your Bazel dependencies using pinned semver constraints
and a lockfile, I've made this [0] for you.

It supersedes http_archive and falls back to git_repository if needed. Just
run `bazel sync`. See it working at [1]. Note: there's an open (and somewhat
long-standing) issue WRT Bazel fetching from GitLab.

[0]:
[https://github.com/fenollp/bazel_upgradable](https://github.com/fenollp/bazel_upgradable)
[1]:
[https://github.com/voidstarHQ/voidstar/blob/master/WORKSPACE](https://github.com/voidstarHQ/voidstar/blob/master/WORKSPACE)
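
For context, the stock mechanism a tool like this automates is a hand-pinned `http_archive` in the WORKSPACE file. A minimal sketch; the repository name, URL, and hash below are illustrative placeholders, not taken from the linked projects:

```python
# WORKSPACE -- a manually pinned external dependency in vanilla Bazel.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_example_mylib",  # hypothetical dependency
    urls = ["https://example.com/mylib/archive/v1.2.3.tar.gz"],
    sha256 = "<sha256-of-the-archive>",  # placeholder; Bazel verifies the download against this
    strip_prefix = "mylib-1.2.3",
)
```

Bumping `v1.2.3` and the hash by hand for every dependency is the chore that pinned-constraint tooling takes over.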

------
squid_demon
Man, I still just use a simple batch file. These build systems (and
dependencies they are built to handle) are insane.

~~~
drainyard
A lot of the time I see these build systems try to solve a few issues:

1. Dependency handling

2. Single script for multiple platforms

3. Easy for a new user to just "run"

But I have yet to find a build system (for C/C++) that solves all these issues
better than simply having a few scripts.

I usually just use a build.bat, linux_build.sh and osx_build.sh, and I have
never had problems.

A new user can just download those scripts, and as long as they haven't
tampered with file/library locations it just works.

If people like their build systems, fine by me, but I just think they are
unnecessarily complex a lot of the time. Granted, I don't have a huge project
with hundreds of dependencies, but even then, those dependencies don't come
all at once.
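
As a concrete sketch of the approach described above, a per-platform script can stay very small. The flags, the `src/` layout, and the output name here are illustrative (not from the comment), and the compile line is printed rather than executed:

```shell
#!/bin/sh
# linux_build.sh / osx_build.sh -- one small script per platform (sketch).
set -e

case "$(uname -s)" in
    Darwin) CC=clang ; EXTRA_LIBS="" ;;     # macOS
    *)      CC=cc    ; EXTRA_LIBS="-lm" ;;  # Linux and friends
esac

CMD="$CC -O2 -Isrc -o app src/*.c $EXTRA_LIBS"
echo "$CMD"       # dry run: show the compile line
# eval "$CMD"     # uncomment to actually build
```

The platform switch is the whole "build system": everything else is just the compiler invocation.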

~~~
bluGill
Problem is, I do tamper with those. I often cross-compile software (my day job
is embedded Linux). Different distributions like to put things in different
locations for their own reasons. And I have a multi-core computer; I'd like to
use more than one core so my build goes faster.

As a developer I don't want to have to worry about how to make all of the
above work. I learned CMake so I don't have to worry; now that I know it (the
learning curve is real), almost all my issues go away because it just works
with all the weird stuff people do.

------
outsomnia
It sounds more like the problem is autoconf.

Replacing autoconf with CMake or a similar meta-builder will provide a lot
more flexibility to build on different platforms, including, e.g., Windows,
and to cross-build.

If it's easier to build on any platform, you don't need to get into the
custom-packager business and can just focus on your library.

~~~
wirrbel
I did my fair share of setting up C and C++ builds in the past and I can tell
you: the difference between CMake and Autoconf is marginal. CMake is of
course a bit more versatile in terms of target build systems, and its DSL is
not as obscure as the autotools toolchain, but it's no meaningful difference.

What Bazel (or Buck) brings to the table is slightly different: better
control of inter-module dependencies and external dependencies. In a way they
are not only build tools but opinionated and effective 'linkers', bundling
together your applications and libraries in a way that is hard to mess up
and very reproducible.

The other option would be the nix/guix route, which I think might be better
suited for the open-source model but requires you to adapt at least the
package managers.

~~~
Teknoman117
The meaningful difference for me is that I work in a place where most of the
developers love their IDEs. CMake is literally the only cross-platform build
system with first-class IDE support on both Windows and Linux. If you use VS
Code on macOS, you get it there as well.

~~~
JamesSwift
After digging into Bazel a bit recently, I really hope they invest significant
energy into the IDE story next. That is a huge, gaping hole that is not
really being solved in any of the language/IDE ecosystems and is a _massive_
roadblock to adoption.

IDE integration would remove 90% of my hesitation to push it at my company.

~~~
vlovich123
It’s not much work. Internally at Google they have IntelliJ IDEA integration.
The challenge with open-sourcing it may be that the plugin is also responsible
for Google3 integration, if I recall correctly.

~~~
JamesSwift
"Not much work" is absolutely not the feeling I have about it, based on some
conference talks on the topic (e.g. the various Xcode talks, or the Wix talks
on IntelliJ) and my own thought experiments on getting it tied into Visual
Studio / Visual Studio for Mac.

I think the Bazel team introduced Aspects and punted the rest to a later
date. Now they hope each language/IDE community will take that infrastructure
and build out the rest of the ecosystem. I think it's too important a piece
of the puzzle to leave to the community.

------
pjmlp
As far as Windows is concerned, CMake + vcpkg is getting pretty sweet; now
even binary dependencies are finally properly available, catching up with
Conan.
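
For reference, a minimal sketch of the CMake side of that pairing; the `fmt` package and project name are illustrative, not from the comment:

```cmake
# CMakeLists.txt -- consuming a vcpkg-installed library (illustrative).
cmake_minimum_required(VERSION 3.15)
project(demo CXX)

# vcpkg hooks in through its toolchain file, e.g.:
#   cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
find_package(fmt CONFIG REQUIRED)   # installed beforehand via: vcpkg install fmt

add_executable(demo main.cpp)
target_link_libraries(demo PRIVATE fmt::fmt)
```

The toolchain file is the whole integration point: CMake itself never learns that vcpkg exists.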

~~~
samsaga2
Meanwhile, I just finished three days of work removing vcpkg.

Windows + vcpkg + MSVC works perfectly. Windows + vcpkg + clang is a
nightmare of linking problems.

~~~
pjmlp
Clang, and most likely not the version packaged by Microsoft with Visual
Studio: that is the problem right there.

~~~
flohofwoe
I didn't encounter any problems with vanilla clang on Windows for building
projects (installed with "scoop install llvm"); in fact, I was surprised that
it worked out of the box, even for code that extensively uses Windows APIs. I
suspect that it requires an MSVC toolchain installation, or at least the
Windows SDK, but since everything "just worked" I didn't dive too deeply
into what's actually happening under the hood.

------
andrewshadura
Bazel is terrible in terms of bootstrapping: its dependency tree is huge,
and it's a great pain for distributions to package.

~~~
steeve
Had to build Bazel from scratch; you need:

- a previous version of Bazel

- GCC/Clang

- a JDK

- Python

- zip and unzip

Not sure I'd call that "huge". Perhaps the JDK?

~~~
de_watcher
He was talking about bootstrapping. But anyway, JDK and Python? Really?

~~~
laurentlb
Bazel is written in C++ and Java. The compile.sh (and the root of the
repository) uses bash and probably a few other binaries. What are the issues
exactly? Please file a bug if you have trouble bootstrapping Bazel (it's not
something most users need, so the process might not be perfect).

------
steeve
I'm glad to see this. What people first and foremost forget is that Bazel is
a package manager/overlay in itself (and a very good one, too), which allows
for very complex (and yet safe and reproducible) builds.

The learning curve is steep, but the payoff is so worth it.

~~~
jnxx
My impression is that you can end up with your requirements spread over half
of the Internet, with very little control over which code runs during your
build.

~~~
klodolph
Depends on how you do it. It's designed first and foremost to work with
vendored dependencies in a monorepo, and the external repository support is
newer.

Bazel won't pull in transitive dependencies unless you ask for them in the
WORKSPACE file, which means you can use a private mirror for all your
dependencies if you like; that is fairly easy in practice (I do it for
personal projects).
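
A sketch of that point: every external repo is an explicit WORKSPACE entry, so each one can point at a mirror, and nothing is fetched that isn't declared. The mirror URL, version, and hash below are illustrative placeholders:

```python
# WORKSPACE -- explicit external repo, served from a private mirror.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_github_jackc_pgx",
    urls = [
        "https://mirror.internal.example/jackc/pgx/v4.0.0.tar.gz",  # private mirror first
        "https://github.com/jackc/pgx/archive/v4.0.0.tar.gz",       # upstream fallback
    ],
    sha256 = "<sha256-of-the-archive>",  # placeholder; pins the exact archive contents
    strip_prefix = "pgx-4.0.0",
)
# Transitive deps (e.g. logrus) only enter the build if some BUILD target needs them.
```

Pointing `urls` at the mirror first is what makes the fully mirrored setup cheap.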

The experience will vary depending on your preferences and the languages you
use. With Go + Bazel, my experience is that the Bazel version will have
_fewer_ dependencies than the equivalent "go mod" version, because "go mod"
will pull in dependencies more coarsely and Bazel has more fine-grained
control. Go mod will pull in dependencies for the entire repo that you depend
on, but with Bazel, you only need dependencies for the individual subpackages.
As a specific example... suppose you use github.com/jackc/pgx. With "go mod"
you end up with "github.com/sirupsen/logrus" in your transitive dependencies,
but with Bazel you don't, unless you use the part of pgx that requires logrus.

You can also end up with "half the internet" in your dependencies pretty
easily, but I think these days that is just the price of including random
libraries in your project.

As far as I can tell, the approach to avoid depending on half the internet is
to be very conservative about your dependencies, but that is true regardless
of the system you use to manage them. Just to pick on the JavaScript
ecosystem, if you start a new JavaScript project and pull in TypeScript,
Rollup, and Terser, you'll end up with a very small lock file because each of
these libraries has a very small number of carefully chosen dependencies.

~~~
jnxx
> It's designed first and foremost to work with vendored dependencies in a
> monorepo

So how is that suitable for distributed open source projects? I understand
this is how Google works, but open source?

------
doonesbury
I really wanted Bazel to work for Go (not even C++). I gave up. It's got fine
goals, really nice ones, but the docs suck. We get stuck in crummy alleyways
between the corners of Bazel that do work well.

------
de_watcher
Just let the distribution maintainers do their job. That's how you get a
quality, user-centered software distribution.

~~~
thundergolfer
It's probably because I'm not very familiar with C/C++ library distribution,
but what exactly does this mean, in the context of this post?

> Just let the distributions maintainers do their job.

Is this sentiment supporting Bazel usage or recommending against it?

~~~
turbinerneiter
In the old-school Linux distro world, dependency management is handled by the
package manager. This way has benefits and drawbacks:

* there is a specific version of the library associated with the OS that the developer can develop against - if the OS updates, the developer also has to update their apps - so they don't leave old bugs and vulns open

* there is a specific version of the library associated with the OS that the developer has to develop against - if the OS updates, the developer also has to update their apps - which can sometimes be very annoying if the API breaks

There is more to it, but this is the thing that constantly has me changing my
opinion on the matter.

