
Show HN: Buckaroo – A decentralized C++ package manager - entelechy
https://github.com/LoopPerfect/buckaroo
======
mempko
Uncool guys, you have hidden telemetry. I don't see anything in the docs about
this.

    
    
        https://github.com/LoopPerfect/buckaroo/blob/2714cd4c9e20235c4a090d21d0f60a0a9f17e82f/buckaroo-cli/Program.fs#L9-L13
        https://github.com/LoopPerfect/buckaroo/blob/master/buckaroo/Telemetry.fs
    

This should be opt-in, not opt-out, or the docs should clearly mention how to
turn it off.

~~~
entelechy
Thanks for pointing this out, we mention it now in multiple places:

[https://github.com/LoopPerfect/buckaroo/wiki/Installation#te...](https://github.com/LoopPerfect/buckaroo/wiki/Installation#telemetry)

[https://github.com/LoopPerfect/buckaroo/wiki/Telemetry](https://github.com/LoopPerfect/buckaroo/wiki/Telemetry)

We gather this data so we can improve Buckaroo and its ecosystem.

~~~
mempko
Nice quick update!

------
Firecracker
Counteracting some of the comments below:

My experience using buckaroo in a large project over the past year has been
extremely positive, and I'd say it addresses most of the problems raised here.
It's far and away better than anything else I've used.

~~~
DoctorOetker
I don't see any content in your comment.

Your experience (using buckaroo ...) addresses most of the problems raised
here? You could perhaps share at least how your experience addresses some of
these problems: how does it address the hidden telemetry, for example?

------
otabdeveloper2
I think Nix is the final boss of language-specific package managers and where
we'll eventually converge.

~~~
ferdek
I've made an account just to upvote this.

I've finished reading the Nix manual and have just started on the packaging
documentation. I want total control over dependencies, together with compiler
versions and standard library implementations (effectively cross-compiling
everything x86_64 -> x86_64). Of Buckaroo, Conan, and Nix, only Nix gets this
right...

~~~
adgasf
Buckaroo and Nix serve different needs. You can use Buckaroo in Nix
[https://github.com/LoopPerfect/buckaroo/wiki/FAQ#nix](https://github.com/LoopPerfect/buckaroo/wiki/FAQ#nix)

------
IshKebab
So many people have tried and failed to do this. I'm skeptical that these guys
have succeeded.

The main problem seems to be that because there is no standard C++ build
system, every library uses a different one. Wrangling all those together
without requiring the library authors to do anything is extremely difficult.

Hell, many C++ libraries still use autotools or hand-written Makefiles. I don't
really see how it is even possible to automatically download and build those
projects.

~~~
shin_lao
It's indeed going to be very, very hard to retrofit a package manager onto
C++.

You could structure something around CMake.

One immediate issue is managing compilation flags and platforms. You may need
custom flags, for example to compile against a specific architecture, and you
want those flags passed to all libraries uniformly.
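
A minimal sketch of what passing flags uniformly can look like with hand-rolled scripts (the flag values, paths, and dependency names here are hypothetical):

```shell
# Every dependency, whatever its build system, must see the same target
# flags, or the resulting objects may not link or behave consistently.
export CFLAGS="-march=x86-64 -O2"   # chosen once for the whole dependency tree
export CXXFLAGS="$CFLAGS"

# An autotools dependency and a CMake dependency would then be driven as:
#   (cd deps/alib && ./configure && make)                      # reads CFLAGS from env
#   cmake -S deps/blib -B build -DCMAKE_CXX_FLAGS="$CXXFLAGS"  # passed explicitly
echo "building everything with: $CXXFLAGS"
```

A package manager has to do the equivalent of this for every build system it supports, which is part of why retrofitting one is so hard.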

But the really horrible problem is the compatibility matrix.

In other words, say you are using alib v1 and blib v2: you need to check that
the two libraries compile, link, and work together nicely. So when you upgrade
one of the libraries, you need to check that compatibility again.

It's an important problem because if you're working on production software,
you need to be able to build all versions and maintain them. And maybe you
will have to upgrade only one library. You can't always be "current"
everywhere.

So very quickly, using a package manager becomes more painful than managing
the libraries manually with a couple of custom scripts.

I'll happily be proven wrong by a package manager (and we have tried a lot of
them).

~~~
dcuthbertson
I can't say I'd be a fan of a CMake-based package manager. While CMake has
been a great tool under Linux environments (I've used it for C/C++ builds, and
in mixed C & Go build environments), I've also experienced horrors when CMake
is used to generate Visual Studio solution and project files.

I mostly work in Windows development. CMake scripts created by a professional
build engineer caused a lot of havoc. They injected themselves into the
project files in such an insidious fashion that it was nearly impossible to
tell whether dependencies were correct. They completely broke IntelliSense,
making code navigation an exercise in frustration.

I was not happy, and wound up ripping out the scripts and creating new
solution and project files that actually supported the needs of the
developers.

(Edited to remove typos and improve grammar)

------
jdright
Let's see in the docs what is needed to use this:

> just one file, download here...

Later on:

> You'll need Buck on your system

Then, on Buck's site:

> Buck requirements: Java 8, Apache Ant, Python 2.7, ...

Well, that does not seem too honest; it would be best to warn upfront about
all these dependencies.

~~~
adgasf
You can also install Buck as a single file.

------
c-smile
I just want C/C++ to support an `include source` statement.

So in order to include some library I'll put

    
    
       #include source "libpng/png.c";
    

in my main.c file. And that png.c may look as:

    
    
       #include source "png-a.c"
       #include source "png-b.c"
       #ifdef PNG_FEATURE_C
         #include source "png-c.c"
       #endif
       ...
     

And that would be it.

The C/C++ preprocessor is enough for configuration.

~~~
MereInterest
Could you clarify what you mean by an include source statement? As you
describe its usage, it seems identical to the already-existing #include
statement, which does a textual inclusion of another file.

~~~
c-smile
#include "path.c"; inserts path.c into the _current_ compilation unit.

#include source "path.c"; compiles and includes path.c as a new compilation
unit (
[https://en.wikipedia.org/wiki/Single_Compilation_Unit](https://en.wikipedia.org/wiki/Single_Compilation_Unit)
) .

So if a-file.c and b-file.c both have

    
    
        static int foo = 42;
    

then these two files can be successfully included as

    
    
        #include source "a-file.c" 
        #include source "b-file.c"
    

But this:

    
    
        #include "a-file.c" 
        #include "b-file.c"
    

will produce the error "foo is already defined"

This small feature would allow 99% of existing libraries to be included this
way - without any makefiles, GUP, GIP, and the rest of the zoo.

Now I can put in Microsoft VC++ this

    
    
        #pragma comment(lib, "some.lib") 
    

to include the library into final binary.

#include source is just a generalization of the idea.

~~~
fwip
And if b-file.c says "static int foo = 43;" what is the desired behavior?

~~~
ghthor
I'd imagine a compiler error that static values, aka globals, must not differ
at compile time. This would likely apply to consts as well.

~~~
c-smile
"must not differ at compile time"

Not so: static declarations are local to their compilation unit, so you may
have as many different foo's as you have files in your project.

------
bradhe
Another C++ package manager?

------
DoctorOetker
Perhaps we don't need a specific package manager implementation so much as
standard(s) for package maintainers, so that different package manager
implementations can ingest compliant packages?

------
captan
How can I set it up?

------
TechHuntersio
This is awesome!

------
simfoo

        The Buckaroo workflow looks like this:
        
        # Create your project file
        $ buckaroo init
        
        # Install dependencies
        $ buckaroo add github.com/buckaroo-pm/boost-thread@branch=master
        
        # Run your code
        $ buck run :my-app
    

And yet another development tool that wants to hijack my workflow. No thanks.

Conan.io already does decentralization right and is well on its way to
becoming the de facto standard package manager for C++. And all of that
without constraining my workflows.

~~~
rienbdj
Conan requires a server, so it is not decentralised (maybe things have changed
since I last checked).

~~~
simfoo
It does not require a server. It supports arbitrary remotes (like git does) to
exchange packages, but it works without one as well (just publishing packages
to the local cache)

~~~
entelechy
How do I install a package from git directly? And can that package depend on
another package that lives in another git repository?

AFAIK this is not supported by Conan.

~~~
simfoo
It is supported. You create the packages using "conan create" from any place
with a recipe, which puts the packages into your local conan data directory
(called the cache).

Any other recipe can then depend on any package that is already in the cache.

~~~
adgasf
Does this not assume the package is already on your system?

~~~
simfoo
No, conan create builds a package from a recipe. Provided you cloned the
sources from a git remote, you can locally build the Conan package without
Conan ever hitting your network.

You can even just export any recipes into your local cache without manually
building the package and then let conan do the work when you require the
package as a dependency (using --build all)
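
As a concrete sketch of that flow (Conan 1.x-era commands; the repository URL and package reference are hypothetical, so treat the exact syntax as an assumption):

```shell
# Fetch sources and recipe over git -- no Conan server involved:
git clone https://github.com/example/somelib.git
cd somelib

# Build the package from the local recipe straight into the local cache:
conan create . somelib/1.0@demo/testing

# Or: register the recipe only, and have it built on demand when a
# dependent project resolves it:
conan export . somelib/1.0@demo/testing
conan install somelib/1.0@demo/testing --build missing
```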

~~~
adgasf
I didn't realize that a recipe was distinct from a package. In most systems
there is only one concept: packages.

So to see if I have this right:

With Conan, to use a package from GitHub, I must download the recipe and add
it to my local cache ("cache" seems like the wrong term here, maybe "registry"
would be better?) and then I can install it?

What about the transitive dependencies? Do I need to repeat the process for
each of those?

~~~
simfoo
There are no Conan packages on GitHub; usually you can find recipes alongside
(or separate from) the sources.

Yeah, the term conan "cache" may be a bit misleading, as it is not only a
cache but really your "local remote". As soon as a recipe is in your cache,
you can depend on it from your own recipes/conan builds. The actual building
of the package may happen during the registration of the recipe or during
resolve time when you depend on a certain package configuration. Any
transitive dependencies will be resolved and, in case of --build missing, the
packages will be built automatically according to the recipe.

So no, you do not need to build any dependency packages (including transitive
dependencies)

~~~
entelechy
But all those dependency packages will be resolved against a Conan server, not
against the origin, e.g. GitHub. Furthermore, I'd need to put every version of
the package into the Conan cache.

This is impractical with a complex dependency graph, as it would require the
user to solve a SAT problem just to determine which packages and versions to
put into the Conan cache to avoid pulling from the Conan server.

