
So you want to write a package manager - sdboyer
https://medium.com/@sdboyer/so-you-want-to-write-a-package-manager-4ae9c17d9527#.740o43vxi
======
davexunit
This article describes the status quo of package managers: lock files,
dependency resolution algorithms, etc. However, these are obsolete. The
functional package management paradigm approaches the problem from a different
angle, one which can _precisely_ describe the full dependency graph for a
piece of software, all the way down to the bootstrap binaries (the C compiler's
compiler, etc.). In this system there is no dependency resolution, because
each package is explicit in what it depends on. A dependency is much more than
a version number. The novel idea in functional package management is that a
package can be viewed as a pure function whose arguments are the dependencies,
the source code, the build script, etc. and whose output is the built binary.
By viewing packaging from this perspective, functional package managers
overcome the problems that plague traditional package managers, allowing for
unprivileged package management, transactional upgrades and roll backs,
multiple variants of a package coexisting without conflict, etc. One such
implementation of functional package management, the one I hack on and
recommend checking out, is GNU Guix.

[https://gnu.org/software/guix/manual/html_node/Introduction....](https://gnu.org/software/guix/manual/html_node/Introduction.html#Introduction)
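The "package as a pure function" idea can be sketched concretely: hash every input (source, build script, dependencies) and use the hash as the package's unique store path, so any change to any input yields a distinct, conflict-free installation. A minimal Python sketch of the concept only, not Guix's actual implementation (Guix uses Guile Scheme and hashes real derivations, not these toy strings):

```python
import hashlib

def store_path(name, version, source, build_script, dependencies):
    """Derive a content-addressed path for a build.

    Any change to any input -- source, build script, or any
    (transitive) dependency path -- yields a different hash and hence
    a different path, so variants coexist without conflict.
    """
    h = hashlib.sha256()
    h.update(("%s-%s" % (name, version)).encode())
    h.update(source)                          # source bytes
    h.update(build_script.encode())
    for dep in sorted(dependencies):          # deps are themselves store paths
        h.update(dep.encode())
    return "/store/%s-%s-%s" % (h.hexdigest()[:16], name, version)

libc = store_path("libc", "2.23", b"<libc source>", "./configure && make", [])
zlib_a = store_path("zlib", "1.2.8", b"<zlib source>", "make", [libc])
zlib_b = store_path("zlib", "1.2.8", b"<zlib source>", "make CFLAGS=-O3", [libc])
assert zlib_a != zlib_b   # different build script => different package
```

Because `zlib_a` and `zlib_b` land at distinct paths, both variants can be installed side by side, which is where the no-conflict and rollback properties come from.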

~~~
tgamblin
> The novel idea in functional package management is that a package can be
> viewed as a pure function whose arguments are the dependencies, the source
> code, the build script, etc. and whose output is the built binary.

And this is an awesome idea. Cryptographically hashing your install and
keeping the install spec around for later is very powerful.

> Lock files, dependency resolution algorithms, etc. However, these are
> obsolete.

These are still very important (and hard) problems. Claiming that they’re
obsolete ignores what they’re used for.

> The functional package management paradigm approaches the problem from a
> different angle which is able to precisely describe the full dependency
> graph for a piece of software, all the way down to the bootstrap binaries (the C
> compiler's compiler, etc.).

I suppose this is fine from the perspective of a system package manager. But
at the application development level, there are not very many users who want
to specify all that.

If you want to play around with different software stacks in Guix and Nix, you
have to actually write packages, which involves tweaking a lot of files. There
aren’t so many people with the intestinal fortitude to get down to that level.
I want a system where I can say “try running with a different
version/build/configuration of this particular dependency”, and then the PM
figures out what _else_ needs to be done to accommodate that. Users don’t want
to tweak package files a lot, and even less do they want to manage the
profusion of version/configuration-specific package files that result from
doing such experimentation.

We’ve developed Spack as kind of a compromise between these extremes. Spack
[1] has builds parameterized by version, compiler, build variants, compiler
flags (soon), etc. and it attempts to let the _user_ experiment with a large
combinatorial space of packages. You can do that with a command-line syntax
(without editing package files), and you can specify tweaks down to the
dependencies with a recursive syntax [2].
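To give a flavor of that kind of command-line spec (a toy sketch only; Spack's real grammar, documented in the linked slides, is richer): `@` names a version, `%` a compiler, and `^` opens a nested dependency spec.

```python
def parse_spec(spec):
    """Toy parser for a Spack-like spec string.

    '@' introduces a version, '%' a compiler, '^' a dependency spec.
    After a '^', modifiers attach to that dependency -- the recursive
    flavor. Not Spack's actual grammar, just an illustration.
    """
    root = {"name": None, "compiler": None, "deps": []}
    target = root
    for tok in spec.split():
        if tok.startswith("^"):               # start a nested dependency spec
            dep = {"name": None, "compiler": None, "deps": []}
            root["deps"].append(dep)
            target = dep
            tok = tok[1:]
        if tok.startswith("%"):
            target["compiler"] = tok[1:]
        else:
            name, _, version = tok.partition("@")
            target["name"] = name
            target["version"] = version or "any"
    return root

spec = parse_spec("numpy@1.10 %intel@15.0.1 ^openblas@0.2.15")
```

Here `spec` records numpy 1.10 built with the Intel compiler against openblas 0.2.15, all from one command line and no edited package files.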

I support code teams at LLNL who want to experiment with many different
compiler flags, dependency versions, etc., across Blue Gene/Q, Cray, Linux,
GPU, and Xeon Phi machines, and none of them want to specify everything quite
as rigorously as Nix/Guix demand. What we want is really good build
parameterization and a concise way to specify it. I don’t see many Guix/Nix
builds written that way — they’re all tied to specific versions and you’d need
a new package file to support a new version.

> functional package managers overcome the problems that plague traditional
> package managers, allowing for unprivileged package management,
> transactional upgrades and roll backs, multiple variants of a package
> coexisting without conflict, etc.

That’s one way to look at it. The other is that these systems _ignore_ these
problems. What you’re really offering is a really good reproducible build
system, at the price of rigorous specification. Reproducibility is a big deal,
and there are great reasons to do this, but you can’t say the other systems
are “obsolete” when they have very different goals.

What Nix/Guix are _not_ doing is reducing the burden of specification. You
write down all the details of a very specific, reproducible software stack,
but you do not make that software much more composable or extensible than it
already was. The user can install the version combination that you packaged,
but can they easily try their own? Can they easily try to build with a
different compiler/compiler version/set of compiler flags/dependency
version/etc.?

npm, pip, etc. hide a lot of details from the user, and that is why people
likely continue to use them (i.e., they are not “obsolete”). Constraint
solving, etc., are still necessary to hide a lot of the complexity that users
don’t want to deal with. Nix and Guix are great for reproducing a snapshot,
but app devs want to explore the build space more than that.

Spack attempts to find a happy medium. We cryptographically hash builds, but
we also let the user build things that the original package author may not
have tried yet, _without_ modifying the package. That _does_ require all the
constraint solving nastiness, but it doesn’t kill reproducibility. Spack
stores the generated build provenance _after_ doing some constraint solving
[3], but the _tool_ helps fill in the missing details of the dependency graph,
not the human. Nix and Guix, AFAIK, do not do that. Spack isn’t fully
reproducible because it’s not a full system package manager and it doesn’t use
chroot, but you _can_ try the same build again using the provenance that was
generated when all the solving was done. The tool just helped the user along
the way, and I think that’s still a very useful (not obsolete) thing.

[1] [https://www.computer.org/csdl/proceedings/sc/2015/3723/00/28...](https://www.computer.org/csdl/proceedings/sc/2015/3723/00/2807623.pdf)

[2] Slides 9-11: [https://tgamblin.github.io/files/Gamblin-Spack-SC15-Talk.pdf](https://tgamblin.github.io/files/Gamblin-Spack-SC15-Talk.pdf)

[3] Slide 14: [https://tgamblin.github.io/files/Gamblin-Spack-SC15-Talk.pdf](https://tgamblin.github.io/files/Gamblin-Spack-SC15-Talk.pdf)

~~~
rekado
> What Nix/Guix are not doing is reducing the burden of specification. You
> write down all the details of a very specific, reproducible software stack,
> but you do not make that software much more composable or extensible than it
> already was.

This is not so. You do not have to write down _all_ the details of the
complete software stack, as build systems can be abstracted away (e.g.
`gnu-build-system` provides the GCC toolchain, binutils, make, etc.). Only
immediate first-level dependencies are declared.

And in many cases you don't have to do even that because you can generate
package expressions from upstream information using `guix import`. There are
importers for CRAN, bioconductor, pypi, hackage, CPAN, and others.

> The user can install the version combination that you packaged, but can they
> easily try their own? Can they easily try to build with a different
> compiler/compiler version/set of compiler flags/dependency version/etc.?

Yes! We use Guix for multiple clusters at a research institute and of course
users must be able to create package variants, with different configure flags
or compiler versions. This use-case is covered very well by Guix.

~~~
tgamblin
My point is mainly that the package specification in Guix is very verbose -- I
must hack (and in some cases generate) package files to do what I want to do.
The approaches you mention for generating package variants seem to generate
new packages -- how do you deal with the profusion of packages that result
here?

Don't get me wrong -- Nix is the inspiration for a lot of what Spack is doing,
but we've added to it by making an attempt to template the packages by
version, compiler, and options. So the user doesn't have to "create" the
package variant at all: they just `spack install numpy %intel@15.0.1` to
compile with the Intel compiler, because the compiler can be swapped in/out of
any build.
We do similar things with versioned interfaces like MPI -- you can swap
OpenMPI or MVAPICH in/out of a build. I have not seen anything to lead me to
believe Guix allows this in a simple way, without generating a new package and
copying a lot of boilerplate from the old one. The graph transformation stuff
you mentioned in your other comment is promising, though.

------
tadfisher
So many useful tools are packaged in language-specific package management
schemes, and they are a pain to deal with in conjunction with a system package
manager. I'm speaking of nice command-line applications that work like Unix
utilities, that simply parse STDIN and write to STDOUT, but for some reason I
have to install PyPI/NPM/gem/whatever to install them, and figure out what
language-specific idiosyncrasies I have to massage to get them to work (venv,
rvm, and whatnot).

Like the author, I want language implementers to split the packaging realm:
anything that's a pure source dependency can and should be packaged with
language-specific tools, but anything that runs on the system _should_ be
distributed in some sort of universally executable format, such as a launcher
shell script with an application directory in /usr/bin.

Otherwise you're in this current mess where anyone writing a system package
manager has to deal with language runtimes and fight with the language-
specific package manager to handle
distribution/installation/removal/rollback/versioning in a coherent way with
the rest of their platform. Moreover, the user has to make the decision
between installing a package from their system vendor or from the language-
specific repository.

So I guess what I'm advocating for is for the "language package manager (LPM)"
concept to go away. If you need to distribute source dependencies, that should
be handled on a per-project and not per-language-environment or per-system
basis. If you need runtime dependencies, distribute them as a tarball in a
standard directory format and let system vendors package them like the rest of
the software they manage.

The JVM world has good things going with Maven and Gradle. I never have to use
some Java command-line package manager to install a global dependency; rather,
I'm declaring what my project needs and letting the build tools handle the
rest. I never have to deal with conflicting dependency versions between
projects or language versions. And binary Java applications are easily
available via my system's package manager. There is just no need for the
middle layer.

~~~
drewcrawford
The problem is that, as a software developer, I am not going to learn 3 PMs to
distribute my free software (yum, apt, homebrew) nor am I going to navigate
the politics of Fedora, Debian, Ubuntu, etc. for getting my free software
included in their repositories and/or keeping it current. I have things to do,
like write software.

Language PMs are "good enough" to solve the problems of the people actually
writing the software, and in most cases that is really all that matters. I'm
sorry they're not good enough for you, but it's not my fault that Rust,
Python, and Go will let me upload to their package manager but
Debian/Fedora/whatever you use will not.

Your fight is not with the language PMs, it is with your OS vendor who will
neither give me upload rights nor send out someone else who does, who neither
designs tools for me to use nor sends someone who already knows how to use
them.

To be clear, I think there are good reasons why OS vendors won't change their
ways, but software developers won't change our ways either, so neither
language PMs nor OS PMs are going away any time soon.

~~~
snuxoll
> Your fight is not with the language PMs, it is with your OS vendor who will
> neither give me upload rights nor send out someone else who does, who
> neither designs tools for me to use nor sends someone who already knows how
> to use them.

As far as Fedora is concerned, becoming a package maintainer is a fairly
straightforward process, and you are more than welcome to package up your own
software and submit it to the software collection, or to solicit someone to
assist you with it. The rpmdevtools package contains a super-useful utility
called rpmdev-newspec that provides templates for creating packages for most
popular standards out there already (Ruby gems, Python setuptools/distutils
scripts, CPAN modules, etc.) - it's really easy to get started, and you can
always push your packages up to COPR without needing sponsorship from the
existing packaging team.

~~~
yxhuvud
This is missing the point. The problem is still that there are a million
different system PMs, but (usually) only one for each language. The package
maintainer has no incentive to care about Fedora, regardless of how easy it
is. The real solution is not for Fedora to absorb all language-specific
packages but to integrate the different package managers so that they work
better together.

~~~
snuxoll
> The real solution is not for Fedora to absorb all language specific packages
> but to integrate the different package managers so that they work better
> together.

Which is precisely why rpmdev-newspec has templates for this case. Simply
running rpmdev-newspec -t ruby/python/perl/php-pear/ocaml/R mypackagename will
generate an rpmspec file ready to go for any of these languages, using their
standard package managers (gem, distutils/setuptools, CPAN, pear, OPAM,
packrat) - just add the necessary metadata (Requires/BuildRequires, summary,
description, version, and a changelog).

It's not magic; it still requires minimal effort (3 minutes' worth of metadata
plus calling mock to test the build) - but it's hardly difficult. Sure, you
could write some extra scripts to automatically populate the metadata from the
language PM's descriptor of choice, limiting the manual work to release bumps
and editing the changelog - and I don't think anyone would be opposed to that,
but that's pretty much all that would be left.

The biggest problem is that outside of rolling-release distributions like
Arch, Gentoo, openSUSE Tumbleweed, and Fedora Rawhide, major version bumps
within a release are highly discouraged - and a lot of people used to language
PMs have a habit of changing things rapidly even post-1.0 and not supporting
old branches, often leaving the distribution packager needing to backport
patches for security or bug fixes. I'm not going to argue whether that's
better or worse, but it's an issue that needs to be addressed.

------
krick
> OS/system package manager (SPM): this is not why we are here today

Arguably, this might already be the wrong approach. Say what you will, but it
_is_ a problem (and _is causing_ problems) that I can do both `apt-get install
python-mutagen` and `pip install mutagen`, yet what I certainly cannot do at
this point is get rid of one, and only one, of these tools. So before talking
about how I might fuck up writing some language-specific PM, it might be fair
to admit that pretty much nothing actually _is_ language- or project-specific
when talking about computer systems, and to consider why somebody might write
yet another such tool once we're done with our task.

~~~
aikah
The difference is obvious: pip works on all OSes/platforms. apt-get certainly
doesn't.

If I write a language and ask users to use their favorite package manager to
fetch dependencies, I'm not going to build a big community because packages
and their dependencies will not be shared across platforms, or each user will
have to add some manifest for every package manager, or different repositories
will have to be maintained for each platform ...

OS package managers and language package managers ARE orthogonal.

~~~
krick
Except they aren't. Not even remotely. A package for some scripting language
like Python or PHP might depend on binaries, or might contain code written in
another language. Pretty much any project-wise language-specific dependency
manager can include something that is not actually a library or a bunch of
source code, but a _utility_ exposed globally (like a unit-testing tool or a
framework's code generator). There are dozens of projects which started out as
a domain-specific library (and thus a candidate for being operated via a PDM
or LPM _only_ ) and soon ended up being used as a utility, useful to the
end-user (because the difference between a library and a tool here was 50
lines of code exposing some existing methods, which we would have wanted
anyway for testing). Such a tool then gets installed via several package
managers (and additionally from source, because GitHub is far ahead of your PM
repository, yeah…) and can cause version incompatibility problems or something
even more obscure. And that's just off the top of my head: there can be
hundreds of cases which I didn't remember right now or even think of, ever.
And neither have you.

In fact, pretending that these are _different things_ and that a "developer"
is something drastically different from a "user" is precisely the reason why
all (or, well, _most_ ) package managers in existence are so fucked up. The
hard truth we must accept before we talk about package managers is that there
are no hard lines in this domain. It's a complex problem and must be dealt
with accordingly, without exceptional reductionism.

------
reality_czech
Since Go is the language he brings up in the writeup, I have to ask: honestly,
why does Go need a package manager?

Go didn't support shared libraries until very recently, and still doesn't use
them for things other than where runtime pluggability is actually required. If
I want to install a Go program without compiling it, I just use my favorite
package management tool-- the cp command.

Meanwhile, Linux package managers have acceptable versions of many of the Go
programs I want to use. And building from source just uses git to pull the
dependencies it needs. Not every programming language needs to reinvent the
wheel, poorly. This looks like a solution looking for a problem.

~~~
aikah
> why does Go need a package manager

Because a language is nothing without an ecosystem of libraries. These
libraries need to be versioned, and dependencies between these libs need to be
resolved.

> Go didn't support shared libraries until very recently

It still doesn't.

> If I want to install a Go program without compiling it, I just use my
> favorite package management tool-- the cp command.

That's not the issue at hand here. If I'm a developer and I need to publish an
open-source project, I'm not going to publish it with all its dependencies. I
need a mechanism to manage them; that's what a package manager is for.

Go has a half-baked one, go get. It can fetch dependencies, but it cannot
distinguish between versions of those dependencies.

It's interesting to note that a lot of Go developers are actually against a
separate package manager.

~~~
sdboyer
A community, gaslit by years of poor tooling, IMO

------
kazinator
Not only don't I want to write one, I don't really want to use one.

Especially as a user who isn't _programming_ in the given language. Don't make
me install some package manager for your language before I can run your
program.

Self-contained executable or installer, please, or go back to the wood shed.

~~~
davexunit
>Self-contained executable or installer, please, or go back to the wood shed.

Language package managers suck, but this is way worse. Self-contained
executables bundle all of their dependencies, which is terrible for both
reproducibility and security. Users and system administrators have no
reasonable way of identifying, patching, and updating vulnerable
libraries/programs inside these things, leaving them dependent on the
upstream providers of each binary bundle for a bunch of software they didn't
even write. I've seen a growing push to "appify" GNU/Linux lately, with
Docker, xdg-app, OmniBus, etc., and I have become very worried about what will
happen if this becomes the dominant way of distributing software on GNU/Linux.

Package management is good for users. We need a safe and sane way to
_maintain_ the software running on our computers. The control ought to be in
the hands of the user, not the developer.

~~~
kazinator
> _Self-contained executables bundle all of their dependencies, which is
> terrible for both reproducibility and security._

That is only a religious belief. If two programs share a common library, an
upgrade to that library (say, a security fix) could fix things for one
program, but introduce a hole into the other. Every upgrade to some shared
piece must be validated by the developers and QA of every program that uses
it. 99 programs could be fine with the change, and the hundredth could break.

And if you want utmost reproducibility, then you in fact need a given version
of a program to have its exact dependencies, so that you're running exactly
what the developers are running. If program X needs libfoo.1.2, and program Y
needs libfoo.1.3, and the programs are actually bundled with their specific
version of libfoo, then you have better reproducibility than if libfoo.1.3 is
foisted upon program X because program Y requested that version.

The model where you have one libfoo only works if everything is open source
and packaged by an upstream distribution, which takes care of curating the
entire combination of stuff, so that when program Y needs libfoo.1.3, the
entire distro is officially pushed forward to that libfoo version; it becomes
the official libfoo for program X also. What you have matches the upstream and
so behaviors are likely to be reproducible. If the vendors for different
programs are completely independent, then you in fact sometimes need multiple
versions of dependencies.

~~~
nextos
This is why we need stuff like Nix or Guix, where you get the best of both
worlds. Packages can depend on different versions of libraries, there is still
good accountability of what each package depends on (critical for fixing
urgent security bugs), if the same library is used it is shared, things can be
easily sandboxed, and they are also _reproducible_.

------
scrollaway
Reposting my comment from the other day, since it's something I've been
struggling with for years. I'm only halfway through the article, so maybe I'm
repeating what's already been said, but...

    
    
      I really wish we'd get generic package management.
      A proper package management server and protocol, that is.
      Something that can be used and picked up by Windows and Linux
      distros but also by domain-specific packages (pypi, etc), browser
      extensions, games with addon repositories and so on.
    
      It's sad that right now we can't answer "and now we need package
      management" with "sure, let me fire up pkgserve.d" or something.
    

I think a package management server would be a good first step. Package
managers (clients) themselves will inevitably be numerous as different use
cases call for different UIs. Android wants a pretty user-facing store. Python
wants something that can interact with the python environment at a low level.
Debian wants a million checks and firefox wants to integrate with the browser
anyway.

~~~
sdboyer
Well, spoiler then: I don't advocate for generic package management. That's
someone else's problem. Maybe, if the ideas I've put forth in there lay useful
foundations, it's something that could happen, but I very much doubt it's
practicable.

Even if it does happen, though, I suspect that end-user package management
still ends up looking different from developer package management.

~~~
scrollaway
I agree. I don't think "generic package management" is possible at all. I
think it's possible to make it _more_ generic than it is now (namely by having
similar domains agree on single package managers), but these issues are far
less relevant.

What is possible I think is to have generic package management _servers_ and
_protocols_. The same way you have HTTP servers right now and don't just roll
your own domain-specific protocol for your own domain-specific web just
because it's in go instead of python or whatever. And yes, you end up with
multiple web _browsers_ and that is _fine_. You even end up with domain-
specific browsers (eg. embedded in games, or as a library, or "live apps" and
what have you) and that's all fine.

What I see, whenever I see a new package manager, is a colossal waste of time
that could have been avoided.

------
rixed
I can't help but think that if, 10 years ago, submitting Debian packages had
been made easier rather than harder, we would not have this ugly mess today
(and Debian would be in better shape, too). As a software author, all I want
is to provide others with a simple way to install my stuff. I'd like to write
a simple deb spec and say "just apt-get it". And the fact that not everybody
is running Debian is not really a concern, since packages can be automatically
converted most of the time. But that is not possible, not because writing deb
specifications is hard, but because pushing any new thing into Debian requires
going through a "becoming an official Debian maintainer" process involving
lots of steps, including convincing some people that you and your software are
worth it, an ID check, etc. I do not want to become an official anything, so I
just git push to whatever smaller package manager is used in that community.

------
sdboyer
As long as I briefly have HN's attention - how did y'all find the repo-as-
universe, and universe alignment, metaphors? Useful?

(This was the diagram leading up to it:
[http://imgur.com/bzy22DA](http://imgur.com/bzy22DA))

~~~
skybrian
I liked it. Nice article. It makes me happy about Dart's design decisions in
this area. Hopefully something nice happens for Go.

In the hazy future, one thing I wonder about is how to do back-pressure
outside the monorepo: if upstream makes a mistake and breaks something
important downstream, how quickly do they find out? Today this happens
informally. Ideally (from a responsible maintainer's point of view) you'd have
an easy way of finding out before publishing a new version, and the second-
best approach would be automatic notification soon after. This might involve
speculatively compiling and testing some downstream packages to find out how
bad the breakage is, and probably involves coordinating continuous builds
somehow, or perhaps making continuous builds the job of the central repo
(making it even more centralized).

------
revelation
If you are going to build a package manager, may I humbly suggest taking the
lessons of late and using a DHT + strong signing of every package?

This practice of forever chaining a language to a bunch of fragile servers out
there that go bad at the rate of links is worrying.

~~~
sdboyer
I considered including some mention of crypto assurances of packages in the
article. I did not, because it would take someone who knows the constraints
dictated by such systems far better than I to come up with a means by which
such mechanisms could be integrated into a PDM...and have people still use it.

Most of the integrity of packaging systems (at the PDM level) derives from the
assurances provided by the underlying VCS - e.g., Git's tamper-proofing
assurances by virtue of how its commit DAG is built. If your system has a
registry in the middle, that does create an SPOF; if that registry
intermediates the VCS with tarballs, that takes away the clients' ability to
rely on the VCS for verification. So, in my naive view, that's the point at
which signing becomes more crucial.
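
That Git tamper-proofing assurance falls out of hash chaining: each commit ID covers its parent's ID, so rewriting any ancestor changes every descendant's ID. A toy Python sketch of that DAG property (real Git hashes full tree, author, and timestamp metadata, of course):

```python
import hashlib

def commit(parent_id, tree, message):
    """A commit ID covers the content *and* the parent's ID, so the
    whole history is sealed under the tip hash."""
    payload = "%s|%s|%s" % (parent_id, tree, message)
    return hashlib.sha1(payload.encode()).hexdigest()

c1 = commit("", "tree-v1", "initial import")
c2 = commit(c1, "tree-v2", "release 1.0")

# Rewriting history changes every downstream ID, so a client that
# pinned the tip hash c2 will notice the tampering:
c1_evil = commit("", "tree-v1-backdoored", "initial import")
c2_evil = commit(c1_evil, "tree-v2", "release 1.0")
assert c2_evil != c2
```

This is exactly the verification a tarball-serving registry takes away: with no commit chain to check, clients must trust (or separately verify signatures on) whatever bytes the registry hands them.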

------
zacharypinter
Just wanted to say this was an awesome article. I have no immediate need to
write a package manager, but it was a great overview of an interesting and
tricky domain. Thanks!

------
jokoon
I wish there were some option to quickly install libraries when using C or C++
on Windows.

It's just unbelievable that Windows doesn't have this yet.

The thing I hate doing is compiling a lib for a particular version of MSVC and
adding it to my project, which means adding the include path, the lib path,
and the lib name, in both debug and release, after having run CMake on it.

Windows is still a little cranky.

~~~
Pxtl
Well, MS has NuGet now for C# development, and I think it supports C/C++, but
I don't imagine the community is huge.
------
jfb
I enjoyed this a great deal, but one thing that I don't see touched on in any
of these package management posts is the difference between system-level and
user-level package management. Maybe it's because I use a Macintosh, and thus
have never been subject to the joys of apt/yum/pacman et al, but I've never
been comfortable with the idea that I could install some binary or library and
it'd be puked into the system execution context. I use homebrew, which has its
problems no lie, but I can upgrade my own versions of software that might
conflict with system level stuff without fear.

I'd like the distribution to stick to its knitting and leave the decisions
about my personal world of versioning and dependencies to me.

~~~
chriswarbo
> Maybe it's because I use a Macintosh, and thus have never been subject to
> the joys of apt/yum/pacman et al, but I've never been comfortable with the
> idea that I could install some binary or library and it'd be puked into the
> system execution context.

I don't use a Macintosh. Is their OS not built out of binaries and libraries?
How are those managed, if not by a package manager? I was under the impression
that Apple called their package manager "the App Store".

> I use homebrew, which has its problems no lie, but I can upgrade my own
> versions of software that might conflict with system level stuff without
> fear.

It's true that some systems' package managers don't support separation of each
user's packages from each other, or from system-wide packages. For example,
I've used dpkg and rpm which don't support that. However, some package
managers _can_ handle user-specific installation (e.g. Nix can), so I don't
see how using a combination (e.g. using Nix for user-specific packages, in a
Debian system managed by dpkg) is any different from using homebrew alongside
whatever-the-Macintosh-package-manager-is.

I've found that using the same package manager for both makes life easier
though; e.g. it avoids having two copies of something installed, since one PM
didn't spot that the other had already installed it.

> I'd like the distribution to stick to its knitting and leave the decisions
> about my personal world of versioning and dependencies to me.

Personally, I consider myself to be in charge of my computers, so my personal
decisions about versioning and dependencies apply to both user-specific
packages _and_ the whole system; e.g. if I want to test on different versions
of Python, I should be able to; if I want to have my kernel built with
different compiler flags, I should be able to.

~~~
aikah
> I don't use a Macintosh. Is their OS system not built out of binaries or
> libraries? How are those managed, if not by a package manager? I was under
> the impression that Apple called their package manager "the App Store".

The App Store isn't a package manager; it's a store where end users download
applications, like Google Play, and it has a lot of restrictions regarding
what can be published and distributed. One certainly cannot distribute
libraries through the App Store or register alternative repositories. And the
App Store doesn't resolve dependencies or anything like that.

------
lobster_johnson
This is by the author, or one of the authors, of Glide, which so far is the
best dependency manager for Go that I have come across:
[https://github.com/Masterminds/glide](https://github.com/Masterminds/glide).

~~~
sdboyer
Glad you like glide, and thanks! Though I'm at the very most a co-author -
I've contributed only ideas so far, and maybe a comment fix or two.

Now that I'm finally done writing this, maybe I can write some code.

~~~
favadi
Slightly off topic, but last time I tried, I couldn't add a new dependency
with glide without upgrading existing dependencies. Can glide do that now?

------
wslh
One important issue that gave me a headache in the past (at least for NPM) is
the reproducibility factor. You can't reproduce an installation using NPM.

That's because with NPM you can install a specific version of module A at time
t0 that also installs, based on package.json, the latest version of module B.
But at t1 there is a new version of module B, so a fresh installation of the
same version of module A ends up with a different version of module B.
Obviously, this problem compounds dramatically when you use multiple modules
with different degrees of depth.
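
That t0/t1 drift can be made concrete with a toy resolver that, like npm's default behavior for floating ranges, always picks the newest published version (module names here are hypothetical):

```python
def resolve(registry, name):
    """Pick the newest published version of a module -- what a
    floating 'latest' constraint does at install time."""
    return max(registry[name])

# t0: installing A@2.0 pulls in the newest B available -- B 1.1
registry_t0 = {"B": [(1, 0), (1, 1)]}
install_t0 = ("A", (2, 0), "B", resolve(registry_t0, "B"))

# t1: B 1.2 has since been published; the *same* A@2.0 install
# now yields a different dependency tree
registry_t1 = {"B": [(1, 0), (1, 1), (1, 2)]}
install_t1 = ("A", (2, 0), "B", resolve(registry_t1, "B"))

assert install_t0 != install_t1   # same A version, non-reproducible tree
```

Pinning the full resolved tree (as npm's shrinkwrap does) is what removes the dependence on install time.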

Fortunately, mook pointed out the mention of shrinkwrap in the article.

~~~
davnicwil
Whilst this is true, it's clearly bad practice to have a 'latest' dependency
in any module that is itself designed to be depended upon.

Personally I've never seen this be a problem in practice. Possibly because I'm
lucky, possibly because I've only used dependencies that don't do this. Since
I mostly use quite popular libraries, and check the dependencies of the more
niche ones I decide to use for exactly things like this, I think the latter is
probably the case.

Out of curiosity, I wonder if anyone here has hit this problem in practice
with npm and could provide a concrete example - in particular with popular
open source dependencies?

Yes it's a possible problem in theory, but a lot of problems in software are
possible in theory but rarely occur in practice because people follow sensible
patterns to guard against them.

~~~
streptomycin
I hit it with an unpopular package (like 10 stars on Github). A dependency of
a dependency slightly changed its API in a patch release. Fun debugging. So
glad we have this auto-install-newer-untested-versions feature as default.

------
dalke
With 7 links to the same essay in the last 12 articles, and an aggregate of 24
upvotes, I would have thought someone would have made a comment in at least
one of them by now.

I'm only halfway through it - package managers aren't something I'm used to
dealing with. It makes me wonder though, just how many package managers are
there these days? Even within Python, I've heard of several.

------
anthk
If you use Debian or a derivative, consider pypi-install, cpan2deb and any
wrapper which builds debian packages from $LANG repos.

------
k__
TL;DR stop and use Nix.

SCNR.

~~~
nextos
Or its sister project Guix. I really like some of their ideas, e.g. the guix
challenge command [1].

[1]
[https://www.gnu.org/software/guix/manual/html_node/Invoking-...](https://www.gnu.org/software/guix/manual/html_node/Invoking-guix-challenge.html)

~~~
iso-8859-1
Sister project? That implies that they are somehow affiliated...

~~~
davexunit
There's overlap in the community/technology, but they are separate projects.
Oftentimes people think that Guix is a fork of Nix, but this isn't so. Fun
fact: Guix was started by a former NixOS developer.

------
yyin
I wrote my own "package manager".

It's nothing more than three short portable shell scripts, using only ftp,
sed, gzip, tar, rm, cd, etc.

Credit to the pkgsrc folks for making things simple enough that this is
possible.

However, my usage of "packages" is minimal. I prefer statically compiled
binaries that I compile myself. And I write scripts to automate the fetching,
patching, and compiling.

