After all this progress, we still don't have a stable alternative, there's no clear date for when one might be ready, and, even worse, the community is still not sure vgo will be the final answer. Golang should be about pragmatic, practical decisions. The packaging situation is just the opposite.
That said, if you ignore the doubts and arguments about the sanity of its dependency resolution algorithm, I think vgo's introduction of modules is its real contribution.
Go packages have been problematic from the start because their design naively conflates a bunch of concepts that most developers keep separate: file location, file system structure, and source repository. That is, when you do 'import "github.com/foo/bar"', a whole bunch of conventions are at play that dictate where the package can be fetched from and where it should live in the file system. Aside from the fact that the $GOPATH convention is maddeningly annoying to many developers, the design has several issues, such as that an entire git repository has to be fetched in order to use a nested package, or that a package can have multiple import paths. But getting rid of $GOPATH is in itself a huge deal.
I'm personally less concerned with the new MVS algorithm (though I personally prefer the more traditional lock file approach such as that used by dep, Cargo, Bundler, NPM, etc.), as long as I can get modules. While getting modules retrofitted into dep without all the other stuff in the proposal would be lovely, I don't know if the module design is too dependent (heh) on the rest of vgo's versioning thesis to be separated out.
It's just so funny how carefully the Go community seems to tread about not breaking their language, with all these Go 2 proposals and general resistance to new features — and then completely abandons that spirit and "moves fast and breaks things" when it comes to package management.
I feel the opposite. Lock files are awesome when you're developing a package, but after you release it, you are at the mercy of all your dependencies to make sure they don't break semver. Each "npm install" will ignore any lock file your package was using during development.
It would be awesome if I could distribute a package-lock.json file with my npm package and have yarn/npm use it when resolving packages on fresh installations.
Yeah, this would mean that there may be duplicated libraries in your node_modules tree, each with a slightly different version, even if they're semver-compatible. However, people break semver too often anyway.
I want to deploy a package to npm and be sure that it will never break, ever.
Furthermore, even if I use exact versions in my package.json, that doesn't stop my referenced packages from internally referencing semver-compat versions.
At the very least, you should reference exact versions at the top level. It is better than nothing.
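To make that concrete, here's a hypothetical package.json fragment (the dependency names are made up for illustration):

```json
{
  "dependencies": {
    "some-lib": "1.3.0",
    "other-lib": "^2.1.0"
  }
}
```

On a fresh install, "some-lib" resolves to exactly 1.3.0, while "^2.1.0" permits any 2.x at or above 2.1.0. But even the exact pin only covers the top level: whatever version ranges some-lib declares in its own dependencies are still resolved independently.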
That's true for npm, but not for bundler (luckily).
A Go package is similar to a Java or Python package; it's just a namespace. A vgo module can be thought of as a collection of packages that have a version number and a single canonical name.
I don't find the terms particularly confusing, given that Go doesn't already have anything called a module. It would really only be confusing if you're deep into another language that has "packages" and "modules". I personally don't find it difficult to context switch like that.
Certainly it's not that big of a confusion, but still will probably be one in discussions, especially for polyglot programmers.
Not really because:
- The scope is different. dep is a package management tool. vgo is more than that. vgo intends to fully replace the go command. This is how we deprecate GOPATH.
- The underlying principles are different enough to make it difficult to "improve" on dep. It looked easier to start from scratch.
"Go += Package Versioning", Russ Cox, https://research.swtch.com/vgo-intro
"FAQ: Why is the proposal not “use Dep”?", Russ Cox, https://github.com/golang/go/issues/24301#issuecomment-37122...
That said, I agree it's super frustrating, and the vgo announcement just had zero finesse. This isn't a language of experimenters hacking away; there are large production systems that run solely on Go code. Upending the main tool (`go` itself) and the versioning strategy is not to be taken lightly.
> Go is rapidly losing the momentum it had for a while now that Rust is starting to become a more stable target and given Go’s inability to address package versioning.
Here I disagree. Rust is fine for small utilities right now, but if you start building larger apps, developing with futures, tokio, and some of the other async libraries is a nightmare. The APIs are moving really quickly (`futures` hit 0.2, `tokio` is on 0.1 so I don't blame them really) and the ecosystem is still extremely immature. I'm excited for Rust, but it's not yet at the point that I want to build a complex application.
I think Russ did an unusually good job of explaining the ideas around vgo, grounded by using real world examples. He set a high bar there, but hopefully Sam adopts some of the same methods of very concrete examples to explain his criticisms.
As it stands this is an interesting piece of reading, but it is too high-level and abstract in its criticism to be effective. Looking forward to future installments that will hopefully make Sam's criticisms clear.
From what I can tell, one piece of criticism is the inability of vgo to declare incompatibilities — that is, if you know certain versions won't work, you can say never to allow those versions in your dependency graph. I'd love to hear more examples of this in the real world. Since vgo dependencies are basically always "pinned", this seems like not that big a deal to me, but maybe I'm missing something. Perhaps this is something that occurs when two of your dependencies depend on a third package, and one of them works only with one version of it while the other works only with a different version, or somesuch.
Maybe there are ways around that, or maybe that's not the actual problem (again, I hope Sam explains more in the future). I think there is a really interesting element to MVS that will actually drive Go culture (and I'm a big believer in the importance of the culture of a language), which may minimize the problems Sam is talking about, for the betterment of the community.
As others have noted, I hope that Sam comes out with the rest of this series quickly so that the Go community can commit to a direction sooner rather than later and move on to actually building this new ecosystem.
It sort of asserts various things work or don't in the real world, which is strange to me, as a lot of that behavior depends on what incentives/etc exist and these are very emergent systems that change.
It doesn't seem to consider very heavily whether vgo will succeed in changing the way people operate at all (despite this kind of thing happening all the time). Instead it asserts what the future will be based on the current state of the world and unsupported claims about how people work, and asserts that vgo won't change that in various ways. It's all very minimal actual explanation wrapped in a lot of language.
It's also all random opinion, simply stated in a matter-of-fact way, with no real data cited anywhere to back it up. The only thing coming close to a data-backed argument then says "we'll look at this in a future part of the series".
He does say this is the first of a series of articles, so I think this is an uncharitable criticism. Dependency management is a very big topic and talking about it with sophistication — which you need to do to actually solve the problem well — takes a lot of words.
> I think Russ did an unusually good job of explaining the ideas around vgo, grounded by using real world examples.
Sure, it's always easier to understand something when you talk about it in terms of small carefully-crafted examples. But with something like the emergent effects of dependency management tool UX choices on the health of a software ecosystem, you lose a lot when you limit your discussion to "here's what the algorithm does on these three toy packages".
1. Probably this dependency version works, because I have no reason to believe otherwise
2. This dependency existed at the time I released a version of my library
3. This dependency existed, and I used it during testing and development
4. This dependency is co-maintained, so future versions of the dependency are likely to be compatible per semantic versioning
5. My library means nothing to this dependency, and they may or may not break it based on their whims
6. We don't really talk, but if there's an issue whoever you tell first may well fix it
7. I will release dot-releases to make sure future versions of the dependency are supported
8. I have specifically confirmed certain versions of my library and the dependency are incompatible
Most of these assertions are social and incomplete. Assuming a version, once released, is never changed or retracted, then one of the few firm guarantees you can declare is incompatibility. Declaring compatibility doesn't make it so, it only indicates a desire (and it's not even clear whose desire).
On top of that, compatibility should really be more like metadata, not included in the package. Compatibility and incompatibility is a discovery process (along with security).
In practice it's fine though, because it's ultimately the integrator (the person making some actual application) who has the responsibility to ensure everything works together.
The expression is burying the lede, not lead.
This is consistent with what I've seen with systems like Cargo. If I had to make a list of top 10 issues I run into with Cargo, theoretical scalability of the core dependency resolution algorithm wouldn't make the cut.
If you compare git to rival tools such as Mercurial and Fossil or even Darcs, it's pretty evident that git didn't really win on technical merit, user friendliness or indeed hardly any other metric. It was fast, and it had the dubious advantage of being better than a lot of crappy alternatives at the time, and of being the product of a famous developer. But if technical merit was how we chose software, we might all be using Mercurial today. It's quite possible we'd be better off, too.
git's punting of "certain of the hard theoretical problems" led to major shortcomings in its design. For example, git does not track branching histories. Once you merge your changes back into a branch and delete the origin branch, the path that your commits took is lost. Another example: since git's data model works entirely on repository snapshots (there are no "patches" or logical operations, only snapshots), it doesn't know about file renames. Some commands, like log and diff, do understand how to detect renames (but not out of the box!), but it's fundamentally a fuzzy-matching operation that can only work perfectly if you separate your rename commits from modification commits, which is rarely feasible.
Don't get me wrong, git has overall been a force for good, but I think it's a pretty terrible example in this particular case. In fact, I'm not sure I can think of many cases where the "worse is better" philosophy was ultimately a good idea. In every instance that comes to mind (MySQL, MongoDB, PHP), the fast-and-loose attitude ultimately came back to hurt everyone, and years were spent paying for those mistakes, plugging one hole at a time.
The original decision by the Go team to build package management into the language ("go get") was made without really understanding the problem, and this decision has been haunting us pretty much since day one, with developers having to chase unofficial package management tools of the week in order to get their work done. I really want it to be done right, even if takes a bit longer. It's a problem worth solving well.
> git may have become successful, but that doesn't mean it's right.
If bitbucket had that business model instead of the inverse we’d all think of git as that weird bad hg that Linus forces the Linux devs to use.
Git is a classic example of other factors than the software driving use.
Seriously, give it a try.
Git would be dead in the water if it weren't for being written by Linus and being a requirement for interacting with Linux development.
Your statement is only true to the degree that it's a tautology: Git's popularity demonstrates that it's right and the definition of "right" is "popular".
But if you want to use "right" to mean anything else, maybe to say something about its technical merits independent of sociological factors like the network effect, first-mover advantage, high-profile early adopters, etc., then your sentence doesn't add any information.
Personally, I think Git's user experience is an unremitting shitshow, the kind of disaster that makes one reconsider Hanlon's Razor.
The technical capabilities buried under that UX are pretty nice, though you could probably discard 1/3 of them without impacting any noticeable fraction of users.
The performance is excellent, and it's very easy for developers to underrate how much that affects user satisfaction.
And it had the good fortune to win on almost all of the sociological factors that largely determine product success.
When git became successful, there were scant few other options with a comparable feature-set regardless of speed.
Its distributed nature, ability to easily merge and rebase sets of changes, etc, were all wonderful solutions to real problems.
I'm unconvinced that its success was because it solved fewer problems than the state of the art, but rather that it solved more. This is a stark contrast to vgo which is intentionally solving fewer problems than other modern dependency management tools.
In addition, git had to be performant enough to handle a huge repo (namely the Linux kernel, its original use case). Beyond that, I think there's no evidence it intentionally cut features or complexity to be faster.
In the case of vgo, it cuts user-facing features in order to be faster, but there's zero evidence that the speed matters — that there are any Go projects in the world that cannot be resolved quickly enough with an approximation of a SAT solver.
At the time, the choice of git wasn't that obvious. There were several popular, quite attractive contenders, including Mercurial, Monotone, Bazaar and GNU arch (and its forks), and it wasn't obvious who would win. But then Github arrived and changed everything, and it felt like everyone was suddenly caught up in a historic momentum whether they liked it or not.
We've had lots of those historic moments, some of them more slow-moving than others. Mac and NeXT being outmaneuvered by Windows; Lisp, Dylan and Smalltalk being relegated to the dustbin through the rise of more popular, worse languages; and so on. Not all of these developments are terrible (I love that Nginx prevailed over Apache and that Rails took over from Java app servers), but it's ridiculous how often the paperclip solutions win over the more thoughtful ones.
Git won almost entirely due to two factors.
1. The linux kernel used it so that brought some prestige.
2. Github made git hosting easy and free for a lot of people and had the cultural cachet to drive adoption.
Git has a very solid underpinning. But its user interface and lack of guard rails have always made it painful. People endure the pain because, if they don't, they won't be able to use the tool their industry has chosen. But some of us wish a different choice had been made in the DVCS arena.
I remember installing cogito to act as a front-end for git because very early on it seemed obvious that (a) git was going to win, and (b) it was going to win in spite of its UI, which was saying something.
The thing is, modern dependency management tools for development suck. They are complex and slow and terribly unreliable.
A fresh, unconventional approach is needed, and vgo may be it.
They were glacially slow and difficult to use. But they had very roughly the same feature set; in particular, the distributed nature.
In a totally different domain which involves working out diffs for and managing gigabytes of source code....
Sure you can get around it with having a common workspace where all build artifacts are dumped, but still have to go for a coffee while dependencies like gtkrs-pango get compiled.
Problems can be NP-complete. Algorithms cannot - they can be exponential time (and if your choice is between exponential and double exponential, you pick the former!)
In practice, and especially for SAT solving, as far as I understand the state of the art is that the general case is NP-hard but we have tools that work surprisingly well in those cases that reality tends to produce. In some more academic cases, we even have heuristic solvers that are not even guaranteed to terminate, but usually do so within a couple of seconds, so you can get away with the "beginner's solution to the halting problem".
You're correct. Modern version solvers need fairly sophisticated algorithms, but it's not rocket science. It's mostly a non-problem for most users of most package managers most of the time.
vgo bundles several improvements, a major one being the introduction of "modules", which are versioned collections of packages. Modules, among other features, will let us decouple the import path from the file system structure and source repository, making the awkward and much-maligned $GOPATH finally obsolete.
vgo also introduces a new dependency resolution algorithm, minimal version selection (MVS), which differs from traditional dependency resolution algorithms in that it will at any time choose the oldest allowed version that satisfies the minimum version constraints specified in the list of imported packages, where other systems choose the newest. As MVS does not require (as with dep) a complex boolean SAT solver with potentially pathological, NP-complete cases, it is much simpler and faster to compute. However, it also has downsides. (Edited, thanks to pa7ch for correction.)
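A minimal sketch of the idea, assuming a toy model where versions are plain integers and each module declares the minimum version it needs of each dependency (the module names and requirement table are invented for illustration; this is not vgo's actual implementation):

```go
package main

import "fmt"

// Hypothetical requirement graph: "module@version" -> the minimum
// versions that module requires of its own dependencies.
var requirements = map[string]map[string]int{
	"A@1": {"C": 1}, // A v1 needs C at v1 or newer
	"B@1": {"C": 3}, // B v1 needs C at v3 or newer
}

// mvs walks the requirement graph and, for each dependency, keeps
// the highest minimum version requested anywhere in the graph —
// never the newest release that happens to exist.
func mvs(mainReqs map[string]int) map[string]int {
	selected := map[string]int{}
	var work []string
	for mod, v := range mainReqs {
		if v > selected[mod] {
			selected[mod] = v
		}
		work = append(work, fmt.Sprintf("%s@%d", mod, v))
	}
	seen := map[string]bool{}
	for len(work) > 0 {
		node := work[0]
		work = work[1:]
		if seen[node] {
			continue
		}
		seen[node] = true
		for mod, v := range requirements[node] {
			if v > selected[mod] {
				selected[mod] = v
			}
			work = append(work, fmt.Sprintf("%s@%d", mod, v))
		}
	}
	return selected
}

func main() {
	// Even if C v4 has been released, MVS picks C v3: the highest
	// minimum that any module in the graph actually asked for.
	fmt.Println(mvs(map[string]int{"A": 1, "B": 1})) // map[A:1 B:1 C:3]
}
```

The key contrast with SAT-based resolvers is that there is nothing to "solve" here: taking the maximum over stated minimums is a simple graph walk with one deterministic answer.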
The author likes pretty much everything about vgo except the MVS algorithm, which has been deeply controversial since the proposal was first published. The author explains why he thinks MVS is a bad fit, but this is a complicated topic, so you really have to read both the vgo proposal and this rebuttal to understand what it's all about.
The article was written by Sam Boyer, one of the designers/authors of dep. While he has been courteous about it, there are obviously emotions at play; during its development, dep seemed to have the blessing of the official Go team, and this proposal came out of the blue and probably felt like an ambush.
Other algorithms would select the newest release within a major version even if no module has ever tested it and it was released 5 seconds ago.
When authors do make incompatible changes and MVS selects a broken dependency, as expected, there are manual escape hatches. More complicated NP-complete algorithms would have more of an automated answer here by allowing dependencies to place boolean constraints (greater than X, less than Y, not Z, etc.) on versions, so that there is a collaborative summation of constraints to work around authors violating semver, or bugs. To summarize poorly: it seems Sam Boyer regrets that dep didn't do what vgo does by default, but believes that a SAT solver could kinda do MVS for the happy path and still solve against the introduction of boolean constraints.
Personally, I'd like to let vgo play out a bit, I'm not convinced boolean constraints are something I really want imposed on me by my dependencies.
More info for those who are looking for more context:
Whether you agree or disagree, it's novel research in this field and excellent technical writing, worth reading in its entirety over a cup of coffee.
In other systems like Cargo and NPM you have a "lock" file which specifies the exact current version of dependencies to use, e.g. if you say you depend on "foo >=1.0" and when you build your project it actually downloads "foo 1.4", this will be written to the lock file. When foo 1.5 is released, it won't be automatically upgraded even though you said it should be fine.
vgo is exactly the same. When you first add foo, it will implicitly (via semver) assume you meant "foo >=1.0" and download the latest version - foo 1.5. That version will be written to the lock file (go.mod), and when foo 1.6 is released it won't be automatically used. Exactly the same as NPM and Cargo.
And just like NPM and Cargo you can trivially update to the latest compatible versions of your dependencies with `vgo get -u` (like `cargo update`).
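As a concrete sketch, the resulting go.mod would record the resolved version as a minimum (the module paths and versions here are invented for illustration):

```
module example.com/myapp

require github.com/foo/bar v1.5.0
```

Under MVS, v1.5.0 here behaves like a lock entry: when v1.6.0 is released, builds keep selecting v1.5.0 until you explicitly run `vgo get -u`, which rewrites the requirement to the new latest version.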
It's not that different. The main differences are you can't specify maximum versions, and different major versions are treated as different packages so you can easily have them both.
It relies completely on the community adopting "Semantic Import Versioning", which is basically "v2 is any breaking API change, which must live along-side v1 because it's actually a totally different import". So it blends some of the pros and cons of both dep/cargo-like package management with those of npm's.
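A sketch of what Semantic Import Versioning looks like in source, with hypothetical import paths: from v2 onward the major version is part of the import path, so two major versions are distinct packages and can coexist in one build.

```go
import (
	bar   "github.com/foo/bar"    // v0/v1: the unversioned import path
	barv2 "github.com/foo/bar/v2" // v2+: major version baked into the path
)
```

Because the paths differ, code depending on v1 and code depending on v2 can link into the same binary, at the cost of the ecosystem having to treat every breaking change as a new import path.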
It's... interesting. And I love that it's not an easy answer either way - it's a significant alternative, and it's bringing up lots of new(ish) ideas and approaches that have kinda fallen on the side. That said, I've had concerns about vgo from the beginning, primarily around the "does this make sense in an ecosystem". Package managers are not just "this algorithm is efficient", they have major code-culture effects (and the ecosystem can support or destroy the package manager's goals). This article is hitting most of my concerns squarely on the head, in quite a bit more detail than I've managed to figure out.
Also, https://medium.com/@sdboyer/so-you-want-to-write-a-package-m... is very good. sdboyer has been thinking about this stuff for quite a while - emotions may be there, but he has a LOT of experience backing it up.
To work off / echo pa7ch's comment here: it picks the highest minimum when there are conflicts in transitive dependencies. So if you use A (which needs C v1.1+) and B (which needs C v1.3+), you get C v1.3, even though v1.4 is available. You can upgrade to v1.4, but only by putting "C at v1.4" in your top-level dependencies list (which the tool automates for you) to override those values, and it's assumed safe because it's still importable, therefore still v1-compliant.
I was sceptical, but now I'm kind of sold on the idea after watching the presentation.
I find MVS to be a very natural and simple solution to versioning - I feel like it's much closer to how people actually manage their dependencies in practice.
After an admittedly low-effort skim of this article, most of the arguments seem to be that the author doesn't think it "feels right". Is there something more concrete hidden in here?
Golang is hurting right now because of this issue. In fact this and the lack of a canonical GUI approach are the only two caveats I have when suggesting the use of Go.
I'm not saying to rush, but the constant churn in this space needs to end as soon as it can. For a language with as much of a focus on simplicity and stability as Go, it's embarrassing that this has been an issue for so long.
I have been using Go professionally since sometime between 1.1 and 1.4, and in that time I have used something like 4-5 different ways of handling dependencies in my repositories, based on when those projects were started and/or last overhauled. Each time I changed my approach I was following the current best practices, or so I believed. It's madness, and it has to end sometime.
Started playing with it in 2009. In fun projects I'll use whatever works and/or is fun and/or gets the job done soonest. In professional projects I'm much slower to adopt new tools.
Go was hurting _three years ago_ because of this issue. The pain was tractable and critical. The community made a thorough and good-faith attempt to end the churn with `dep` -- which was summarily dismissed, ignored in part and whole.
I don't see why `vgo` should get the fast track now, when the core team has been dragging their feet on the issue for years.
I've used `dep` on a few projects and I don't mind it. If that's what we use that's fine with me.
> I don't see why `vgo` should get the fast track now, when the core team has been dragging their feet on the issue for years.
I think you misunderstand my point. I'm not throwing my weight behind `vgo`, I'm throwing my weight behind making a long term decision of any kind.
I get that it's important to take your time and get things right. That way you can avoid starting down one path and wasting everyone's time, then switching and declaring that everyone follow you down this one, and so on, and so on...
Oh wait - that happened anyway.
I disagree. I want things to move forward as soon as possible. I've been waiting so long for a dependency management solution and for GOPATH deprecation!
Russ came with his "greenfield" proposal because he was not convinced that dep could become the long term solution to dependency management in Go.
Sam and the whole dep team have been working on package management for more than a year now. If they see important issues with vgo, I think they should be able to explain them succinctly in a post, at least to convince other gophers to hold on.
The Elm package manager seems to take a step in the right direction: changing a function signature forces a major version bump if you want to publish the update. (iirc)
Are there other languages/ecosystems that do something interesting with versioning?
That is a very lazy argument.
It's perfectly reasonable to claim something is flawed for numerous complex reasons.
It's equally reasonable to claim that a complex flaw may be difficult to explain concisely.
Most importantly though, I don't think Sam is claiming that there's an obvious fatal flaw in vgo, but rather that it is not the ideal choice given the problem space.
To make that more subtle argument, it is necessary to fully define the alternatives and problem space, which is what a lot of the words are attempting to do.
The tl;dr would be along the lines of "vgo knowingly makes a set of tradeoffs which I think are worse than the tradeoffs that are possible with another hypothetical option, which I am also proposing"... but that tl;dr is not really a very useful one, and attempting to condense a more meaningful one will lose a bit too much nuance I think.