It's worth noting that Go modules use a novel dependency resolution algorithm that is extremely simple to reason about and implement, fast, and produces more reliable builds than npm/bundler/cargo. That's why I was excited about it, anyway. It removes the ever-present NP-complete assumptions in this space, so from a computer science perspective it's extremely interesting.
I've heard/read this, but I can't tell what is necessarily novel about it...to me, it reads like old-school/boring Maven transitive dependency resolution.
(I'm not holding out Maven as best practice; it's just what I know best in terms of pre-version-range, pre-lock-file dependency management, before those features became state of the art around 2010.)
...that said, Maven does actually support version ranges; ~10 years ago when I last used it, either it didn't support them yet, or we didn't use them, so perhaps that is why vgo seems so familiar. Or I just have a terrible memory.
Anyway, if anyone can correct me on my fuzzy assertion that "vgo is like maven w/fixed versions", I'd appreciate it!
vgo's more-constrained specification for dependencies means there is exactly one right answer, and it can be easily and quickly calculated by both computer and human.
Whether or not this will turn out to matter in practice is still an open question.
How big of a deal is this IRL? Assuming you have 1000 modules, how long should it take to solve the graph?
The only-once constraint also gives the SAT solver a nice out: if you reach a conflict or something that can't be solved cheaply, you just make the user select a version that may not be compatible with the constraints. Bower, dep, and Maven work that way.
So, I'm not sure how happy I would be as a user if my package installer bailed out and asked me to choose!
Out of curiosity, how do you mark your package as "only once" in cargo? I tried googling, and didn't find an answer, but did find a bug where people couldn't build because they ended up depending on two different versions of C libraries!
It does make me wonder if MVS will solve real pain in practice. :-)
It's definitely not a great UX, but at the end of the day the problem can only be solved at the language level or by package authors choosing new names. For instance, in Java you can't import two major versions of a package. Having to bail out when solving for minor versions has been incredibly rare in my experience; I only see it when there are "true" incompatibilities, e.g.
bar: foo (<= 1.5)
> Out of curiosity, how do you mark your package as "only once" in cargo? I tried googling, and didn't find an answer, but did find a bug where people couldn't build because they ended up depending on two different versions of C libraries!
I think it's the `links = ""` key. It may only work for linking against C libraries at the moment, but cargo understands it!
> It does make me wonder if MVS will solve real pain in practice. :-)
Not by itself; semantic import versioning is the solution to the major-version problem, by giving major versions of a package different names. Go packages aren't allowed to blacklist versions, though your top-level module is. This just means that package authors are going to have to communicate incompatible versions out of band, and that the go tool may pick logically incompatible versions with no signal to the user beyond (hopefully) broken tests!
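For anyone who hasn't seen it, here's roughly what both mechanisms look like in a go.mod file. The module paths are hypothetical; the point is that the major version is part of the import path (`/v2`), and that only the top-level module gets an `exclude` directive to blacklist known-bad versions:

```go
// go.mod for the v2 line of a hypothetical module.
// Consumers import it as "example.com/foo/v2/...", so v1 and v2
// are different names and can coexist in one build.
module example.com/foo/v2

require example.com/bar v1.3.0

// Only the main (top-level) module may blacklist versions;
// an exclude in a dependency's go.mod is ignored.
exclude example.com/bar v1.4.0
```

So a library author who knows bar v1.4.0 is broken can't protect their downstream users via their own go.mod; they can only document it.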
Yeah, it seems if the Go system ends up not working out in practice, this will be why.
But because of the minimal nature of MVS, you won't run into this problem unless something else is compelling you to go to the broken version. And by the time that's happening, you'd hope that the bug would've been reported and fixed in the original library.
It'll be interesting to see how it plays out in practice.
(Also, if I have a library A that depends on B, which is being reluctant or slow about fixing some bug, I can always take the nuclear option and just fork B and then depend on that. Basically the explicit version of what Cargo would do through deciding it couldn't resolve w/o creating two versions. But I think the incentives might be set up right that the easy/happy path will end up getting taken in practice.)
1. Singletonish things like global allocators, rayon-core, etc
2. You may have made a package's static state safe to mutate with a mutex, but it could be a bad thing to have different versions of that mutex.
3. Compile-time computed tables (unicode tables, perfect hashes, etc.) could be imported multiple times, ballooning the binary.
4. ABI/type compatibility with any reexported types
Yeah, go has a magic function `func init()` which gets called before main. (You can actually have as many init's as you want, and they all get called.)
Probably evil, though so far it hasn't hurt me in the same way as, e.g., c++ constructors have. Maybe because it's more explicit and thus you're less likely to use it in practice.
1. vgo focuses on the wrong issue (if you're spending a ton of time resolving and re-resolving your dependency graph, maybe the issue is your build process).
2. vgo will get the wrong answers and/or make development much harder
There's a long writeup of some of the ways vgo can go wrong here: https://sdboyer.io/vgo/failure-modes/ and some background here: https://sdboyer.io/vgo/intro/ among other places. And there was a lot of discussion here: https://news.ycombinator.com/item?id=17183101
I'd say there's about a 3% chance vgo ends up being a smashing success that revolutionises package management and gets copied by everyone else, a 30% chance that vgo works well for golang due to their unique requirements but has nothing to offer anyone else, and about a 67% chance it ends up being a failure and being scrapped or heavily revised to scrap the novel, controversial and (arguably) fundamentally broken ideas that set it apart from every other package manager.
But fundamentally, the reason the cargo people aren't copying it right now is that it doesn't even really claim to have advantages over cargo for rust. (There are some quirks in the golang ecosystem which mean you end up analyzing your dependency tree way, way, more than you do in basically any other common language. That makes speed important for golang, but for everyone else, it's almost meaningless.) "We make the unimportant stuff fast at the expense of getting the important stuff wrong" isn't very compelling. :)
Of course, the vgo people would phrase it as "we make the important stuff fast and we get the important stuff right", so...time will tell. But don't expect anyone to copy this quickly; it remains to be seen if it'll even work for golang, and it'd need to be a huge step up from the current state of the art to make it worth switching for other languages and ecosystems.
What are these?
Basically, dep/glide do a bunch of stuff, including recursively parsing import statements because of How Go Works (tm). Other package managers don't, because they have lock files, and central repositories. Go expects you to just be able to throw a ton of raw code into your GOPATH and have it all magically fetched from github, which is super cool, but also very hard to do quickly, and not really something other languages are clamouring to support.
(A lot of attention has been focused on vgo's solver, and it is much faster, but the solver isn't what takes up all the time; the speedup from dep/glide to vgo seems to be almost entirely related to the changes in how dependencies are declared. Saving 10ms on a more efficient solver algorithm means nothing if the overall process is spending 12s grinding through slow disc and network access.)
And when you survey the language ecosystem, you see a lot of languages very enthusiastically committed to traditional package managers (with lock files) and centralised repositories. Cargo, composer, npm/yarn, bundler/ruby gems - recent history is full of languages happily moving in that direction. Go is an exception, and I don't see anyone actively copying that quirk any time soon.
When you just decide not to address a significant part of the problem, the solution becomes simpler.
You mean a bug? Because that's what that is, and it's no different from any other bug; like any other bug, it's outside the scope of dependency specifications, since it's unintended.
Known relevant bugs in particular versions of dependencies are not outside the scope of what non-vgo dependency management solutions address.
Fixing the bug creates a new version. Unless you are going to create the mess of unpublishing packages or replacing packages with new, different ones under the same version identifier (both of which are problematic in a public package ecosystem), the fact that maintainers should fix bugs in published versions doesn't, at all, address the issue for downstream projects that is addressed by incompatibility declarations in a dependency management system, even before considering that downstream maintainers can't force upstream maintainers to fix bugs in the first place.
Overall, I am trying to be optimistic about the future of Go dependency management, but I am not planning to switch the projects I work on in my company from dep to Go modules until most of those rough edges are either smoothed, or officially recognised as "works as intended", with viable workarounds.
With that said, some non-negligible chunk of the issues you linked are cosmetic (better error messages), proposals, works-as-intended (docs could improve), or self-inflicted misconfigurations from running a Go beta. 53 sounds worse than it is.
I don't know why this had so much drama around it, to be honest. And yes, it is far from perfect; it will be approximately as crummy as python/ruby/node, which is still an improvement.
(all praise mwhudson for maintaining the Go snaps -- `snap info go` for the whole story).
It lists the channels and some versions, and has a short description; "This snap provides an assembler, compiler, linker, and compiled libraries for the Go programming language."
When you said "the whole story" I expected there to be some sort of story but I guess I might have misunderstood what you meant.
I'm not sure if I'd use it for deployment - but for development it's quite versatile.
It was not fun when VS Code (with the Go plugin) would automatically remove my imports every time I saved the file because it couldn’t find it.
./example.go:8:9: imported and not used: "net/http"
 I know it's not the official upstream remote, but I find this one the easiest to remember.
The rather late support for "out-of-tree" building might be caused by a conflict between groups pressing to move away from the $GOPATH approach (whose size kept growing along with Go's popularity) and the Go dev team / early adopters, especially since the $GOPATH approach is a central part of Go.
I've tried it. Can we now get real modules?
Go is a Google project, and Google has a very unique approach to package management: commit everything to the monorepo. The GOPATH is, in essence, a monorepo. If you want to change the API of a library, well, you can just change all its callers across your GOPATH, too. And so for a long time the Go team was unconvinced that package management was a problem.
For example, the Go FAQ still has this to say:
> How should I manage package versions using "go get"?
> "Go get" does not have any explicit concept of package versions. Versioning is a source of significant complexity, especially in large code bases, and we are unaware of any approach that works well at scale in a large enough variety of situations to be appropriate to force on all Go users...
> Packages intended for public use should try to maintain backwards compatibility as they evolve. The Go 1 compatibility guidelines are a good reference here: don't remove exported names, encourage tagged composite literals, and so on. If different functionality is required, add a new name instead of changing an old one. If a complete break is required, create a new package with a new import path.
It is true that if you write perfectly backwards compatible code, then you don't have a versioning problem, but if you think that's a viable solution you're ignoring certain realities of software engineering.
It wasn't until early last year that Russ Cox publicly declared that versioning was a problem and set out to introduce a package manager into the Go toolchain. As it turns out, GOPATH is entirely incompatible with the approach to package versioning that the Go team settled on. You simply can't have two versions of the same package in your GOPATH, unless you're willing to rename one and rewrite all the import paths. Given that public opinion had turned against GOPATH, it was finally time to do away with it.
So it took about a year and a half from the time the Go team admitted GOPATH was a problem to shipping a release that made it unnecessary. That's really not too bad. The frustrating part of this saga was the first seven years, during which the Go team refused to admit there was a problem at all.
My personal theory is that what made Russ Cox cave in was his discussions with Sam Boyer. Cox thought Boyer was going down the wrong path, and thought he had a better solution. Unfortunately, the Go community didn't seem to have read the discussions the two were having, because pretty much everyone thought Dep (Boyer's tool) was blessed by the Go team and was going to be the official package management tool. I can forgive the drama if the end result is a real, non-Google package management system, though.
(While I didn't appreciate the drama, I'm somewhat relieved Dep is not going to be the official solution. Dep is okay when it works, but inherits pretty much all the warts of Glide, which Boyer also worked on. Glide has been an absolute nightmare to work with. Dep is in fact worse than Glide in some respects -- due to weaknesses in its solver, it's completely incompatible with certain significant community packages such as the Kubernetes client. Of course, Dep is not yet 1.0, but I would not say things were looking that promising.)
That's certainly my understanding of the situation. Matt Farina has a great commented history of dep and vgo, if you haven't already seen it. The comments are particularly enlightening.
Still, it's not clear to me what made the Go team get into the package management game at all. As you say, for years they were happy to leave that as a community problem. But something spurred them to declare that Dep was an "official experiment."
> Dep is okay when it works, but inherits pretty much all the warts of Glide, which Boyer also worked on.
Funny, I've had exactly the opposite experience. Glide caused us plenty of trouble at CockroachDB, but Dep has worked flawlessly, if slowly. I've also found Sam to be exceptionally friendly and responsive to feedback.
> The issues pale in comparison to the horribleness of Glide, but it's interesting just how these tools end up being so damn flaky.
That's the crux of it, isn't it? The dozens of Go package managers that have come and gone over the years have provided us with substantial evidence that building a stable package manager requires several years of development. I think that's why I'm frustrated that the Go team hit the reset button again. Dep has accumulated plenty of bug fixes over the years to handle more and more of these edge cases, but vgo had to start from scratch.
On the bright side, vgo essentially can't fail.
I have numerous thoughts about developments like this one. Mostly, that I've seen a similar situation happen in numerous communities already, Go being neither the first nor the last: the steering committee has the last say, and may have different taste. I've learnt to accept that their choice usually does have merit and usually actually turns out for the better. It requires a lot of humbleness, and sometimes gritting one's teeth, learning to let go of hurt feelings, and accepting that someone else may have reasons you still need to grow to understand.

Personally, my own view is that for Sam, this was probably the first time something like this happened, and he wasn't prepared for the hit. I agree those never stop tasting bitter, given the work one has put into a project of this kind: good will and hard contribution being de facto rejected in the end. A child being "lost". But that's not the whole truth, because in this case the child is reborn, though in a somewhat different shape. The experiment has served its purpose and brought a lot of value, a significant contribution.

On the other hand, I do sometimes wonder: can such situations and misunderstandings be avoided somehow? Or is the world just not perfect enough? And by the way, I also think that Russ was actually taken by surprise by the extent of the reaction. I suppose that's why it took him so long to react, which let the situation and the complaints get somewhat louder than necessary.
But that's just my personal opinion, one of many in this somewhat unfortunate situation. I also wanted to let off some steam in the end, as I'm growing more and more tired of the recurring claims that "everybody is surprised". On the contrary, I'm personally one of the people looking forward to vgo, and strongly unconvinced by what dep has become.
The Dep situation is very similar to that of Eric S. Raymond's attempt at replacing the Linux kernel's config tool. Instead of presenting a design proposal and discussing it in public, he pretty much finished the project on his own, perhaps thinking that a working version would lead to adoption by users and thus forcing the kernel team to accept something users liked. Or perhaps he assumed he had clout in the Linux community, which of course he didn't. Either way, this kind of brute-forcing just leads to wasted work and resentment.
That's fine and works perfectly well in the appropriate environment (e.g. large structured work environment with processes etc etc), but for my personal work I prefer to just checkout wherever the hell I want and go from there. Really looking forward to module support so I can use golang for some personal small-scale projects easily without having to go through a lot of the ceremony of setting go up on say a raspberry pi - just checkout and go (no pun intended) will be a breath of fresh air.
You are right that it is solvable. And in many other languages the package management etc have grown independently of the core language itself, from the community. So the attacks on the golang team seem a bit strange. Maybe they thought that these solutions would come from outside the core language team.
In fact, the one main reason I'm looking forward to this release is because of the elimination of GOPATH. It'll make my day-to-day operations at work FAR easier.
It's true that it encourages working with all the code in GOPATH. This is a good thing. Your GOPATH is a view of the whole Go ecosystem. You fix a bug where it makes sense and it is picked up by all users. Sadly, vendoring already messed this up.
I think it's insanity when every program has a different idea of what code a given import path refers to (like is often the case with project-based package managers and vendoring as well). It's no fun to juggle the version differences in your head while working on multiple projects.
Go modules have some good ideas here. Semantic import versioning hopefully reduces the number of different versions you have to consider.
Doing the version selection not per-project but globally for all of GOPATH should still result in a working (but not necessarily reproducible or high-fidelity) build. It definitely reduces the amount of code you need to deal with.
Modules and vendoring
When using modules, the go command completely ignores vendor directories.
By default, the go command satisfies dependencies by downloading modules
from their sources and using those downloaded copies (after verification,
as described in the previous section). To allow interoperation with older
versions of Go, or to ensure that all files used for a build are stored
together in a single file tree, 'go mod -vendor' creates a directory named
vendor in the root directory of the main module and stores there all the
packages from dependency modules that are needed to support builds and
tests of packages in the main module.
To build using the main module's top-level vendor directory to satisfy
dependencies (disabling use of the usual network sources and local
caches), use 'go build -getmode=vendor'. Note that only the main module's
top-level vendor directory is used; vendor directories in other locations
are still ignored.
OK, so that makes it sound like it has nothing to do with GOPATH.
But interesting (annoying?) that vendor won't kick in unless you specifically ask for it.
> Very nice, go build ignored the vendor/ folder in this repository (because we’re outside $GOPATH)
> Oddly these are stored in $HOME/go/src/mod not the $GOCACHE variable that was added in Go 1.10
Maybe Go 2.0 will be stable.
Or you're using the isolated-GOPATH trick.
$ mkdir -p .gopath/src/github.com/foo
$ ln -s ../../../.. .gopath/src/github.com/foo/bar
$ GOPATH=$PWD/.gopath go install github.com/foo/bar