Go does not exist to raise the profiles of Sam Boyer and Peter Bourgon. Sam wanted to be a Big Man On Campus in the Go community and had to learn the hard way what the D in BDFL means. The state of dep is the same as it was before - an optional tool you might use or might not.
Lots of mentions in Peter's post about things the "dep committee" may or may not have agreed with. Isn't this the same appeal to authority he is throwing at Russ? When did the "dep committee" become the gatekeepers of Go dependency discussions and solutions? Looks like a self-elected shadow government, except it didn't have a "commit bit". Someone should have burst their balloon earlier, that is the only fault here. Russ, you are at fault for leading these people on.
Go is better off with Russ's module work and I personally don't care if Sam and Peter are disgruntled.
Ya, I've been following all this drama since the vgo proposal and I also find it somewhat rich. I mean in a way didn't the `dep` folks do exactly the same thing to the `glide` and other existing solutions? At some point you just have to say, hey, you've done good work but I think another approach is better. They did that to glide and Russ did that to them.
Note also that he doesn't dispute that Russ tried to bring them into the fold with his concerns and direction but they rejected his arguments. That's a fight they should have known they were going to lose.
Maintainers / BDFLs get the final say; that's just how it is. Just because there is a large community does not mean the community gets the final say, and just because you put in a lot of work does not mean you get the final say. The reason BDFLs are in the position of making that decision is because they have proven through the accumulation of their previous decisions that they get it right more than they get it wrong. And that's the very reason there is a community around them at all.
If you walk into ANY project, public or private, open sourced or closed, and ignore the lead's objections when you go off in a direction they don't agree with, then yes, you are very likely in for some wasted work. That's just how the world works.
On another note, I hope this is the last we hear about all this; please, let's move on.
> Ya, I've been following all this drama since the vgo proposal and I also find it somewhat rich... On another note I hope this is the last we hear about all this, please let's move on.
I only became aware of this by going to a meetup and seeing a talk. I just want something that's workable and not broken.
> The reason BDFLs are in the position of making that decision is because they have proven through the accumulation of their previous decisions that they get it right more than they get it wrong. And that's the very reason there is a community around them at all.
This is the same reason given for totalitarian rule generally. The problem is that the dictator has enough power to make a big mistake, committing the entire community's resources. In this position, if you can avoid making a decision, you should. Toyota doesn't simply make such decisions by fiat. Instead, they have a competition.
> I mean in a way didn't the `dep` folks do exactly the same thing to the `glide` and other existing solutions?
Note that Russ isn't going against the community, he's just going against `dep`. Most of the community, myself included, is pretty psyched about go modules.
Note also that the risk you propose for bad decisions is mitigated for Open Source. If that decision really turns out to be so bad then it will be forked and a wiser leader(s) will have their shot at decision making.
`dep` isn't squashed! You are free to continue using it; it just likely won't find much of an audience anymore.
> If that decision really turns out to be so bad then it will be forked and a wiser leader(s) will have their shot at decision making.
No, that doesn't work in practice. The network effects are so insanely dominant in open source projects that forks are nowhere near an efficient market.
That's survivorship bias. Yes, occasionally a fork takes over, but it very rarely succeeds, and usually only in cases where the original is so toxic that it cancels out its own network effects. node.js is a good example of that. And even there, note how they eventually unforked.
It is really uncharitable to describe the node fork as "so toxic". People disagreed, but there's nothing wrong with that. I felt like the subsequent "merge" happened fairly quickly and with minimal drama.
Also this seems like a misuse of the term "survivorship bias". You claimed upthread that forks don't work because of network effects. The response was "some forks work". That is a direct refutation, no matter what the percentages are. Besides, you're misunderstanding the varied purposes of forks. I have several forks right now, not because I hate the original maintainers but because it was convenient to change a small thing for my own purposes. Whether or not upstream eventually agrees with me, my fork "works" perfectly well.
> Most of the community, myself included, is pretty psyched about go modules.
Citation needed. In my corner of the community this is a very contentious issue.
The golang dependency management story has been a disaster for years. Nothing about the modules or vgo story has seemed to solve that.
In fact, it's Russ who seems to be going against the norms of the community, choosing cutting-edge and untried technologies over well-understood and tested ones. If he was going to get edgy with his leadership decisions, couldn't he have done it somewhere more valuable, like cleaning up the type system?
The only curious element here is that a (the?) judge also had the winning entry. Russ claims he got buy-in from the other core members and so far no one has contradicted that.
AND...vgo is better than dep. Let's not forget that. Peter can hem and haw about politics but in the end the best horse won.
Go developers want the tools to be as good as possible, 99% of them don't know anyone who contributes to Go by name so this drama is irrelevant to them.
> AND...vgo is better than dep. Let's not forget that. Peter can hem and haw about politics but in the end the best horse won.
Nobody has worked with minimum version selection and compared it to the traditional approach enough to know. I have serious reservations as to how it will work in practice.
After reading TFA, which tries to be self-serving but is mostly just silly, do you even doubt it? I say this as someone with no dog in any Go fight. When the defendant's own testimony convicts himself, he's guilty.
> This is the same reason given for totalitarian rule generally. The problem is that the dictator has enough power to make a big mistake, committing the entire community's resources. In this position, if you can avoid making a decision, you should. Toyota doesn't simply make such decisions by fiat. Instead, they have a competition.
You're comparing very different things. Programming languages and empires/nations/companies aren't the same thing. Rails has stayed on the course DHH wants it on and hasn't become a mess mainly because DHH remains the BDFL. Go could benefit from similar structure. We're now about to see how Python will do post-Guido.
> Rails has stayed on the course DHH wants it on and hasn't become a mess mainly because DHH remains the BDFL.
* and DHH happens to make good decisions frequently enough.
For every successful BDFL example, history is littered with a hundred dead languages and frameworks designed by a single visionary leader who made the wrong decisions.
> a hundred dead languages and frameworks designed by a single visionary leader who made the wrong decisions
E.g. there's the namesakes for Rails and Ruby, i.e. Grails and Groovy. Virtually no-one's upgraded to Grails version 3 since it came out 3 yrs ago, or started new projects in Grails version 2. The Grails 2 plugin ecosystem is as good as dead. It only has its "single visionary leader" listed for the 3 contact persons (owner, admin, tech) in the grails.org DNS registration.
As for Apache Groovy, it's hanging on as the build language for Gradle and Android Studio, but doesn't seem to have any other significant use besides its original use case of glue code and testing harnesses. Groovy's problem is that its creator, who had successfully added closure functionality to a clone of Beanshell, left the project after 3 yrs, and the "despot" at Codehaus who subsequently claimed the title Project Manager was someone who didn't have the aptitude for many programming tasks.
The only meaningful example of this I can think of is Larry Wall and Perl... but his mistake was in loosening his grip instead of tightening it. Perl6 started off as a utopia wherein Larry was only considered slightly more powerful than any other contributor. This resulted in a revolving door of clowns taking the reins and rapidly resigning as Larry looked on.
Fortunately Perl5 still does the Pumpking thing which is effectively a rotating BDFL.
> This resulted in a revolving door of clowns taking the reins and rapidly resigning as Larry looked on.
I would say that this hasn't been the case for at least the past 10 years. Patrick Michaud was Perl 6 pumpking for most of that time, making room for Jonathan Worthington about 2 years ago. Hardly a revolving door, and hardly clowns.
Parrot did burn through quite a few pumpkings. I wouldn't call them clowns either, but I do think they led Parrot astray. Obviously Parrot is not Perl 6, but they are closely linked in many people's minds for obvious reasons.
(It's not all the pumpkings' fault, either. I think Parrot would still have been doomed if its leadership were constantly flawless, just because it was designed before Perl 6's object system was.)
> Russ failed to convince the committee that it was necessary, and we therefore didn’t outright agree to modifying dep to leverage/enforce it.
and
> The community of contributors to dep were ready, willing, able, and eager to evolve it to satisfy whatever was necessary for go command integration.
If you're willing to say "Sorry BDFL, you didn't convince us this was necessary so we're not doing it," then you're not willing to do whatever is necessary; BDFL objections are necessities. Peter and Sam just found out what happens when an unstoppable force meets a quite movable object.
I agree with the sentiment that Russ's only mistake here was leading them on for too long. I see this as an argument in favor of Linus' style of BDFLing, in this case he would have cussed them out over their design's flaws and told them in no uncertain terms that it was never getting merged and probably that they were stupid. It would have hurt more at the time, but also probably would have saved them a ton of time and, in the long run, might have even hurt less.
I've interacted quite a bit with both Sam and Peter. Your characterization rings completely false. I've never gotten the least hint of the impression they were doing this for self-promotion, but rather to fill an obvious vacuum, perhaps even with a degree of reluctance.
I remember being excited when the “dep committee” was formed, because (if I remember correctly) it was set up with the explicit blessing of the Go Team, with a Go Team Member on it to facilitate two-way communication.
Your characterizations are both inaccurate and ugly.
I want vgo to succeed and I like most of your contributions, but comments like this and the original post (“Sam just wants to be the big man on campus”) are petty and rude. You’re better than this.
Additionally, dep is, and always was, meant as an experiment to learn from. Russ gave multiple concrete examples of things he learned from dep. That the dep authors don't take this as a win is telling. Pitching dep as the eventual Go dependency management system and having that not be the case is a problem they invented for themselves.
> dep is, and always was, meant as an experiment to learn from.
My impression is that the dep folks understand that too. The problem is that there is no consensus on what was learned from it.
The dep folks seem to have come away convinced that a SAT-solver approach is the better approach. rsc is clearly convinced of the opposite.
Everyone knows it is ultimately rsc's call, so I don't think talking about the power dynamics is very interesting. What I am more interested in is whether or not it's the right call. A good faith interpretation is that the dep folks aren't sad that their solution lost, it's that what they believe is a better solution lost.
I'll present a counterpoint of a different kind, since everyone is arguing about what current package managers do and don't do.
Satisfiability problems of this kind appear in a ridiculous number of fields and applications
(and not just by reduction).
The vast majority of them, in practice, are approximated rather than exactly solved.
Most of the ones that are exactly solved are in software verification, model checking, etc. Areas where having an exact answer is very critical.
Outside of that, much like you see in MVS, they approximate or use heuristics. And it's fine. You don't notice or care.
The idea that "package management" is one of those areas that absolutely must be exactly solved to generate acceptable results seems to me to be ... probably wrong.
There are much more critical and harder things we've been approximating for years and everyone is just fine with it.
(i.e., not clamoring for faster exact solvers).
Thus I have trouble saying a sat solver is a better solution.
It certainly would be a more "standard" one in this particular instance, but that's mostly irrelevant.
It's also a very complex one that often fails in interesting ways in both this, and other, domains.
> The idea that "package management" is one of those areas that absolutely must be exactly solved to generate acceptable results seems to me to be ... probably wrong.
The logical conclusion of your statement is that minimal version selection is the wrong approach! Minimal version selection is an "exact" solution, in contrast to the traditional one. You would only arrive at MVS if you considered the problem of selecting precise dependencies to be so important that it's worth making it the user's problem instead of having the tool solve it. The philosophy of the traditional package management solution is that it's best to have the tool do the right thing--select most recent versions that satisfy constraints--so that the user is free to worry about more important things.
> There are much more critical and harder things we've been approximating for years and everyone is just fine with it.
Yes! That's why MVS, and by extension vgo, is oriented around solving a non-problem!
> It's also a very complex one that often fails in interesting ways in both this, and other, domains.
I cannot name one example of a single time SAT solving has failed in Cargo.
I feel like you are really stretching here. You took a bunch of words out of context so you could parse them.
> Minimal version selection is an "exact" solution, in contrast to the traditional one.
It's not an exact solution to SAT, it's an exact solution to a simpler problem than SAT (2-SAT). A problem that admits linear time solutions, even.
That is in fact, what a lot of approximations actually are - reduction of the problem to a simpler problem + exact solving of the simpler problem.
Some are heuristic non-optimal solvers of course, but some are not.
Certainly you realize the complexity and other differences between "an exact solver for SAT" and "an approximation of a SAT problem as a 2-SAT problem + an exact solver for 2-SAT"
I can write a linear time 2-SAT solver in about 100 lines of code and prove its correctness. It's even a nice, standard, strongly connected component based solver.
So if I have an SCC finder implemented somewhere, it's like 20 lines of code.
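For the curious, the construction being described can be sketched concretely. This is a generic implication-graph 2-SAT decider using Kosaraju's SCC algorithm, written for illustration and not taken from any Go toolchain code; the literal encoding (2*v for x_v, 2*v+1 for its negation) is the standard one.

```go
package main

import "fmt"

// twoSAT decides satisfiability of a 2-CNF formula over n variables
// using the classic implication-graph construction plus Kosaraju's
// SCC algorithm. Literal encoding: 2*v is x_v, 2*v+1 is its negation,
// so lit^1 flips a literal.
func twoSAT(n int, clauses [][2]int) bool {
	m := 2 * n
	adj := make([][]int, m)
	radj := make([][]int, m)
	addEdge := func(u, v int) {
		adj[u] = append(adj[u], v)
		radj[v] = append(radj[v], u)
	}
	for _, c := range clauses {
		// A clause (a OR b) contributes implications (NOT a -> b) and (NOT b -> a).
		addEdge(c[0]^1, c[1])
		addEdge(c[1]^1, c[0])
	}

	// Pass 1: order vertices by DFS finish time.
	order := make([]int, 0, m)
	seen := make([]bool, m)
	var dfs1 func(u int)
	dfs1 = func(u int) {
		seen[u] = true
		for _, v := range adj[u] {
			if !seen[v] {
				dfs1(v)
			}
		}
		order = append(order, u)
	}
	for u := 0; u < m; u++ {
		if !seen[u] {
			dfs1(u)
		}
	}

	// Pass 2: label SCCs on the reversed graph, in reverse finish order.
	comp := make([]int, m)
	for i := range comp {
		comp[i] = -1
	}
	var dfs2 func(u, c int)
	dfs2 = func(u, c int) {
		comp[u] = c
		for _, v := range radj[u] {
			if comp[v] == -1 {
				dfs2(v, c)
			}
		}
	}
	next := 0
	for i := m - 1; i >= 0; i-- {
		if u := order[i]; comp[u] == -1 {
			dfs2(u, next)
			next++
		}
	}

	// Unsatisfiable iff some variable and its negation share an SCC.
	for v := 0; v < n; v++ {
		if comp[2*v] == comp[2*v+1] {
			return false
		}
	}
	return true
}

func main() {
	// (x0 OR x1) AND (NOT x0 OR x1): satisfiable (set x1 = true).
	fmt.Println(twoSAT(2, [][2]int{{0, 2}, {1, 2}}))
	// (x0) AND (NOT x0), written as (x0 OR x0) AND (NOT x0 OR NOT x0): unsatisfiable.
	fmt.Println(twoSAT(1, [][2]int{{0, 0}, {1, 1}}))
}
```

The point of the "20 lines" remark holds up: once the SCC machinery exists, the 2-SAT-specific part is just the edge construction and the final same-component check.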
Past this, your argument about "taking the user's time" is so general you could apply it to literally any problem in any domain. You can just plug in whatever domain you like and whatever solution you happen to like into this argument.
Here it's backed by no data - you have surfaced zero evidence of your premise - "that it is taking user time". This entire thread in fact has exactly no evidence that it's taking any appreciable amount of user time, so it definitely fails as an argument.
(in fact, the only evidence presented in this thread is that the algorithm simply works on existing packages)
If you actually have such evidence, great; I'm 100% sure that go folks would love to see it!
Because right now the main time spent, in fact, seems to be people arguing in threads like these.
> Certainly you realize the complexity and other differences between "an exact solver for SAT" and "an approximation of a SAT problem as a 2-SAT problem + an exact solver for 2-SAT"
I'm saying that the theoretical complexity of the core dependency resolution algorithm is irrelevant in practice. Therefore, removing useful features to reduce SAT to 2-SAT is not a good trade. I, and everyone else who has worked with Cargo, keep saying this, but nobody listens. :(
> Here it's backed by no data - you have surfaced zero evidence of your premise - "that it is taking user time". This entire thread in fact has exactly no evidence that it's taking any appreciable amount of user time, so it definitely fails as an argument.
Minimum version selection makes it the user's problem to fetch the newest version of dependencies. That is the entire premise of minimum version selection. If you want to upgrade your versions, you have to use "go get -u". That command blindly updates all minor versions of packages. The problem arises when you have some packages that did not follow the semver rules (or are on 0.x) and you need to hold them back to avoid breaking your build. That is when the more fine-grained version control that systems like Cargo support becomes essential. Vgo has unfortunately decided to omit that support in favor of some theoretical benefits that make no difference in practice.
> If you actually have such evidence, great, i'm 100% sure that go folks would love to see it!
The Go team has the evidence in that every other package manager uses maximal version selection instead of minimal version selection, because of the problems with minimal version selection.
I strongly suspect that the problems in minimal version selection will become apparent over the years as people hit the limitations, at which point it will become apparent that Go made a mistake, but it will be difficult to fix. In particular, I think that, several years down the line, there's a good chance that running "go get -u" in large software projects is going to result in a broken build, because people are imperfect and don't perfectly follow semver. So people just won't upgrade their packages very often.
In a large project you won't run go get -u for all packages; you'll upgrade each package separately, to the maximum version or to a specified one. It's just that you, the user of the modules, choose what to upgrade and when, rather than the tool doing it automagically.
I think there's empirical evidence that the SAT-solver approach is not necessary. I have done an analysis on as many Gopkg.{lock,toml} files as I could find, and in no instance did it ever do any non-trivial version selection: the maximal available version at the time was always selected. Additionally, Russ has stated that ~93% of the top 1000 Go packages in the wild build successfully with no changes. I appreciate that they may be convinced, but I think they need to ask themselves what evidence would change their mind. They have had months, coming up on a year, to figure this out.
The more interesting question for me is what evidence would change Russ's mind?
The package managers for many successful languages and distributions use lockfiles and constraint solvers. Not only is that empirical evidence that it works technically, it is evidence that it works socially — users are able to understand and work with it, and the package ecosystems for those languages have evolved with those rules in place.
Empirical data from Go's own package ecosystem is useful too, but you can only learn so much about package management from a corpus that does not have sophisticated package management. The ecosystem has already learned to work within the restrictions so you'll mostly see packages that confirm the system's own biases.
It's like encountering passers-by on a bike trail and concluding that the only vehicles users need are bikes.
I'm not saying vgo isn't better. But it's an unproven approach where lockfiles and constraint solving are proven, multiple times over. The burden of proof lies on vgo.
It's clear that a SAT solver is strictly stronger than the MVS approach. In other words, any MVS selection can be encoded in a SAT solver. The argument is for a reduction in power. Thus, in order to convince someone that a SAT solver is preferred over MVS, you must show examples where it succeeds when MVS fails, and the extra power is necessary. It's trivial to contrive these situations, but finding them in practice seems harder. Evidence of that happening would help change my mind, and I'd hope would help change Russ's mind.
The empirical data from Go's package ecosystem is drawn from a corpus with sophisticated package management: dep. The argument is that dep is unnecessarily powerful and that a simpler approach will suffice. The evidence supports that argument. Note that this is not an argument about Cargo, or Bundler, or any thing else. Right now, the ecosystem is using dep, and there is evidence that it can be done simpler.
To stick with your analogy, I think it's fair to conclude that the only vehicles users need on bike trails are bikes.
Additionally, I have done an analysis of two Rust projects that have been brought up in my discussions on this issue. Specifically, LALRPOP and exa. In both cases, throughout the entire history of the project (hundreds of changes over 4-5 years), Cargo only had to select the largest semver compatible version [1]. Again, I would love to find examples of projects where this strategy was not sufficient.
[1] There is one complication: in Cargo, a ^ constraint (the default kind) on a v0 dependency is only good up to the minor version. In other words, ^0.1.0 means >=0.1.0 and <0.2.0, where ^1.0.0 means >=1.0.0 and <2.0.0. Selecting the largest semver compatible version is meant in this way because of the community norms around breakage in v0. In an MVS world, any breaking change is a major version bump, and would have the same properties, but with different version strings.
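The caret rule in that footnote can be stated as a tiny function: a caret constraint permits any update that doesn't change the left-most non-zero component of the version. This is a simplified sketch of the documented rule (the function name is invented for illustration), not Cargo's actual resolver code:

```go
package main

import "fmt"

// caretUpper returns the exclusive upper bound implied by a Cargo-style
// caret constraint on major.minor.patch: compatibility extends up to,
// but not including, a bump of the left-most non-zero component.
func caretUpper(major, minor, patch int) (int, int, int) {
	switch {
	case major > 0:
		return major + 1, 0, 0 // ^1.2.3 allows >=1.2.3, <2.0.0
	case minor > 0:
		return 0, minor + 1, 0 // ^0.1.0 allows >=0.1.0, <0.2.0
	default:
		return 0, 0, patch + 1 // ^0.0.3 allows >=0.0.3, <0.0.4
	}
}

func main() {
	fmt.Println(caretUpper(1, 0, 0)) // 2 0 0
	fmt.Println(caretUpper(0, 1, 0)) // 0 2 0
}
```

This is why "largest semver compatible version" means something narrower for v0 crates: the compatibility window shrinks to a single minor (or even patch) series.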
I feel like this argument is focusing on the wrong thing. It doesn't matter if the entire corpus of Go packages has trivial version requirements that don't need a solver to resolve. What seems like a much more important issue is the fact that MVS literally picks different versions of dependencies. Specifically, it picks the oldest satisfiable version rather than the newest. And while this simplifies the algorithm, it also has the consequence that you don't get any bugfixes to packages if you haven't explicitly requested the version that includes the bugfixes.
One of the main benefits of semantic versioning is that you can upgrade packages to new minor and patchlevel versions without breaking backwards compatibility, thus allowing you to easily pick up bugfixes by simply updating your dependencies. Of course, you should still test after updating, since packages can introduce new bugs, but on the whole updating minor and patchlevel versions is far more likely to fix bugs than to introduce them. But MVS discards this benefit and says you cannot get bugfixes unless you're willing to manually edit your dependency list to declare that you want the newer package.

The net result is that packages that use vgo are likely to end up stuck on old versions of dependencies. This is especially true for indirect dependencies. If I publish a library, my incentive is to declare the oldest version of my own dependencies that I'm compatible with, in order to give my library's client the most control over dependency versions. But if my library's client doesn't know about my own dependencies, then my library's dependencies will almost certainly resolve to a really old version, and my client won't even know about it and so won't be in a position to request the newer, less buggy version. Which then means I have an incentive to instead constantly update my library to list the newest versions of my dependencies, which forces my library's client to upgrade those dependencies, even if they would prefer to be conservative, simply because they need to upgrade my library.
The first package installer I was aware of that used a SAT solver was SUSE's, back when Yum's solver was extremely primitive and would regularly fail to find a solution.
I think using a SAT solver for package installation arose out of dealing with much more complex requirements than are likely to arise in a Go project. The Smart package installer used heuristics to find a solution depending upon the operation.
FWIW, I wrote a package installer that worked with RPMs, Solaris packages and AIX packages about 15 years ago and ended up with a minimal version selection similar to vgo. I wasn't a genius or anything... I just wasn't aware of SAT solvers at the time and it was the simplest thing that worked.
It also ignores rsc's clear algorithmic preferences. No, RE2 does not support back-references, because they require a back-tracking implementation with exponential worst-case behavior. RE2 uses an NFA with O(nm) worst-case behavior. It's a classical Unix approach where a simple implementation is preferable to having more features, especially if there are algorithmic considerations.
Throwing away dep means also throwing away the nontechnical groundwork that dep was built on - for instance, user research. Seems pretty careless to me, especially when the alternative is the product of one person's thinking on the subject done in a vacuum without the input of an entire committee of smart, reasonable people who have literally spent years diving into this specific domain.
As a user of dep, glide, godep and avid reader of vgo technical docs I favor vgo's solution. I've been a professional Go developer a few years now.
The notion that this committee speaks for the community seems a weak one to me, and I've seen my view mirrored by many of my peers. It's good they attempted such research, but it seems like confirmation bias: dep just seemed like a rewrite of glide with similar fundamentals and an improved user experience. This appeal to authority by the committee to represent the Go community seems unproductive and unnecessarily divisive/misleading.
I wasn't thrilled when I saw Sam elected to lead the implementation of the "official experiment", because I wasn't a fan of Glide at all. There was no community vote to oversee this; one maintainer of one flavor of Go package management was declared the expert and just rewrote the existing Glide solution with some lessons learned. Many other maintainers (experts) of other Go package management solutions favor vgo.
I've navigated many thorny dependency problems in Go before, and never have I been convinced that the solution was NP-complete version constraints. MVS, good tooling, and finally SIV are enough to make this miles better than previous attempts in this space. Also, hooray for GOPATH elimination and project-based workflows; I've always loved GB by Dave Cheney.
I have exactly the same feeling. And I remember that at the start of dep there was no consensus, and a big hope that the final integrated solution would be more Go-ish than Glide. I was very surprised by how dep began. I don't believe that rsc and the Go team didn't know from the beginning that it would not fit sooner or later...
Why do you think vgo wasn't built on the nontechnical research carried out by dep? In fact, that's normally what I would assume would be the purpose of something branded an "official experiment" - the code will for certain be thrown away eventually, but the experiences gained will be used to create the actual final product.
> without the input of an entire committee of smart, reasonable people who have literally spent years diving into this specific domain.
If that is the case, why wasn't the solution developed outside the Go team's ambit already? After all, the Go team said multiple times they don't need a module system, as Google does not use one.
> especially when the alternative is the product of one person's thinking on the subject done in a vacuum without the input of an entire committee of smart, reasonable people who have literally spent years diving into this specific domain.
Even this very article contradicts your assertion about how it was done.
From this account, it sounds like “the community” was encouraged by “the core team” to work on, meet, discuss an idea and present work to “the core team” that was then ignored.
Would you really expect those people to be excited about the outcome?