
The Principles of Versioning in Go - crawshaw
https://research.swtch.com/vgo-principles
======
nyc640
I understand why this rubs some people the wrong way, but it really is
consistent with Go's vision since the start. I don't think Russ is trying to
imply that the issues they are facing when designing the language (dependency
management, generics, etc.) are unique or impossible to solve in any way. I
think he is simply saying that they are complex problems whose solutions
introduce complexity to the language and ecosystem, and when presented with
such a trade-off, the Go team has always preferred to lean towards simplicity,
even if it means not implementing a feature that some (or even many)
developers want/need.

While I don't personally agree with many of the decisions they've made, I
still respect those choices, and I think there is plenty of room for
languages with competing philosophies. Because of Go's philosophy of not
catering to every demand, it's never going to become the next Java or take
over all of Software Engineering, but I think the Go team is fine with that as
long as it continues to be useful as a tool for solving problems.

~~~
ilovecaching
C is a simple language and yet it constitutes the guts of most computers on
this planet and Mars. Simple works.

Go is really close to C. In fact, Rob Pike's Go CLI programs on his GitHub
look pretty much like standard C programs using getopt. Interfaces are just
a nicer virtual function table. interface{} is just void *. In that sense,
it's already treading ground that is proven to work. Go didn't have to do
anything revolutionary because C was already great.

I hope the Go team will continue to embody what made C great so the next
generation can receive the lessons it's taught us.

~~~
pjmlp
Thanks to UNIX adoption.

C is the JavaScript of systems programming.

Had it not been for UNIX workstations and universities adopting it, we would
probably be using BLISS, PL/I or something else instead and not bleeding billions
of dollars per year fixing security exploits.

~~~
sacado2
Maybe we would be using Oberon. But then, Go owes as much to Oberon as it owes
to C.

~~~
pjmlp
Oberon was good for its time back in the early '90s, but it's too little for
modern computing.

We would be better served with Modula-3 or Active Oberon, which Go failed to
learn from.

------
Groxx
A very large issue with go tooling around dependency management is that it's
based around concepts like this:

> _If an old package and a new package have the same import path, the new
> package must be backwards compatible with the old package._

Which is fine conceptually. Every dependency system also supports it: you just
make a new project named "thing-v2" instead of bumping a major
version. Making it explicit and encouraging it when possible (instead of
forcing breaking changes) is good.
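
For concreteness, this is roughly what that looks like under Go's semantic
import versioning (a minimal sketch; `example.com/thing` and `Greet` are
hypothetical). The major version is baked into the import path, so the v2
module declares `module example.com/thing/v2` in its go.mod, and a consumer
can even hold both major versions in one build:

```go
package main

import (
	"fmt"

	thing "example.com/thing"      // v1 line: the old API
	thingv2 "example.com/thing/v2" // v2 line: the incompatible API
)

func main() {
	// Hypothetical API; the point is that call sites can be migrated
	// from v1 to v2 one at a time instead of all at once.
	fmt.Println(thing.Greet("old"), thingv2.Greet("new"))
}
```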

The problem is that there's absolutely nothing to help you achieve that. Or
detect failures to achieve it. Or anything. You can't even reliably rename _a
single package_ with existing refactoring tools; good luck copying and
refactoring most of an entire repository correctly. If you do it wrong, you
might have to do it all over again because it's probably another breaking
change.

Without fairly strong tooling to push people towards correct behavior, "every
change is breaking" is _the default behavior_. And the frequent accidental
behavior even if you're being careful.

~~~
ainar-g
> The problem is that there's absolutely nothing to help you achieve that. (…)

There were talks about "go mod release" from the very beginning, but you are
currently correct here.

> (…) You can't even reliably rename a single package with existing
> refactoring tools. (…)

Strictly false. There is gomvpkg[1]. The absolutely ironic thing is that it's
still stuck in the GOPATH mode, despite being a part of the official x/tools
repo, but it's not “absolutely nothing”.

[1]
[https://godoc.org/golang.org/x/tools/cmd/gomvpkg](https://godoc.org/golang.org/x/tools/cmd/gomvpkg)

> Without fairly strong tooling to push people towards correct behavior,
> "every change is breaking" is _the default behavior._ And the frequent
> accidental behavior even if you're being careful.

I agree. Honestly, the slow progress in moving the community's favourite
tools, including gomvpkg, to work in the new ecosystem is a massive failure on
the Go authors' side. That just means that they haven't made the move easy and
simple enough.

~~~
Groxx
> _Strictly false. There is gomvpkg[1]. The absolutely ironic thing is that
> it's still stuck in the GOPATH mode, despite being a part of the official
> x/tools repo, but it's not “absolutely nothing”._

IMO renaming packages is a necessary prerequisite, but otherwise orthogonal to
handling breaking changes. Personally I don't include it in the "something
that could help" group.

Either way, it's _wildly_ unreliable in my experience, so that point stands.

~~~
ainar-g
I've used gomvpkg several times during refactorings and rewrites of orphaned
code and never had an issue with it (in GOPATH mode that is). But oh well,
YMMV, I guess.

~~~
Groxx
By "in GOPATH mode" do you mean it only looks at your GOPATH, not vendor? If
so, that could be (part of) the issue - my GOPATH is definitely not up to date
with whatever project I'm working on.

Also that'd mean it's a few years behind even basics like the vendor folder,
so I'd be comfortable putting it in an "abandonware, effectively does not
exist" category.

~~~
ainar-g
I've never had the need to rename anything in vendor/, so I can't really
answer here. I've only renamed stuff in my project's internal/. Can I ask why
you felt the need to rename something in your vendor/? It feels
completely unnecessary and counter-intuitive to me.

~~~
Groxx
Sorry, I meant "does it only try to use my GOPATH for imports while type-
checking my source". My dependencies are in ./vendor, so ignoring it would
lead to lots of problems.

Renaming stuff in vendor: almost never. It could be useful for handling
library updates though, if it worked and were possible: you could refactor it
to the v2 path, automatically updating your code, and then pull the new update
for real and be mostly done (for minor-but-breaking changes). I've done
similar things in other systems for library updates, since there's usually
zero automated migration help for library _users_, only _authors_, despite
users greatly outnumbering the authors.
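
For what it's worth, the mechanical half of that is scriptable. Here is a
minimal sketch of an import-path rewriter built on go/ast and x/tools (the
file name and module paths are hypothetical; a real tool would walk the whole
tree):

```go
package main

import (
	"go/format"
	"go/parser"
	"go/token"
	"log"
	"os"

	"golang.org/x/tools/go/ast/astutil"
)

func main() {
	const file = "main.go" // hypothetical: run this over every file in the repo
	fset := token.NewFileSet()
	f, err := parser.ParseFile(fset, file, nil, parser.ParseComments)
	if err != nil {
		log.Fatal(err)
	}
	// Point the old import path at the new major version.
	if astutil.RewriteImport(fset, f, "example.com/thing", "example.com/thing/v2") {
		out, err := os.Create(file)
		if err != nil {
			log.Fatal(err)
		}
		defer out.Close()
		if err := format.Node(out, fset, f); err != nil {
			log.Fatal(err)
		}
	}
}
```

The part no tool helps with is adapting the call sites to the changed API,
which is exactly the gap described above.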

~~~
ainar-g
Ah, I see. As far as I know it is vendor-directory-aware. At least it always
renamed my packages that imported stuff from vendor/ with no problems.

I don't deny that you could have issues with the tool, I simply have never
encountered any myself.

------
shadowgovt
Spending my time as of late in npm's particular flavor of DLL hell, I
appreciate the Go team taking a stab at this hard and persistent problem. I
agree with other commenters' observations that the tooling hasn't caught up
with the philosophy, but the philosophy seems on-point. And the key
observation holds: as the dependencies of a project approach infinity, the
odds of two branches of a dependency tree wanting to rely on two different
versions of the same library approach 100%, and there should be a sane way to
address that need. npm does a 99%-decent job of addressing this, but I've
definitely hit issues with npm where objects generated by a newer dependency
version have leaked into a codepath where an older version is calling the
shots, and it's an absolute hellscape to debug.

~~~
kortex
Npm seems to lack, in the context of the article, the _ecosystem_ and
_community_ aspect. It's all like herding cats. Case in point, leftpad.

> _as the dependencies of a project approach infinity, the odds of two
> branches of a dependency tree wanting to rely on two different versions of
> the same library approach 100%_

With experience, I've found that this is precisely the case, with bounds much,
much smaller than infinity. The ersatz standard lib can't even standardize;
there's underscore and lodash. I think eventually the community will Ostwald
ripen in a sense: small packages will dissolve by attrition and big ones will
grow. But the argument is strong for simply decreasing n = the number of discrete
packages.

------
pcwalton
The basic issue with this criticism of Cargo is that the fact that updating
dependencies can cause breakage _never actually goes away_. The package
management tool either makes this problem easy, or it makes it hard. The
approach Cargo chooses is to make "cargo update" more likely to work by
allowing library authors to specify incompatibilities. This can go wrong in
edge cases. But that's inherent to the problem itself. Minimum version
selection only seems to work because it doesn't actually solve the problem of
keeping dependencies up to date. The moment you want to do that, you're stuck
with "go get -u", which is just a worse version of "SAT solving" because it
doesn't know how to detect and avoid incompatibilities.
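
To make the contrast concrete, here is a toy sketch of what minimum version
selection computes (my own simplification, not the actual cmd/go code; the
real algorithm iterates to a fixpoint and honors only the requirements of the
versions finally selected). Note that there is no search and no backtracking:
each module gets the maximum of the minimum versions anything requires, and
nothing newer.

```go
package main

import "fmt"

// modVer is a module at a specific version. Versions are simplified
// to ints; real MVS compares semver strings.
type modVer struct {
	mod string
	ver int
}

// mvs computes a build list from a root requirement set. Unlike a
// SAT-style solver it never searches: the answer is simply the
// per-module maximum over the reachable minimum requirements.
func mvs(root []modVer, reqs map[modVer][]modVer) map[string]int {
	chosen := map[string]int{}
	var visit func(mv modVer)
	visit = func(mv modVer) {
		if chosen[mv.mod] >= mv.ver {
			return // already selected this version or newer
		}
		chosen[mv.mod] = mv.ver
		for _, dep := range reqs[mv] {
			visit(dep)
		}
	}
	for _, mv := range root {
		visit(mv)
	}
	return chosen
}

func main() {
	reqs := map[modVer][]modVer{
		{"A", 1}: {{"B", 2}, {"C", 1}},
		{"B", 2}: {{"C", 3}},
	}
	// A asks for C 1 but B asks for C 3, so C 3 is selected.
	fmt.Println(mvs([]modVer{{"A", 1}}, reqs)) // map[A:1 B:2 C:3]
}
```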

(As an aside, I don't like the term "SAT solving" for the problem of package
version selection. By focusing on an implementation detail, it makes the
problem seem scarier than it actually is. Register allocation is NP-hard too,
but nobody calls the register allocator "the SAT solver".)

~~~
sansnomme
I think calling it a SAT solver is appropriate here. A lot of package managers
were "organically grown", i.e. created in response to a problem that arose.
Not much thought was given to the underlying problem, and in the long term
users suffer, with e.g. dependencies taking forever to install and strange
deadlocks. Recognizing it as a SAT problem from the start means we can easily
throw Z3 or similar at it instead of manually adding heuristics every time a
user files a GitHub issue. (See Bundler and other pre-2017/2018
package-management tools for examples.) Also, it's not always a SAT problem:
for npm and Java classloaders it's just tree traversal, because conflicting
sub-dependencies of a dependency can co-exist.

That said, the Go team (and to a lesser extent Python's too) dropped the ball
here. None of the challenges of dependency management were unknown or
unsolved. Anyone who has used e.g. Maven should be more than familiar with
them. (An 11-section blog post is indeed nice, though; good for onboarding new
programmers, I suppose.) If developer experience had been considered a
first-class priority, this could have been solved from day 0.

(My personal favorite is npm, since conflicting dependencies can co-exist at
the expense of greater disk usage, but disk is cheap these days. No one with
an Electron app on their desktop has any right to complain about the footprint
of plain-text source code.)

~~~
pcwalton
See, that's the reason why calling it a SAT solver makes the problem sound
scarier than it is. Using Z3--or even MiniSAT--is massive overkill. The naive
exponential algorithm is fine, and it leads to more maintainable and
understandable code. Rust's build performance has nothing to do with
dependency solving. I've never heard of a package manager in which core
dependency solving was a speed problem; dep was slow because of I/O, not SAT
solving.
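
For a sense of scale, the naive algorithm in question is roughly this kind of
backtracking search (a toy sketch with exact-version constraints, not Cargo's
implementation; real resolvers carry semver ranges, but the search skeleton is
the same):

```go
package main

import "fmt"

// pv is a package at a specific version.
type pv struct {
	pkg string
	ver int
}

var (
	// Available versions per package, preferred (newest) first.
	versions = map[string][]int{"A": {2, 1}, "B": {2, 1}, "C": {2, 1}}
	// Exact-version requirements declared by each package version.
	needs = map[pv][]pv{
		{"A", 2}: {{"C", 2}},
		{"A", 1}: {{"C", 1}},
		{"B", 2}: {{"C", 1}},
	}
)

// ok reports whether picking p@v is consistent with earlier picks.
func ok(p string, v int, picked map[string]int) bool {
	for _, c := range needs[pv{p, v}] {
		if got, done := picked[c.pkg]; done && got != c.ver {
			return false
		}
	}
	for q, qv := range picked {
		for _, c := range needs[pv{q, qv}] {
			if c.pkg == p && c.ver != v {
				return false
			}
		}
	}
	return true
}

// solve assigns a version to each package in order, backtracking on
// conflict. Worst case is exponential, but real graphs are shallow
// and conflicts are rare, so it finishes almost instantly.
func solve(pkgs []string, picked map[string]int) bool {
	if len(pkgs) == 0 {
		return true
	}
	p, rest := pkgs[0], pkgs[1:]
	for _, v := range versions[p] {
		if ok(p, v, picked) {
			picked[p] = v
			if solve(rest, picked) {
				return true
			}
			delete(picked, p) // backtrack
		}
	}
	return false
}

func main() {
	picked := map[string]int{}
	// B is downgraded to 1 so that A 2 and its need for C 2 can win.
	fmt.Println(solve([]string{"A", "B", "C"}, picked), picked)
}
```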

~~~
ainar-g
> The naive exponential algorithm is fine (…)

Define “fine”. In my experience, Go with modules[1] solves a project's
dependencies _exponentially_ more quickly than dep. And that is especially
noticeable on big projects with three or more dozen dependencies. I don't
mean any disrespect to the people behind dep, and the whole debacle was a
massive miscommunication disaster, but go mod is just quicker.

[1] While we're at it, I'm still low-key mad that the Go authors called
modules packages and packages modules. Gah!

~~~
pcwalton
Again, that's because of I/O. The CPU bound part of the algorithm is very
fast, as Cargo demonstrates.

~~~
ainar-g
I don't get it. The number of dependencies is constant, which means that both
of them need to make approximately the same number of network requests. Where
does the I/O difference come from, then?

Also, since you've mentioned Cargo: its slowness actually used to be one of
the main pain points that the Rust people at one of the companies I've worked
for complained about a lot during water-cooler discussions. Granted, that was
a couple of years ago, and Cargo has probably made progress since then.

~~~
pcwalton
As I recall, dep had to parse Go code over and over during the dependency
resolution process instead of caching it.

And any cargo slowness is likely the fault of rustc or I/O. It's received a
lot of profiling work and the core dependency solver has never been a
performance issue to my knowledge.

~~~
steveklabnik
It would seem this is correct. From the docs:
[https://github.com/golang/dep/blob/master/docs/FAQ.md#why-is-dep-slow](https://github.com/golang/dep/blob/master/docs/FAQ.md#why-is-dep-slow)

There are two things that really slow dep down. One is unavoidable; for the
other, we have a plan.

The unavoidable part is the initial clone. ... Fortunately, this is just an
initial clone - pay it once, and you're done.

The other part is the work of retrieving information about dependencies. There
are three parts to this:

* Getting an up-to-date list of versions from the upstream source

* Reading the Gopkg.toml for a particular version out of the local cache

* Parsing the tree of packages for import statements at a particular version

The first requires one or more network calls; the second two usually mean
something like a git checkout, and the third is a filesystem walk, plus
loading and parsing .go files. All of these are expensive operations.

----------------

For context, in comparison, Cargo has the same #1 problem. But for #2, the
information on all dependencies is stored in the index itself; this means that
Cargo can figure out what dependencies you need without any network calls at
all, let alone downloading, git checkout, and filesystem walk.

I do not know how go mod compares, off the top of my head.

------
mfer
Side note, I just noticed that [https://godoc.org](https://godoc.org) does not
support Go modules with major versions being incremented (e.g., ending in
`/v2`). The new pkg.go.dev does, but the existing godoc.org doesn't.

This echoes some of the other observations of tooling not being easily updated
and still not being all there yet.

~~~
BarkMore
pkg.go.dev is the module aware replacement for godoc.org.

~~~
mfer
The tools, habits, and existing links haven't been updated. godoc.org doesn't
point people to the new site. The "replacement" experience is a little broken.

------
mfer
> The answer is that we learned from Dep that the general Bundler/Cargo/Dep
> approach includes some decisions that make software engineering more complex
> and more challenging.

Are those documented anywhere? Are they detailed, discussed, or otherwise
dealt with? The people behind those package managers have a significant amount
of experience with dependency management, including many real-world
situations. It would be good to have some public discussion of this for
everyone's benefit, if this is really true.

> Principle #1: Compatibility

The whole section on compatibility doesn't sit right...

> It is intended that programs written to the Go 1 specification will continue
> to compile and run correctly, unchanged, over the lifetime of that
> specification. Go programs that work today should continue to work even as
> future “point” releases of Go 1 arise (Go 1.1, Go 1.2, etc.).

This doesn't seem to apply to the tooling. For example, `go get` has new
behavior since the introduction of modules. The behavior changed. So the
compatibility idea only extends so far. Is the idea hypocritical, given that
people use the `go` CLI for scripting and its behavior changed?

> What does compatibility have to do with versioning? It’s important to think
> about compatibility because the most popular approach to versioning
> today—semantic versioning—instead encourages incompatibility. That is,
> semantic versioning has the unfortunate effect of making incompatible
> changes seem easy.

Incompatible changes happen. How many long-lived pieces of software have
needed to make incompatible changes while supporting people through that
process? Do you rewrite? Do you copy the whole codebase to a new sub-directory
and add more there? That is what the Go project has started to recommend. If
you do that you lose history, and some projects move fast (look at Kubernetes
client-go).

Tools like npm, cargo, and dep provide a means to stay at older versions if
that's what you want. The choice is up to the consumer of the package. With
Go, the choice has effectively been made for you by the Go team, by limiting
choice. Is that a good thing? Not if you don't like the choice.

~~~
jerf
"This doesn't seem to apply to the tooling. For example, `go get` has a new
behavior since the introduction of modules.... Is that hypocritical as an idea
because people use the `go` CLI for scripting and the behavior of it changed."

It does not apply to tooling. It is not hypocritical because it was never
claimed that it applies to tooling. It has always been a promise about code.

~~~
fanf2
Build systems are code.

~~~
jerf
That's not the question. The question is what was promised, not what you can
twist their promise into if you choose different definitions than they
do. They have only ever promised that code written in Go 1.0 will be forwards-
compatible, not that the build systems will all be the same.

When I promise my kid that I'm "taking them to the park" this weekend, they
have no grounds to complain when I don't take them to a full-fledged amusement
park just because "taking them to the park" would technically cover that too.

------
mfer
Both RedMonk and TIOBE [1] are noting a decrease in the popularity of Go.
RedMonk wondered aloud if this had to do with the way the Go modules / dep
situation happened [2].

I wonder if the posts lately are an attempt to win people over to their
thinking or to win back popularity.

[1] [https://www.tiobe.com/tiobe-index/go/](https://www.tiobe.com/tiobe-index/go/)

[2] [https://redmonk.com/sogrady/2019/07/18/language-rankings-6-19/](https://redmonk.com/sogrady/2019/07/18/language-rankings-6-19/)

~~~
jerf
Tiobe fluctuates wildly, and it isn't really plausible that Go plummeted as
far as it did there in just one month. The chatter on HN will tend to amplify
problems beyond the level the general public actually feels, and even on HN
this hasn't exactly been a big topic. (When generic Go topics discuss "what's
wrong with Go", this is not generally something that is mentioned.) It seems
far more likely that, yet again (this isn't exactly a new thing with Tiobe),
the search engines did "something" and the searches changed again. Many, if
not most, programming language names are susceptible to that, between things
that are too short to search for, like C, and terms of general interest, like
Python, but Go really has to be the worst.

Ten-year-old+ programming languages with at the very least tens of thousands
of users (I see "millions" claimed, I find myself a bit skeptical, but
maybe... conservatively tens of thousands for sure, though) do not lose 33% of
their users in a _month_. That'd be somewhere in 2-3 orders of magnitude
faster than Perl 5 has declined. Implausible.

~~~
mfer
TIOBE notes that it is looking at popularity. This is different from use. Go
didn't lose a bunch of users. In the ways TIOBE measures popularity... it lost
some popularity.

~~~
jerf
It did not lose one third of its "popularity" in a _month_, either.

You're putting a "just so" story on top of noise. Nor is this special pleading
on my part; "Tiobe is mostly noise" has been my position for years, along with
a lot of other people's. The signal probably exceeds the noise for maybe the
first 3 or 4 or maybe 5 slots; after that, it has a history of demonstrably
absurd claims about the popularity of things like REXX, which spent many years
in the so-called "top 20", which is just ridiculous.

------
sagichmal
> _If an old package and a new package have the same import path, the new
> package must be backwards compatible with the old package._

Simply put, this rule is too strict, and biases modules' UX and semantics too
far toward consumers.

Not all packages are so mature, and consumed by enough consumers, that they
should be forced by the language tooling to maintain backwards compatibility.
That requirement creates a great deal of pointless work for the majority,
perhaps vast majority, of package authors, who write small and uncertain
packages for use by themselves or a small number of their immediate
colleagues. For these authors, strict backwards compatibility is not only
unnecessary, it's actually undesirable, as it creates needless toil.

> _The answer is that we learned from Dep that the general Bundler/Cargo/Dep
> approach includes some decisions that make software engineering more complex
> and more challenging._

For a minority of use cases, where packages are large and mature and imported
by many, many consumers -- sure, maybe. But the vast majority of Go code,
written by private organizations and used internally, doesn't fit this
description.

~~~
IshKebab
Nothing's physically _stopping_ you from breaking backwards compatibility
without using a new import path if you want to. You might piss off your users,
but if
they don't care then go for it.

~~~
grey-area
Doesn't the Go tooling mandate this if you are >= v2.0?

The rest of go mod looks fine, but tying import paths to major version changes
was a mistake IMO.

------
tandr
(Kind of meta-venting out below, also known as rant.)

1\. I don't know why I find Russ's writings so irksome, even unpleasant to
read, sorry. (Please help me with that; maybe I am in "questioning authority"
mode and cannot flip out of it.) I cannot shake the feeling of being
patronized all the time without being asked. The whole article - no, a whole
bunch of his posts - reads like "we know better". Is it a Google thing, or
heavy Rob Pike influence?

Yes, he brings up some good points, and some very questionable ones. And for
the questionable ones (the Compatibility section, for example, or the
SAT-solver question (nobody did it before?), or version naming, or paths for
the sub-components...) it feels like he is trying to push an explanation of
his (and his team's?) reasons and motivations behind these decisions, but...
it is a monologue. There are more opinions out there, and a whole wide world
of different software shops with their own ideas and practices.

2\. Go 2 was a promised land that later translated into an "incremental, no
code breaking" approach. Nowadays it feels like they are not even talking
about it anymore!

Looking at the general history of programming languages, pretty much every
language has had its "C++" or Modula-2 moment. It is a natural process - an
evolution - in which unneeded parts disappear and useful ideas are
added/extended. The number of gotchas in Go is not zero, and to me (YMMV) it
feels like Go 2 would be (was?) a good time to shake off under-thought items
and enforce some new rules. Maybe even add interoperability between .go and
.go2 files, maybe even a semi-automatic upgrade path to run `go2 fmt` on old
files, a la the Java-to-Kotlin conversion - I don't know. But it feels like
the opportunity and momentum for a "peaceful revolution" have been lost.

3\. Even with their non-breaking style of change, they have managed to break
stuff. I, for one, still sit on the 1.12 branch, because when I tried to
upgrade to 1.13 (twice now), our pure-Go code just plainly did not build, with
some obscure messages (I think they were linking or module-cache errors). So
why pretend? Code rot is unavoidable, whether we like it or not.

99\. There are a couple more points about the chilling effect of "we know
better" on the community as a whole. (One example would be this whole "error"
debacle that visibly reduced activity on the issue tracker and the golang-nuts
mailing list after "hey all, we are going this way because we already went
there". Immediately after, it felt like people just gave up and left. An
"empty town square" feeling.) It would be interesting to see this year's
results of the "state of Go" survey, but I don't think we will see a worrisome
drop there -- the amount of new people coming in would easily override any
opinions of people leaving.

/end of rant, sorry for a long post

~~~
IshKebab
I think you're wrong on all counts (or I have no idea what you're trying to
say).

> nobody did it before?

As far as I know no other dependency system is the same as Go's.

> "we know better"

I mean, it is Rob Pike. How successful do you have to be before you get to
say "I know better"? In this case it _is_ better.

> Nowadays it feels like they are not even talking about Go 2 anymore!

This very post is about Go 2 (in the sense that there was ever going to be a
"Go 2").

> pretty much every language has had it's "C++" or Modula-2 moments. It is a
> natural process - an evolution - when unneeded parts disappear,

I can count on one finger the number of parts that have disappeared from C++.

~~~
mfer
Go modules are from the post-Rob-Pike era. Russ Cox is now the sole lead of
Go, and this was his project.

