
Go 1.5's vendor/ experiment - craigkerstiens
https://medium.com/@freeformz/go-1-5-s-vendor-experiment-fd3e830f52c3
======
breakingcups
I hate this. Go has been putting off vendoring for such a long time because
they "didn't want to come up with a half-baked solution". I think that's bull;
it just wasn't a priority for the then Google-heavy team, since they have
different dependency needs (fueled partly by NIH syndrome).

Now, they release this half-baked solution which has no true advantages over
the existing vendoring tools other than keeping the import paths the same in
your source files (which just makes things more confusing imho). There is no
solution for dealing with dependency conflicts, and not even tools to detect
and manage them other than the compiler itself.
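For reference, the mechanism being criticized resolves an import by first
looking under the nearest enclosing vendor/ directory, which is why the
import path in source stays unchanged (the paths below are illustrative):

```
$GOPATH/src/github.com/you/app/
├── main.go                        # import "github.com/some/dep"
└── vendor/
    └── github.com/some/dep/       # this copy wins; main.go is untouched
```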

On top of that there's even more "magic keywords" that you need to learn which
causes stuff not to behave as you expect, just like the magic comments.

I understand it is also a consequence of having a v1.0 and a backwards-
compatibility guarantee, but nearly all features added after 1.0 seem to be
added with as minimal effort as needed to get _something_ working, which is
not a good thing. It all feels tacked on, the easy way, and not thought out.

~~~
justthistime_
Looks like they are trying to exhaust all possible bad ideas, before picking
the solution that works.

I expect that they don't even know what works yet, because NIH and "it didn't
exist in the 1970s".

~~~
vanessa98
Go packaging is a black hole of sadness in light of all the experience and
research that's gone on in this space.

------
roskilli
> Multiple versions of the same library will cause problems ... This is a
> limitation of ... but probably also a sane one

> Another issue (hat tip to onetruekarl) comes up when 2 vendored dependencies
> also vendor their own copies of another dependency with the same name ...

So I still don't see why people are content with a vendoring solution that
supports neither multiple versions of the same package nor plain name
conflicts. I understand this is an experiment, but it raises the question of
why it doesn't at least eventually want to bite off the powerful things (it
seems to handwave the hard problems away with a future spec that might solve
them - seemingly something orthogonal to this experiment).

I have really enjoyed what NPM did with a simple package.json file and
enabling recursive dependencies. It seems like if go were to add a simple file
that maps package name to package URL with a decent support for different URL
protocols you could build the same recursive dependency graph and depend on
multiple versions of the same package and also different packages with the
same name.
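Something like the following, entirely hypothetical, manifest could express
that name-to-URL mapping (the format and field names here are invented for
illustration; no Go tool actually uses this):

```json
{
  "dependencies": {
    "example.com/foo/bar": {
      "url": "git+https://example.com/foo/bar",
      "version": "1.2.0"
    }
  }
}
```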

~~~
nulltype
I think it does support multiple versions of a package, as long as the package
itself tolerates having multiple copies loaded. The problem is if a package
registers some global name, and another copy of the same package also
registers that name; that could cause an error.

The name conflict is a compiler error message, not anything to do with
vendoring.

Having a package.json file is a separate issue, but there is a proposal for
that here: [https://groups.google.com/forum/#!msg/golang-
dev/nMWoEAG55v8...](https://groups.google.com/forum/#!msg/golang-
dev/nMWoEAG55v8/iJGgur7W_SEJ)

Having a file like that doesn't solve the vendoring issues though. This new
vendoring scheme seems almost identical to NPM's node_modules.

~~~
a13xb
_> The problem is if a package registers some global name, and another copy of
the package also registers the same global name, that could cause an error._

This shouldn't cause an error, because packages can only register globals
within their own namespace. You just end up with two copies of it, with
different types.

~~~
nulltype
That would be nice, but that is not true in Go. You can register with a
function like this one:
[http://golang.org/pkg/database/sql/#Register](http://golang.org/pkg/database/sql/#Register)
(mentioned in the article) and run it in your init() function or any similar
function that a client calls.

That will collide with any other package that also uses that registration
function for the same value of 'name'.

~~~
a13xb
Right, when you leak handles to package objects externally, all bets are off.

------
Nitramp
This is similar to node_modules + npm, in essence.

Having used node and npm on both small (~200 transitive deps) and large (~1000
transitive deps) projects, I think this is an approach that's conceptually not
a good idea (outside of possible implementation issues with npm).

With vendoring, you end up with a _very_ large tree of transitive vendored
dependencies. You have no idea of, or control over, what's in it, or which
versions of which packages are in it.

Nearly all of the time, the two allegedly incompatible versions of a library
that you install separately turn out to be 2.0.13 and 2.0.14, and you only get
two because well, somebody hasn't upgraded yet.

This might seem trivial, but in the end you have a project tree where every
dependency gets installed separately for everything that depends on it, so you
end up with something like O(number of deps * number of deps) installed
libraries, which obviously doesn't scale.

You also end up with the versioning problem. Imagine there's an important
reason to upgrade some package P, e.g. a security issue. Now you have to hunt
down every single project in your transitive dependencies that depends on it,
and ask them to upgrade or fork.

You also still have version issues. If you transitively depend on the same
library C in versions Cv1 and Cv2, and you want to pass an object/data
structure/... from Cv1 to Cv2 through some path, you open yourself up to all
kinds of hilarious version mismatches and failures.

This isn't new - e.g. Eclipse/OSGi did the same thing with separated class
loaders, and it was a mess.

I think flattening your dependency/vendor tree to only ever have one version
of library C, and hoping/testing for minor version mismatches to work out, is
the most viable thing to do.

~~~
ansible
The mechanism provided by the go tool is just for compilation. There will be
other tools you will use to actually manage the dependencies. So you may
choose to flatten out the dependencies, but maybe keep a couple separate for
special cases.

~~~
Nitramp
Absolutely, my critique was just about recursive vendoring.

------
mmgutz
Node.js (w/ NPM) is dynamic in nature and its modules are stored in the module
cache by an absolute path key calculated from the relative path of the module
which is being required at RUN-TIME. Go cannot do that at RUN-TIME. It must
resolve all dependencies at compile time.

Are there any statically typed languages that solve the diamond dependency
issue elegantly?

~~~
teacup50
Java, by prioritizing API compatibility across releases and having compiler-
checked types.

The result being that unless you hit a particularly crappy piece of software,
it's safe to just pick the later of two versions of a dependency and use that.

~~~
grogenaut
lol, you mean like the VM itself, with different XML parsers between 1.4, 1.5,
and 1.6, and other built-in restrictions in other areas. Also with allowing
and then disallowing manipulation of certain namespaces (esp the core ones).

It's a problem a human has to get involved with at some point no matter what.

~~~
teacup50
Anecdotal exceptions aren't proof of universal truth.

------
bbrazil
> There is a strong argument to be made that libraries should not vendor their
> own dependencies, but as the author of several library / command combo
> packages that rely extensively on 3rd party libraries I’m not convinced of
> that blanket assertion.

We ran into this with prometheus.io and godep. Vendoring in our libraries
meant that types didn't match between the vendored version the main server
had and the vendored version coming from the library.

If there's a way to solve this, that'd be great but it means you're back to
choosing which version is correct.

------
lukasm
That's going to be bad on Windows. 260 char limit for path.

------
rcarmo
I've done this sort of thing for a while (if in a simpler, more naive way) by
just tweaking GOPATH in a Makefile. In fact, I had to do it again only
yesterday, for a very simple little server: [https://github.com/rcarmo/go-
http-mdns](https://github.com/rcarmo/go-http-mdns)
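A minimal sketch of that GOPATH-tweaking approach (the directory name
`_vendor` is illustrative; note that a GOPATH entry must contain a `src/`
subdirectory for `go build` to find packages under it):

```make
# Prepend a project-local workspace so packages under ./_vendor/src are
# resolved before anything in the global GOPATH.
GOPATH := $(CURDIR)/_vendor:$(GOPATH)
export GOPATH

build:
	go build ./...
```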

In general, I think a solution for this is long overdue, and look forward to
trying out 1.5 in earnest.

------
cledet
Correct me if I'm wrong, but don't remote packages have the same side effects
as this experimental vendoring?

~~~
nulltype
What are remote packages?

~~~
cledet
When you run `go get` and import packages from a remote Git or Mercurial
repository. At least that's what Go docs call it[1].

[1]:
[https://golang.org/doc/code.html#remote](https://golang.org/doc/code.html#remote)

~~~
nulltype
If by "same side effects" you mean multiple versions of the same library
causing issues, I don't think so. Using "go get" you will end up with only a
single version of a library for a given path.

~~~
cledet
Almost; I mean you can only use a single version at a given time. An example
would be if an imported library and its importer use the same dependency but
one of them uses an older API.

~~~
nulltype
I think that is solved by the vendor thing. You can have two versions of the
dependency, e.g. "vendor/dependency" and "vendor/library/vendor/dependency"
and, with the caveats mentioned elsewhere (global name conflicts, similar-
looking but different types, etc) it should work fine.

