This is horrible advice.

I work in a lot of languages (including Go), which gives me some perspective. On the extreme ends of this issue we have Maven, which specifies hard version numbers (fuzzy version ranges are possible but culturally taboo), and NPM, which only recently acquired the ability to lock version numbers and still has a strong culture of "always grab latest".
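
To make the contrast concrete, here is roughly what the two ends look like (illustrative snippets; the artifact/package names are made up):

    Maven (exact, pinned version):

      <dependency>
        <groupId>com.example</groupId>
        <artifactId>widget</artifactId>
        <version>1.2.3</version>
      </dependency>

    package.json (caret range; installs float to the newest 1.x release):

      "dependencies": { "widget": "^1.2.3" }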

The Maven approach is unquestionably superior; code always builds every time. If you walk away from an NPM build for nine months, the chance that it will build (let alone work as before) is vanishingly small. I waste stupid amounts of time fighting with upstream dependencies rather than delivering business value.

People break library interfaces, that's just a fact of life. The only question is whether they will break your code when you explicitly decide to bump a version, or randomly when someone (possibly not even you) makes changes to other parts of the system.




> The Maven approach is unquestionably superior; code always builds every time.

If that's your goal. There's a middle ground which includes security updates: asking the community to follow semver. Rust devs seem to do it well and my crates build well with non-exact version numbers after not visiting them for a while. Not sure why the majority of Rust devs do a good job with this and JS devs don't. I suppose it's the strictness of the contract in the first place.
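
For what it's worth, "non-exact version numbers" here just means Cargo's default requirement syntax; a sketch (real crate names, versions only as examples):

    # Cargo.toml
    [dependencies]
    serde = "1.0"     # caret requirement: >= 1.0.0 and < 2.0.0
    rand = "0.8.5"    # for 0.x crates: >= 0.8.5 and < 0.9.0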

With Go, I've found popular library maintainers doing a good job at keeping backward compatibility even when there has been only one blessed version. I don't assume giving them a bit more flexibility is going to hurt anything.


> There's a middle ground which includes security updates: asking the community to follow semver.

No, that doesn't work. People make mistakes, and you end up not being able to build software. It might work _most_ of the time, but when things break, it's of course always at the worst possible time.

I think Cargo got it right: use semver, but also have lock files, so you can build with _exactly_ the same dependencies later.
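
That is, the manifest carries the semver range and the lock file records the exact resolution; a made-up excerpt of what cargo writes:

    # Cargo.lock (generated; typically committed for applications)
    [[package]]
    name = "serde"
    version = "1.0.104"
    source = "registry+https://github.com/rust-lang/crates.io-index"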


Sorry for the confusion, but that's what I meant. I was addressing the use-exact-versions-all-the-time argument. Maven goes too far in that it is taboo to do anything but use specific versions. Cargo, Composer, etc. do follow the proper approach. Maven with version ranges like "[1.2,1.3)" shouldn't be so taboo IMO. The "commit lock files for apps and don't for libs" rule is also practical.
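
For anyone unfamiliar with the Maven notation: brackets are inclusive and parentheses exclusive, so "[1.2,1.3)" means >= 1.2 and < 1.3, i.e. any 1.2.x release. As an illustrative POM fragment (nothing more):

    <version>[1.2,1.3)</version>  <!-- >= 1.2, < 1.3 -->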


Ah, OK, we're in agreement then :)


With strong typing, you can analyze the code and automatically increment the semver version based on public API changes. Elm does this AFAIK. It reduces the number of mistakes that are possible.


> With strong typing, you can analyze the code and automatically increment the semver versions based on public API changes.

This is nice, but it should be noted that it does not absolutely prevent missing backward-incompatible changes: that a function has the same signature does not mean that it has backward-compatible behavior.

(With a rich enough strong, static type system, used well to start with, you might hope that all meaningful behavior changes would manifest as type changes as well, but I don't think it's clear that that would actually be the case even in a perfect world, and in any case, it's unlikely to be the case in real use even if it would be in the ideal case.)
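
A tiny Go-flavoured illustration of that point (hypothetical library, not from the thread; PartsV1/PartsV2 stand for two releases of the same exported function): the change below keeps the exact same signature, so a tool that diffs public APIs by type sees nothing to flag, yet callers relying on the old behaviour break.

    package split

    import "strings"

    // Release v1.0.0: return the comma-separated parts exactly as written.
    func PartsV1(s string) []string {
        return strings.Split(s, ",")
    }

    // Next release: same signature, but each part is now trimmed.
    // A signature-based semver checker would wave this through, yet code
    // that relied on "a, b" -> ["a", " b"] silently breaks.
    func PartsV2(s string) []string {
        parts := strings.Split(s, ",")
        for i, p := range parts {
            parts[i] = strings.TrimSpace(p)
        }
        return parts
    }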


Yup, Elm pioneered this space. We're giving it a shot too: https://github.com/rust-lang-nursery/rust-semverver


Rust actually uses lock files, just not for libraries.

I've been bitten by this a few times when I've come back to code 2-3 months later and forgot to include the Cargo.lock, so I pretty much always use the lock file these days.


Lock files are best of both worlds: you specify the latest at check in time and then freeze whatever was picked. No need to reinvent the world.


Yeah, it's crazy that there's still so much controversy around this topic considering Node and Ruby have had amazing dependency management for over half a decade at this point. Dependency management in those languages is pretty much a solved problem, and the fact that Go isn't there yet drives me nuts, since I have to work with it every day for my job.


> Node and Ruby have had amazing dependency management for over half a decade at this point.

Eh?

NPM didn't have package-lock.json until v5, released in 2017. Before then there was the optional shrinkwrap that nobody used, so builds were totally unreproducible.

Ruby at least had Gemfile.lock from early days. Unfortunately there have been so many compatibility problems with different versions of Ruby itself that someone needed to invent rvm, rbenv, and chruby. Getting every dependency to behave in the same Ruby version was sometimes an odyssey. Still, at least builds are reproducible... as long as you're running on the same OS/CPU arch (uh oh native code!)

Ruby is actually pretty alright given the constraints, but Node/NPM is the canonical example of how NOT to do dependency management, and they're still trying to figure it out here in 2018.


In my experience NPM shows exactly how to build a package manager. They are slowly fixing the problems one by one; half a decade ago NPM was terrible compared to what it is today.


Well, the ways Node and Ruby "solve" this are almost diametrically opposed, so I don't think it makes sense to call this a solved problem.


Indeed this thread has caused me to retreat back into my Ruby hole. Seems like any contemporary solutions should be at least as good as a Gemfile/.lock.


Bundler has been out since 2008. Just saying.


Node breaks on me all the time.


The picture becomes more complicated when you consider a hierarchy of packages, because presumably the packages you depend on would have their own lock files, which represent the versions those packages have been tested with.

npm will pick the latest version of the dependencies compatible with the package.json of each package (package-lock.json is not published as part of a package). This means that with npm we may end up using a version of a transitive dependency which is much later than the one that your direct dependency was tested with.

The proposal for vgo will instead take the minimum version compatible with the stated requirements, and hence pick a version which is closer to the one that was actually tested.
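
A made-up example of the difference:

    Your app depends on A; A's manifest asks for B >= 1.2.
    B has released 1.2, 1.3 and 1.5.

    npm-style resolution: B 1.5 (newest allowed by the range), possibly never tested with A
    vgo-style resolution: B 1.2 (oldest that satisfies the requirement)

    If another module in the same build asks for B >= 1.3, vgo picks 1.3:
    the highest of the requested minimums, never the newest published.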


This proposed “minimum version” behavior will effectively prevent “automated” security updates, which is what most reasonable people expect from their package manager.

Consider all tens of thousands of CVE bugs found in widely used image, video, XML, etc. libraries. Even “updated” software is often compromised via outdated libraries.

With a “min version” approach, none of those will get patched without explicit action by a developer, who won’t do that because he doesn’t even know about the bug unless he reads the thousands of messages each day on Bugtraq.

If anything, history has shown that we should be focusing package management on getting security fixes to the end user as soon as possible.

This proposal is bad for security.


Note that even currently, lockfiles have the effect of locking you to an old version of the software. You could choose to automatically update all dependencies to latest in CI, but by that token you can run vgo get -u to get the latest versions here as well.

One of the issues which the vgo developers point out is that the "latest" behaviour has the perverse effect that a module A may declare a dependency on version 1.1 of a module B, yet may never even have been tested on that version, because the package manager always picked the latest.

In some sense, the vgo approach is saying that, across the hierarchy of module owners, each one should keep upgrading their manifest to the latest versions of their direct dependencies, and ensure that things keep working, rather than relying on the topmost module to pull in the latest of the entire tree of dependencies. This seems to be good for the entire ecosystem.


> each one should keep upgrading their manifest to the latest versions of their direct dependencies, and ensure that things keep working

But that’s the problem. You’re relying on N people, including the package maintainers and the end developers, to all take timely action in order to get a security fix to the end user.

That simply won’t happen.

What should happen is a single package maintainer fixes a vulnerability, and that fix automatically flows through the system to all places where that package is used. And insecure versions should be made unavailable or throw a critical “I won’t build” error.

Perhaps some way of marking versions as security critical might help, but the proposed approach will leave tons of vulnerable libraries in the wild.

All the current package managers for other languages have this issue to some degree. Golang should do better with knowledge of those mistakes.


PHP package manager, Composer, uses lock files only for applications, not for libraries. The developer makes sure that everything works and then commits the lock file.


> People break library interfaces, that's just a fact of life.

Sure, but in my experience it happens at least 10x as frequently with NPM as with Go. It's really common for me to update all my Go dependencies and have everything just continue to work. With NPM I have to start looking for migration documentation and change a bunch of my code.


For me, the problem was that I couldn't have reproducible builds. I would kick off a deploy and one day it would break because some dependency had (incorrectly) made a breaking change. It's important to keep packages up to date, but I believe it should be a conscious decision of the developer.


I also think reproducible builds are very important. Vendoring solved this for me in Go even before it was officially supported.

My main fear is that once people get used to the idea that their master repo doesn't have to keep a backwards-compatible interface, updating dependencies will look awfully similar to NPM, where authors change interfaces based on their weekly mood.


> I waste stupid amounts of time fighting with upstream dependencies rather than delivering business value.

Have you paid the upstream developers? Because you sound like you did.


Maven doesn't specify hard version numbers - the default meaning of "1.0" is >= 1.0, and some packages specify dependencies without a version at all. It's quite possible to get updates of transitive dependencies without realising it (especially since most tooling does not encourage pinning any dependency other than the top-level ones).



