
Announcing the new Rust package manager, Cargo - nnutter
https://mail.mozilla.org/pipermail/rust-dev/2014-March/009090.html
======
jeremymcanally
Now everyone can know the joys and horrors of Bundler! ;) I kid. This
announcement is very exciting for people tinkering with Rust like myself.

I really like Go, but the lack of a solid package management solution _that I
like using_ is a downer. Before you rage-comment: I know how Go packages work.
I know you think they're better than anything that's ever been invented. I
know there are solutions out there for some portions of what I want. But
nothing has been created that really works for me (yet). So, to see this
development in another one of my "tinker with it but not build a ton of
production quality code just yet" languages is exciting!

~~~
chimeracoder
> I really like Go, but the lack of a solid package management solution that I
> like using is a downer. Before you rage-comment: I know how Go packages
> work. I know you think they're better than anything that's ever been
> invented. I know there are solutions out there for some portions of what I
> want. But nothing has been created that really works for me (yet).

What is it that you _do_ want in a Go "package manager", then?

You say that you know how Go packages work, so I'm assuming it's not one of
the typical misunderstandings of newcomers from the Python/Ruby/Node.js world
( _all_ 1.x code is forwards-compatible with no modifications, static linking
makes specifying versions of packages less relevant, the package/filesystem
layout parallel means that vendoring project-specific forks of packages is
relatively straightforward once you know how, etc.)

I put "package manager" in scare quotation marks because one of the design
goals of Go is essentially to render most of the functionality of such package
managers irrelevant.

I say this as someone who has been writing Go for work on a daily basis for
over a year and half: while Go's package system isn't perfect, it's pretty
damn good, and I haven't felt the need for a package manager at all ever since
I learned the way things worked.

~~~
frio
Version pinning. I don't want a package manager, but I _do_ want to be able to
say

    import (
        "github.com/frio/mycoollibrary@1.2"
    )

... or some such (where the tag above could be a plain old hash too).

Vendoring in every library I use is not a great solution.

~~~
burntsushi
You can get something like this already with gopkg.in:
[http://godoc.org/gopkg.in/v1/docs#hdr-Supported_URLs](http://godoc.org/gopkg.in/v1/docs#hdr-Supported_URLs)
(it uses tags or branch names).

It should work with any GitHub repo that's using tags/branches with names
conforming to semantic versioning.
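The selection logic behind such a redirector is simple: given a repo's tags,
pick the highest one matching the requested major version. A hypothetical
Python sketch of that step (not gopkg.in's actual code, which also handles
branches and unversioned repos):

```python
def best_tag(tags, major):
    """Pick the highest semver-style tag ("v1.2.0", ...) whose major
    version matches the one requested in the import path.
    Sketch only: real tag parsing is more forgiving than this."""
    def parse(tag):
        return tuple(int(part) for part in tag.lstrip("v").split("."))
    candidates = [t for t in tags if parse(t)[0] == major]
    return max(candidates, key=parse) if candidates else None
```

Parsing into integer tuples matters: a plain string comparison would rank
v1.2.0 above v1.10.1.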

~~~
grey-area
That looks nice. Does it work with import statements though or is it only for
go get?

~~~
burntsushi
It has to work with import statements; that's the whole point. :-)

~~~
grey-area
Thanks for the link. Just tried it out, nice, and I love the use of semver. So
you can do this to import a specific version of say yaml

    import "gopkg.in/v1/yaml"

or for an arbitrary url on github

    https://gopkg.in/username/v1.0.0/pkg

Weird that versions are before the pkg name though, that doesn't seem right...

Here's hoping this sort of versioning scheme makes it into the go toolchain
eventually. The only downside of this solution is it redirects only to github
and guesses urls, I quite like that the go toolchain is indifferent as to
where the sources are hosted. This would be very easy to add to the toolchain
though; all they'd need to do is recognise imports like this (personally I
prefer versions at the end of current urls, not in the middle):

    import "bitbucket.org/user/pkg/v1.1.2"

in go get and download that tag to that path. No other changes in the
toolchain would be required: those who ignore versions could continue to do
so, while people who want them use version-specific imports. I think the
objection of the go team was that this doesn't solve any of the complex
dependency issues that plague package managers, but it'd be nice just to be
able to specify which version is expected for future users and to have
predictable builds when sharing code outside one org.
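The toolchain change proposed here is mostly string handling: peel an
optional version suffix off the import path, then check out that tag under
the unsuffixed path. A hypothetical sketch of the parsing half (the `/vX.Y.Z`
suffix syntax is the proposal above, not anything go get supports):

```python
import re

# Hypothetical syntax: an optional "/vX.Y.Z" suffix on an ordinary
# import path names the tag `go get` should fetch for it.
VERSIONED = re.compile(r"^(?P<path>.+)/(?P<tag>v\d+(?:\.\d+){0,2})$")

def split_versioned_import(import_path):
    """Split "bitbucket.org/user/pkg/v1.1.2" into the plain import
    path and the requested tag; unversioned imports pass through
    unchanged, preserving today's behaviour."""
    m = VERSIONED.match(import_path)
    if m is None:
        return import_path, None
    return m.group("path"), m.group("tag")
```

Everything else (cloning the repo, checking out the tag into the versioned
path) is work go get already knows how to do.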

~~~
burntsushi
I actually haven't even used gopkg.in myself. Gustavo Niemeyer just released
it and has been pushing it as one way to promote API stability. I agree with
him and plan to migrate to the scheme soon, as there are a few people using my
libraries.

With that said, I doubt very much it will make it into the official toolchain.
There's really no point if it can be offered just as well by a third party.
For example, godoc.org has proved invaluable but isn't official.

Also, keep in mind that gopkg.in is still pretty new, and I believe Gustavo
has said that adding support for other web sites or revision control systems
is perfectly doable. He just wanted to get something started for the lowest
common denominator, I suspect. :-)

~~~
grey-area
The advantage of having it in go get would be that no one has to rely on a
single point of failure like gopkg.in, nor would they have to get new urls
recognised there, and finally it could become an accepted way to version
dependencies instead of the versionless future golang lives in right now. They
did add versioning to the lang after a while, so hopefully this will be
similar. It's not a huge deal right now; it'll become more important as the
pkg universe grows.

------
molecule
Yehuda's taking on another major software project before Rails.app [0] /
Tokaido [1] is released? It's been 22 months since it raised twice its initial
goal.

[0] [https://www.kickstarter.com/projects/1397300529/railsapp](https://www.kickstarter.com/projects/1397300529/railsapp)

[1] [https://github.com/tokaido/tokaidoapp](https://github.com/tokaido/tokaidoapp)

~~~
wycats
@molecule Fair question. Tokaido ended up taking far longer than I expected (a
big, huge mea culpa for sure). TL;DR: At long last, I plan to ship Tokaido,
along with a website, documentation and automation at RailsConf this year.
That wouldn't be the first time I said such a thing, but the feature set is
actually done (and working on many test machines).

I dedicated a bunch of medium-time months at the beginning of the project to
getting the initial set of functionality done:

* Figuring out how to statically build Ruby, which has required non-trivial updates with every version of Ruby, and a bunch of work to get some of those fixes upstream. Many of the projects that bundle Sass or Compass in a pretty GUI are making use of this initial work.

* Building a number of OSS libraries ([https://github.com/tokaido](https://github.com/tokaido)) that enabled an application-isolated workflow like Pow with many fewer failure modes.

* Integration with popular tools that people use in development (redis-server and Postgres.app)

This process took longer than I expected (a year, rather than more like six
months), and at the end of it I had run out of work time to devote to the
project. I turned my attention to nurturing a small community of people who
could make Tokaido an open-source project that could be community maintained
once we ship.

Andrés Robalino, in particular, helped me squash a number of important bugs
that I was having trouble with (including a bug that was triggering occasional
100% CPU usage on some machines), and has cleaned up the process of building
static Ruby and added a bunch of polish to the UI.

Tokaido has definitely not been my finest, quickest open-source turnaround,
and it's fair for people who don't like me to use it as an example of my
failures. That said, I think people who follow me know that I have committed
my heart and soul to many more projects than Tokaido, many of which have large
communities of people who love them.

Not everybody needs to love my projects or my style. I should have shipped
Tokaido earlier, and for that I am sorry. That said, I am committed to
finishing the job with Tokaido, and believe that the other major projects I
have worked on speak for themselves.

~~~
molecule
Thanks for the reply and thorough update, that sounds awesome, and,
especially: thanks for Bundler-- it makes so many developers' lives much
easier!

------
defen
I just hope they don't repeat all the bundler-related things that made me hate
deploying rails.

~~~
steveklabnik
Could you expand on that, please?

I am very emphatically _not_ saying that you're wrong, but without enumerating
what your problems are, you can't get them solved.

~~~
defen
It's been a while (almost 2 years) and I don't use ruby much any more, so my
memory is a little hazy and for all I know the issues have been fixed. My main
complaints were that too much magic made it difficult to tell what was being
loaded from where; and that it was a ton of work to get proper automated
deployments set up (with puppet...in a way that didn't make assumptions about
the target machine - RVM was also a huge culprit here). Also, at times it was
painfully slow. To be fair I'm not super familiar with the internals - it's
totally possible that it's all ruby's fault and any ruby package manager would
have the same issues. For everyday development bundler was mostly great - it
was just painful to get proper deployments working.

I much prefer how npm handles things.

~~~
steveklabnik
Things have changed quite a bit in the last two years. Especially around
speed. So things are a lot better now in Bundler land, and should be better
with Cargo, too. Cool.

> I much prefer how npm handles things.

Can you elaborate on how npm handles things in a better way?

~~~
TheHydroImpulse
For one, I love local by default. Having a package manager that deals with a
global context makes things considerably more complex. Having everything local
allows you to isolate one crate's (if we're speaking in Rust terms)
dependencies from another crate's dependencies.

NPM also has the ability to keep each dependency's own dependencies isolated
from every other dependency's. For example, module A can pin module B to
version 2, whereas module C can pin module B to version 3. So you have
independent copies.

Now, that works for a dynamic language where you don't have to deal with
static/dynamic linking. Would it be appropriate to statically link two copies
of the same module at different versions? Maybe not. That would lead to
massive binary sizes.

~~~
steveklabnik
Thank you.

> I love local by default.

Roger. It's hard to articulate my own thoughts about this, but I feel like
Bundler enables the best of both worlds here.[1] That said, it's obviously a
preference.

> NPM also has the ability to have isolated dependencies from each other
> dependency.

Ahh yes, I've heard about this. In Ruby, it's not really feasible due to the
language, and I'm a bit skeptical that it doesn't lead to super extra
complexity, but then again, I haven't used it myself...

Totally agree regarding static/dynamic linking. Though isn't it the same with
NPM: you still have two copies of the library?

1: [http://words.steveklabnik.com/how-to-not-rely-on-rubygemsorg-for-deployment](http://words.steveklabnik.com/how-to-not-rely-on-rubygemsorg-for-deployment)

~~~
jdlshore
> I'm a bit skeptical that [isolated dependencies] doesn't lead to super extra
> complexity...

It's braindead simple. It's an emergent property of the way node loads
modules, which is also pretty darn simple.

When you give Node a package to require, if it doesn't refer to a relative or
absolute path, it walks up the directory structure looking for `node_modules`.
When it finds one, it looks inside for the module you asked for. If it doesn't
find it, it keeps walking up the directory tree.

So you can have multiple module repositories in a directory tree. A given file
will always find the one that's "closest."

Npm takes advantage of this by installing every module's dependencies in a
`node_modules` directory in the root directory of that module.
Subdependencies' dependencies go in _their_ node_modules directory. Voila!
Isolated dependencies, braindead simple, albeit with a crazy deep directory
structure. (node_modules/foo/node_modules/bar/node_modules/baz...)
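The walk-up lookup is small enough to sketch. This is an illustration of the
algorithm described above, not Node's actual resolver (which also handles
core modules, `package.json` "main" entries, and file extensions):

```python
import os

def resolve(start_dir, name):
    """Walk up from start_dir looking for node_modules/<name>;
    the nearest match wins, mirroring Node's lookup for
    non-relative requires."""
    d = os.path.abspath(start_dir)
    while True:
        candidate = os.path.join(d, "node_modules", name)
        if os.path.isdir(candidate):
            return candidate
        parent = os.path.dirname(d)
        if parent == d:  # reached the filesystem root: not found
            return None
        d = parent
```

A file nested under node_modules/foo therefore sees foo's own node_modules
first, which is exactly how two versions of the same package can coexist.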

Node/npm has the best dependency handling I've yet seen, and I think it's
because it doesn't try to outsmart me or go for anachronistic disk space
optimizations.

~~~
mercurial
I'm not sure what's anachronistic about efficient use of disk space, but in
any case that's nothing that can't be solved via hardlink deduplication.

~~~
jdlshore
> I'm not sure what's anachronistic about efficient use of disk space

Consider this: A frequently used module in one of my trees is "glob". It's
repeated four times in two different versions. It takes 207K each time,
including subdependencies. The wasted space is 414K... more than a whole
floppy! ;-)

Or to put it another way, I've wasted 0.00017% of my rather small 250GB SSD.

Optimizing _that_ is anachronistic.

> that's nothing that can't be solved via hardlink deduplication

Yep, and `npm dedupe` [1] does something similar. This _does_ have the
potential to become massively complex, though. You have to re-dupe when one
dependency upgrades but another doesn't, and you also need to deal with the
fact that two modules' dependencies may be sharing memory space that a module
author was expecting to have to herself. (Modules are cached, so changes to
"module global" variables are shared, but modules loaded from different
locations are cached independently.)

[1] [https://www.npmjs.org/doc/cli/npm-dedupe.html](https://www.npmjs.org/doc/cli/npm-dedupe.html)

~~~
mercurial
> Optimizing that is anachronistic.

On the other hand, my Java .m2 directory contains 2469 jar files for a total
of 2.4G. My considerably smaller .cabal is still 253MB. I won't complain if
hardlinking prevents package repo sizes from getting out of hand.

> Yep, and `npm dedupe` [1] does something similar.

Not quite. From what I understand, it attempts to manipulate the package
hierarchy. Hard link deduplication, on the other hand, doesn't need to do any
such thing: it simply hardlinks files with the same checksum. No intelligence
required. Some time ago, I used rsync to do this kind of thing when deploying
large binary dependencies, and it is very practical (IMHO, hard links get too
little love).
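The checksum-then-hardlink approach fits in a few lines. A sketch (assuming
everything lives on one filesystem; it ignores permissions, metadata, and
hash collisions):

```python
import hashlib
import os

def hardlink_dedupe(root):
    """Replace byte-identical files under root with hard links to a
    single copy, keyed by content hash. No knowledge of the package
    hierarchy is needed."""
    seen = {}  # content digest -> path of the first copy seen
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                os.remove(path)
                os.link(seen[digest], path)  # point at the first copy
            else:
                seen[digest] = path
```

One caveat the thread hints at: hard-linked copies share writable state, so
this is only safe for files treated as read-only after install.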

~~~
jdlshore
Fair enough.

------
copx
I am very skeptical about this. Every language which needs a "package manager"
tends to make building standalone applications painful.

I do think C++ is in dire need of being replaced, it is unsafe at any speed
and full of legacy cruft. Unfortunately D jumped on the GC train and had other
serious issues as well, allowing C++ to survive that attempt without a
scratch.

Rust finally did the right thing, aiming for the same zero overhead/you only
pay for what you use design which made C++ such a success.

However, C++ is a platform agnostic language, while Rust thus far has only
supported Linux as a first class platform. I find the attitude of the Rust
devs ("we will improve Windows support once the language is stable") deeply
misguided. It meant that they largely missed out on feedback from Windows
developers during the development of the language. "Windows developers" also
means all the AAA PC(+console) game developers (the most diehard C++ users).
Linux still is not a relevant platform there.

And the Rust developers seem to continue to go down that rabbit hole by
enlisting Ruby developers to develop a "package manager". As a Windows guy the
very word makes me cringe. How is that thing going to integrate with Visual
Studio and other Windows specific concerns? Ruby only really works on Linux;
the first advice you get as a Windows guy wanting to learn Ruby is "Install
Linux, if you try to do Ruby development on Windows you are in for a world of
pain".
I do not trust any Ruby developer to write portable software, they are married
to the GNU/Linux ecosystem. Would you expect Microsoft guys to develop
something which actually works well on Linux?

Also remember that C++ does not have a "package manager". Some of the most
complex and massive applications in the world are written in C++, yet you do
not hear many C++ developers crying "When will we finally get a package
manager?". It is not even on the agenda. C does not have one either.

"Package managers" are a Linux-ism, they deliver a certain UX you may or may
not like (personally I hate it with passion), but they should not be part of a
platform agnostic programming language. A package manager may belong to a
Linux development environment for Rust, but the language itself and its
library handling should be completely independent of it. Rust libraries should
work just like C++ libraries so that they do integrate well with other
development environments.

I see Rust becoming a new OCaml, utterly Linux-centric and thus leaving C++ as
the sole competitor in the maximal performance + high-level abstractions +
platform agnostic category.

For the sake of games no longer crashing randomly because of memory corruption
bugs: change course _now_.

~~~
kibwen
You're jumping to conclusions needlessly.

Windows _is_ a first-class platform for Rust. Windows is Firefox's most
important platform, and Servo needs to demonstrate that opportunities for
optimization that could be relevant to Gecko are applicable to browsers
running on the Windows platform.

If Rust seems like it's less well-integrated into Windows than it is into
Linux and OSX, it's because none of the Windows developers who keep
complaining about Windows support seem to be willing to step up to the plate.
I've made this same offer several times to self-proclaimed Windows devs on HN:
if you want Rust to get better on Windows sooner, make it happen sooner. Until
then, the Rust devteam will spend their resources finalizing the language
itself, while the Linux and OSX and FreeBSD and Android/ARM contributors
continue to provide better integration with their chosen platforms.

This is not hostility. _Please help us._ We know that Windows is important. We
need experts.

~~~
ternaryoperator
I admire what you're doing, but I find the assertion that Windows is a first-
class Rust platform difficult to square with your repeated requests for more
Windows volunteers to make that happen. Windows developers are everywhere. Why
doesn't Mozilla hire some Windows experts for the Rust team?

~~~
pavlov
Agreed. The message from Mozilla is that "Windows is really important to us,
please come do it". Meanwhile they're paying for the development of yet
another Unix-style package manager.

Clearly there is a list of priorities that guide these investments, and
Windows support doesn't rank very high on that list.

~~~
kibwen
> Meanwhile they're paying for the development of yet another Unix-style
> package manager.

A package manager that will be platform-independent, and thus benefit Windows
devs as well. At Rust's current state it's hardly surprising that they've
chosen to allocate resources to efforts that will benefit all platforms
equally. Meanwhile, the community for every platform but Windows is chomping
at the bit to submit improvements. No doubt they're still holding out for the
Windows community to do the same, without having to resort to hiring a dev
specifically for Windows. But it will happen, if it has to.

In the meantime, if you know any experienced Windows devs looking for jobs
working on Rust, please point them towards
[http://careers.mozilla.org/en-US/position/o3VZWfwD](http://careers.mozilla.org/en-US/position/o3VZWfwD)

------
teacup50
A suggestion: don't build an OS package manager. Build a project dependency
manager.

~~~
steveklabnik
That's exactly what Bundler is, and what cargo will be too.

------
chrismorgan
Initial announcement (three quarters of an hour earlier) from Brian Anderson,
project leader:
[https://mail.mozilla.org/pipermail/rust-dev/2014-March/009087.html](https://mail.mozilla.org/pipermail/rust-dev/2014-March/009087.html)

------
dubcanada
I'm all for a Rust package system, but Rust isn't even stable (or sort of
stable) yet. What is the point of working on a package manager when they don't
even know what vectors or the extern crate/mod system will look like in 3
months?

~~~
wycats
A package manager will be a forcing function for stability. The Rust team
wants to ship Rust 1.0 sometime late this year (I think? Correct me if I'm
wrong).

Building a package manager is not a one-month project, and getting it done in
parallel with the stabilization of the core means that there will be a full-
stack system that people can use and help iterate on as things stabilize, and
that will be ready to go once Rust itself is ready for mass consumption.

~~~
steveklabnik
Yes, the intention is 1.0 this year.

------
gkya
Package managers make everything way more complex than it is supposed to be. I
see package managers as a component of an operating system. The proper way of
packaging software source is having it contained in a directory tree with one
_public_ build script (a Makefile, a shell script...) at its root. One or many
build artefacts are generated, which are then at the user's disposal.

With language specific build systems, this process gets more complicated. They
prevent the user from arranging their source tree the way they want, from
customising the build process, and from making the whole thing as convenient
as running _make_ or e.g. _./build.sh_. Provided a conventional compiler
command (e.g. the cc interface), it is easy to create a Makefile that exploits
it.

I have had a lot of confusion with the Go compiler when I tried to build and
use a checked-in, external package. Python's pip is quite complicated. Cabal
can easily be replaced with a bunch of Makefiles. Binary packages can be
supplied as [tar/zip] archives, and users would eventually package them for
their OS. Also, most of these package managers are exploited for installing
applications, which is problematic.

I have not used Rust, but I will try it out. The package manager, though, is
not just a bad idea, but also an inconvenience.

------
davidgerard
Every application expands until it contains a sketchy rewrite of apt-get.

I BEG YOU as a sysadmin: make sure this stuff is compatible with Debian
packaging. Please. Please.

~~~
byroot
I don't know much about Rust, but AFAIK it's mostly static linking, like Go.
So two Rust applications won't have to share anything, and development
packages do not have to be converted into deb packages, unlike ruby or python
packages.

~~~
mcguire
" _I don 't know much about Rust, but AFAIK it's mostly static linking like
Go._"

Afraid not, at least by default.

~~~
azth
It is:
[https://news.ycombinator.com/item?id=7419980](https://news.ycombinator.com/item?id=7419980)

~~~
mcguire
Weird.

I have one program here using libstd-3e5aeb83-0.9.so,
libgreen-83b1c0e5-0.9.so, librustuv-2ba3695a-0.9.so,
libcombinations-6b2260d9-1.0.so, and libbisect-441e5ec3-1.0.so (the last two
are local libraries). And I have another that _doesn't_, according to ldd.
Both compiled with the same flags.

------
vlucas
So does this mean we're going to have to type "cargo exec" before each and
every useful command now?

~~~
carllerche
a) No to cargo exec. b) If you have a better solution for bundler, please open
an issue and suggest it. c) There are a few simple solutions to avoid `bundle
exec`, which is mostly there to make things easier when getting started. I'm
sure you spent a few moments getting to know your tools, so you must already
be aware of these.

------
gleenn
Good thing Yehuda has his phone number there so people can make sure to
contact him directly with their input.

I am very excited though honestly, I think he did amazing work with Bundler
and is quite smart so I'm sure we'll get something quite useable.

~~~
wycats
I have included my phone number on virtually every piece of public email (to
mailing lists) I have ever written. I have found that people do not abuse it.

~~~
gleenn
That's pretty amazing about the phone number, honestly. I meant it only in
jest. Having sat in a room with you at Pivotal and heard you talk about your
work on Bundler was very cool, and I appreciate your work. I know you take
feedback seriously, so I guess it's not that surprising.

------
moron4hire
For Rust, this is good. I'm sure this is very good for Rust.

Take the rest of my message with a grain of salt. It's 6:30am and I'm working
through my first cup of coffee still. I'm not intentionally trying to be a
grumpy old man.

For the rest of us, I'm concerned that yet-another-package-manager (YAPM? Yap-
meager?) will just continue to fracture the library ecosystem.

Why can't something like APT handle it? NPM doesn't work the same as PIP,
doesn't work the same as Nuget, doesn't work the same as Gem, other than the
most basic install functionality. Packaging libraries for distribution is
different for each, and if you have a problem, tearing into the system to
figure out where the failure occurred is different for each.

Maybe it's because, through hard-fought experience, I've finally learned how
to manage .NET dependencies without too much headache. Don't ever even think
of trying to use the GAC. Don't let your developers install the dependencies
on their own. Just make a directory full of DLLs in your project root and use
relative paths to load them. It's the only way I've been able to get
developers up and running with a project in Visual Studio as soon as they
clone the repository. Even Nuget gave me issues (though granted, I gave up on
it so fast I don't remember what they were, other than telling the jr. dev to
just dump the DLL in the libs directory already and get to work on the issue
list).

Attempting to isolate your library ecosystem from the file system feels like
it encourages "reinventing the wheel" at the language level for libraries that
will mostly be the same across platforms. Do we really need to figure out how
to do database connections in yet another language? Why are there 15 different
syntaxes for positional arguments in strings?

I'm just getting a little... weary... of starting to learn a new programming
language and spending the next few hours just figuring out that they've
renamed printf to writeln and %s to {0} for no good reason, or that there are
no database connectors yet, or that the ones that exist only cover a strange
subset of databases. It almost feels like language devs have gone out of their
way to be superficially different from everyone else, without being
substantially different.

Given that most languages have support for some kind of foreign function
interface, especially with the C ABI, it seems like we have the tools
available to us to start building cross-platform libraries and distribute them
regardless of consuming language.

I've been wanting to dabble in Rust for a little while, but damn, yet another
package manager to learn, yet another notion of what a library ecosystem
should look like, just doesn't excite me right now. It's time I will have to
spend to get in the door, and I am not even sure right now if I will want to
stick around.

But I suppose with an initiative like Rust, isolation is probably the correct
ideology, considering it's about security/performance before productivity.

Anyway, complaining done. No hate for Rust-team's work. Just kind of yearning
for a probably unobtainable utopian future :/

~~~
kibwen
In the end, Rust will always just produce the same types of binary artifacts
that C++ does. Currently people use Make and friends to build Rust, and that
will always remain possible. Cargo is just an attempt to simplify the
versioning, updating, and dependency-resolution stories in a platform-
independent way.

------
dllthomas
What edge does Cargo have on Nix?

