If you're interested in helping out, stop by #ipfs and #gx on freenode!
Please know though that things are still super early, and very rough! We do not think our package management efforts are UX-ready for end users who want something strictly better today. But they are ready for early adopters who want to think about or hack on them with us. In fact, we are now self-hosting go-ipfs in gx! https://github.com/ipfs/go-ipfs
Something to bear in mind: we are building infrastructure you can rely on long term, with decentralization, flexibility, and futureproofing from the ground up. We want our protocols and tools to survive independent of fragile organizations or fragile routing. We can't take many shortcuts (like depending on centralizing agents), because we want something rock solid that lasts for decades. This means it takes us a lot longer to build high-performance tooling and high-quality UX, because there's a lot more to do. More work, but it is all achievable. The rough edges will disappear as we improve the tooling. Even today, gx is amazingly simple and powerful, and already does a lot for dependability.
It's worth mentioning other package management tooling we're working on too:
* NPM + IPFS - https://github.com/diasdavid/registry-mirror and other repos. This is an effort to improve how npm works with IPFS.
* pacman + IPFS - https://github.com/ipfs/notes/issues/84 - this is a super rough way to show how to add IPFS support to a package manager through FUSE. It's not the best way, but it works!
* also, we <3 nix, and there's cool stuff coming there in the future :)
We have LOTS in development and in store for the package manager communities. It's a very exciting time. Please join us at https://github.com/ipfs/ipfs and #ipfs and #gx on freenode IRC. And, if anyone wants to work on "Hypermodular Programming" with us, see https://github.com/jbenet/random-ideas/issues/27 and ping me :)
Question about package managers built on ipfs, and ipfs in general: how do you ensure the packages/files you depend on will always be available? Is the idea that there would be at least one entity pledging to host every package for eternity, or if you depend on obscure packages would it be wise to run your own mirror of the packages you use? Or something else?
- storing is super cheap. seeding is not very expensive. i think the entire npm registry is <1TB?
- (soon can ship entire registries in USB keys!)
- many individuals and orgs should keep/replicate what they depend on
- tools like ipfs-cluster will help organize nodes like a RAID array
- can pay services to back things up (on whatever service infrastructure is in vogue)
- can pay _the network_ to back things up -- see http://filecoin.io
`ipfs pin add [key]`
(1) Validates that the specific revision of a package you're dependent on hasn't changed (i.e. checksum of the package stays the same), and
(2) On upgrade, validates that ownership of the package hasn't changed (i.e. package is signed by the same person), or that any change in ownership is authorized/authenticated and traceable to the author of the previous revision. Some sort of strong notion of identity would be essential here, too-- maybe something along the lines of Keybase.
This looks like it does #1 (by virtue of being based on a content-addressable filesystem), but it doesn't look like it tries to handle #2... objects and repos just "are"; they don't seem to carry ownership information with them. If I want to validate that I am, in fact, getting Express.js from the Node.js Foundation and not some complete stranger, I'm forced to trust whatever website I pulled the package or repository hash from.
(This identity/authentication issue isn't new-- for example, Apt solves it with GPG signing of repositories, and assumes that the repository takes responsibility for the safety of the content it publishes. The newer, less strictly managed tools don't seem to be concerned about it, though-- npm, bower, homebrew, etc.)
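To make point #1 concrete: in a content-addressed system the address *is* the checksum, so revision integrity comes for free. A minimal sketch of the idea in Python (plain hex sha256 as a stand-in for IPFS's actual multihash format; the package bytes are made up):

```python
import hashlib

def content_address(data: bytes) -> str:
    # Toy stand-in for an IPFS multihash: plain hex sha256.
    return hashlib.sha256(data).hexdigest()

def fetch_and_verify(address: str, fetched: bytes) -> bytes:
    # Point (1): the address is derived from the content itself,
    # so a swapped or tampered package can never match the hash
    # your dependency file pins.
    if content_address(fetched) != address:
        raise ValueError("content does not match its address")
    return fetched

pkg = b"express-4.0.0.tgz contents"
addr = content_address(pkg)
assert fetch_and_verify(addr, pkg) == pkg
```

Point #2 (ownership continuity across upgrades) is exactly what this doesn't give you: nothing above binds the address to a publisher identity, which is where signing would have to come in.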
This is something we should have all moved to years ago. Our drives are big enough that most software can move away from shared libraries. Though I am interested in how they handle something like KDE and KDE applications.
IPFS providing the ability for permanent, signed content makes this great.
I eventually just uploaded a different format then quickly pushed a new version, but I was kinda disturbed that I could irreversibly break a dependency like that.
This is a great idea, but there's no mention of versioning. As a long-time advocate of what we now refer to as Semantic Versioning, I can't help but see this as a disaster waiting to happen.
> Let's say we adopt this. Cut to some small number of years in the future, and everyone is complaining about how dealing with 46-character-long opaque strings is terrible.
Agreed. You can point gx at "repos" (/ipfs addresses) that map (package, version) => /ipfs/... addresses. This lets you use simple names, and anyone can publish a repo.
> This is a great idea, but there's no mention of versioning. As a long-time advocate of what we now refer to as Semantic Versioning, I can't help but see this as a disaster waiting to happen.
Sorry, this is a documentation failure rather than a feature failure: gx has semver built in and adheres to it today.
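For readers unfamiliar with the rule gx adheres to, here is a minimal sketch of semver's "compatible release" (caret) matching: same major version, and at least the specified version. This is just the rule, not gx's actual resolver code:

```python
def parse(v: str) -> tuple:
    # "1.4.2" -> (1, 4, 2), so tuples compare numerically.
    return tuple(int(x) for x in v.split("."))

def satisfies_caret(version: str, spec: str) -> bool:
    # ^X.Y.Z: same major version, and version >= X.Y.Z.
    v, s = parse(version), parse(spec)
    return v[0] == s[0] and v >= s

assert satisfies_caret("1.4.2", "1.2.0")       # compatible upgrade
assert not satisfies_caret("2.0.0", "1.2.0")   # major bump: breaking
```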
/set_of_files -> /ipfs/[key_1]
Now, these files change, because they were your git repo of all your projects. So the key that represents your files changes continuously. That sucks. There's a solution: IPNS.
IPNS can point at a continuously changing set of directories, and always provide the pointer as an IPNS link. It's the mutable part of IPFS...
Now, as for the hashes... The Firefox plugin already supports looking up the DNS TXT record of the IPNS key associated with your site. So, if you have a pretty DNS name, you still keep your pretty DNS name.
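To make the moving-pointer idea concrete, a rough session sketch (the hashes and peer ID are placeholders, and publishing needs a running IPFS daemon, so treat this as illustration rather than a copy-paste recipe):

```
# republish the current root of your files under your node's key;
# the IPNS name stays stable while the /ipfs/ hash it points to moves
$ ipfs name publish /ipfs/<new_root_hash>
Published to <your_peer_id>: /ipfs/<new_root_hash>

# a DNS TXT record then maps your pretty name to that stable IPNS key
example.com.  TXT  "dnslink=/ipns/<your_peer_id>"
```

So readers keep following example.com, the TXT record keeps pointing at the same IPNS key, and only the underlying immutable hash changes on each publish.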
Now, versioning assumes there was something prior, something now, and something that will be. That's not always the case. In that situation, a changelog could be kept as a kind of blockchain recording all deltas to the keys that changed. But that can be implemented when it's needed, and not as core functionality.
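The "changelog as a blockchain of deltas" idea is just a hash chain: each entry carries the hash of the previous one, so history becomes tamper-evident. A toy sketch (the delta format and the /ipfs/ paths are invented for illustration):

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    # Canonical JSON so the same entry always hashes the same way.
    raw = json.dumps(entry, sort_keys=True).encode()
    return hashlib.sha256(raw).hexdigest()

def append_delta(log: list, delta: dict) -> list:
    # Each entry links back to the previous one by hash, so editing
    # any past entry breaks every link after it.
    prev = entry_hash(log[-1]) if log else None
    log.append({"prev": prev, "delta": delta})
    return log

log = []
append_delta(log, {"set": {"readme": "/ipfs/QmAAA"}})
append_delta(log, {"set": {"readme": "/ipfs/QmBBB"}})
assert log[1]["prev"] == entry_hash(log[0])
```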
Though admittedly it needs work. If you're the video-watching type, the Stanford talk on that page is very good (I recommend setting the speed to 1.5 or so).
Beyond that, come check out our git repos:
ipfs whitepaper and general protocol discussion: https://github.com/ipfs/ipfs
go-ipfs codebase: https://github.com/ipfs/go-ipfs
And the FAQ: https://github.com/ipfs/faq
Package managers have a long history of being poorly implemented. Look at CPAN, RPM, Alien, Pacman, Nix, etc. as examples of what was a good idea but still has various flaws. We already have everything we need, it's just not all built into the same tool. Adding new technology to a new tool doesn't make a better tool, it just makes yet another incomplete tool.
If I were to grant your comment the same amount of charity as you have granted this project, I would summarize it as, "This project has problems like all the rest. If you don't make it better, it's lame."
Non-corporate tools, on the other hand, are typically made by people who need to do something and want to do it their own way. It's not that they couldn't make do with an existing tool - the functionality probably existed in another tool or tools - they just want to do it differently, with some kind of interesting twist. The end result? Reinventing the wheel with maybe one new feature, and missing all the ones they didn't need/want. Do this a hundred times and you have a hundred tools that are useful to some people, but not most people.
I'm not saying it's lame, I'm saying it's a waste of engineering effort.