I wonder if it'd be too crazy to add git-awareness to the Nix utils. Like telling nix to install version X of some package, so that in the nixpkgs repo it would search the history of the corresponding file for the commit where it was last at that version, check out that commit, build the package and its dependencies, and then return to the branch it was on.
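A minimal sketch of that lookup, assuming the package's version string lives in a single file (the path, version string, and helper name below are all illustrative):

```shell
# Walk the file's history newest-first and stop at the first commit whose
# version of the file still contains the requested version string.
find_version_commit() {   # $1: file path, $2: version string to look for
  git rev-list HEAD -- "$1" | while read -r r; do
    if git show "$r:$1" 2>/dev/null | grep -qF "$2"; then
      echo "$r"
      break
    fi
  done
}

# Intended use inside a nixpkgs checkout (not run here):
# rev=$(find_version_commit pkgs/applications/version-management/git/default.nix 'version = "2.10.0"')
# git checkout "$rev" && nix-build -A git && git checkout -
```

This ignores real-world wrinkles (versions split across files, files being renamed), so it's a sketch of the idea rather than something nixpkgs-ready.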
In theory, that database could also just be included in the channel so it could be queried locally, but I don't know how large it would end up being.
Also, maintaining package-building scripts in a database outside of git would require one of the following: one script per package that can build every version of that package ever, which sounds inconceivably hard; separate scripts per version, which sounds redundant; or newer scripts that import code from previous versions' scripts and override parts, which sounds convoluted. If you use git, the script only has to worry about the current version. Not forgetting how to build previous versions is handled automatically by git.
Re-reading your comment, though, I think you're talking about pre-built packages. If that's the case, then I think I agree. I'm talking about the repo containing the scripts for building such packages.
> If you maintain such a "database", you'd have to explicitly state version info for the dependencies of every package.
I don't know what you mean. If I say "I want git v2.10.0" I don't care about dependencies; whatever nixpkgs tarball I download that has that will have the dependencies too. There will be a whole range of nixpkgs tarballs that contain the requested git version of course, but with such a database I could also say "find me a channel tarball that contains both git v2.10.0 and some-dependency v7.23" if I want that particular combination (assuming the two coexisted in the same nixpkgs at some point).
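As a sketch of what such a query could look like, assuming the database were kept as a flat file of tarball/pname/version triples (the file name, format, and sample data are all made up):

```shell
# Hypothetical flat-file index: <tarball>\t<pname>\t<version>
printf '%s\t%s\t%s\n' \
  nixpkgs-16.09.1234 git 2.10.0 \
  nixpkgs-16.09.1234 some-dependency 7.23 \
  nixpkgs-17.03.5678 git 2.12.0 \
  nixpkgs-17.03.5678 some-dependency 7.23 > channels.tsv

# Print every channel tarball that contains a given pname at a given version.
tarballs_with() { awk -F'\t' -v p="$1" -v v="$2" '$2 == p && $3 == v { print $1 }' channels.tsv; }

# Channels containing both git 2.10.0 and some-dependency 7.23:
tarballs_with git 2.10.0 | sort > a.txt
tarballs_with some-dependency 7.23 | sort > b.txt
comm -12 a.txt b.txt    # -> nixpkgs-16.09.1234
```

The intersection via `comm -12` is exactly the "find a tarball where these versions coexisted" query; a real index would want something sturdier than grep-able TSV, but the query shape is the same.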
> For a particular version of a package, the versions of the dependencies it's compatible with are those that are present in the same commits.
This is true of downloading a nixpkgs channel tarball too.
> Also, to maintain package building scripts in a database outside of git would require either making 1 script per package that builds every version of that package ever.
I don't know what you mean by this either.
> Re-reading your comment, though, I think you're talking about pre-built packages.
No I'm not.
What I'm talking about is just: any time Hydra builds a channel, it can run a script that collects the pname/version combination for every derivation in nixpkgs (or at least every searchable derivation; we probably don't need to collect this from all the invisible helper derivations). This can then be appended to the database from the previous channel build, so that we end up with this information for every single tarball.
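A rough sketch of that collection step, assuming `nix-env -qaP`-style output of `attrpath name-version` lines (real derivation names are messier, and splitting the version off the name is fiddlier than the one regex below admits; the channel id format is made up):

```shell
# Turn "attrpath name-version" lines on stdin into "channel\tpname\tversion"
# rows ready to append to the index.
index_channel() {   # $1: channel identifier
  awk '{ print $2 }' |
    sed -E 's/-([0-9][^-]*)$/\t\1/' |     # split the trailing version off the name
    awk -v c="$1" '{ print c "\t" $0 }'
}

# Intended use when Hydra finishes a channel build (not run here):
# nix-env -qaP -f ./nixpkgs | index_channel nixpkgs-16.09.1234 >> channels.tsv
```

Since each channel build only appends its own rows, the database grows incrementally for free, which is the point of doing it at build time rather than mining the whole git history afterwards.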
In fact, this is pretty much exactly what you'd do for git, except you'd build it incrementally rather than running a script that processes the entire git history from scratch (which is pretty much building it incrementally for every commit).
> The default way to consume nixpkgs is through channels which today does not involve git. And a nixpkgs git checkout is over 1GB (my .git dir is currently sitting at 1.4GB on this machine). So that's not great.
One could also add tags to the repo in the form of pname-version for every package. I wonder how well git can handle that many tags...
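As a toy illustration of that scheme (the pname-version tag names are made up; nixpkgs does not currently carry such tags):

```shell
# Toy repo demonstrating hypothetical pname-version tags.
git init -q demo && cd demo
git config user.email you@example.com && git config user.name you

echo 'version = "2.10.0"' > default.nix
git add default.nix && git commit -qm 'git: 2.10.0'
git tag git-2.10.0                     # tag the snapshot that carries 2.10.0

echo 'version = "2.11.0"' > default.nix
git commit -qam 'git: 2.11.0'
git tag git-2.11.0

git tag -l 'git-*'                     # enumerate every recorded version of git
git show git-2.10.0:default.nix        # read straight from the old snapshot
```

On the "that many tags" worry: git can pack refs into a single file (`git pack-refs`), so the cost is probably more about clone/fetch chatter and tag-listing noise than raw performance, but I haven't measured it at nixpkgs scale.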
In any case, the advantage of being able to do this from the git repo is that you wouldn't depend on someone forever hosting every version of a channel. I would think one would discard old channels before they discard git history.
I would prefer doing it based on the 6-month release channels, so you get multiple versions every 6 months. You end up with some gaps between versions, but you also have more guarantees that everything actually works together. Basically "nix search" with multiple channels.
I actually had to do something similar with GHC versions for a project of mine. It turns out you can run Nixpkgs all the way back to the first release in 13.10 (setting LC_ALL=C is needed). Obviously that's not very long ago right now, but it should continue to work as time goes on and give us 10+ years.
We do something similar (though in a way more manual fashion) here.
It keeps past versions of packages available to you, but does not really help with discoverability.
What you suggest in the second paragraph sounds great and definitely doable!
You can, on Windows. The secret is the system keeping backwards compatibility in mind for its core APIs, something that most desktop APIs on Unix (outside of X11) do not do.
(Now the asterisk: in terms of the functionality mentioned above, and with an average desktop distribution in mind, you'd probably find more of it in Linux. But applications can't rely on most of that functionality being available on all distros as a "standard", and even for the stuff they could expect, more often than not they have to rely on specific version ranges - e.g. an application cannot rely on "Gtk", only on "Gtk <version>.xxx", as Gtk itself isn't backwards compatible across major versions.)