Maybe I'm confused about what this offers, but I've been running private pypi repositories for a decade now, and it has never required more than an HTTP server with directory listing enabled.
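Concretely, pointing pip at a plain directory listing can be as simple as this (wheelhouse/, the port, and somepackage are just example names):

    # serve the directory of wheels/sdists with a bare directory listing
    python -m http.server 8080 --directory wheelhouse/

    # install from it, never touching the real PyPI
    pip install --no-index --find-links http://localhost:8080/ somepackage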
As for partially mirroring pypi with only what you are using, is that really a good idea anyway? It will break whenever you add or change a dependency.
The problem isn't really on the serving side, it's on the mirroring side. Trying to mirror PyPI - at its current 13.4 TB size[1] - and bringing all those terabytes into a restricted network with security policies and no internet access is impossible. A partial mirror is the only way to go for such a use case, and since Morgan automatically resolves and mirrors dependencies, adding a new dependency shouldn't break anything.
Can't you resolve the dependencies by running pip download while you have internet access, and then serve that directory with a local HTTP server as the parent suggested? pip download already resolves all the dependencies for you, the same way pip install would.
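Something like this, roughly (the directory name is just an example):

    # online machine: fetch the packages plus everything they depend on
    pip download -d wheelhouse/ -r requirements.txt

    # offline machine, after copying wheelhouse/ over
    pip install --no-index --find-links wheelhouse/ -r requirements.txt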
No, as I mention both in this post and in the README. pip will download binary distributions (wheels) compiled for the system it is running on. If my mirror is meant to serve a different version of Python installed on a different OS with a different libc (or other such differences), it won't work. I could try to match the target environment on the mirroring side, say with Docker, but that is either cumbersome or outright impossible if you have legacy environments from years ago.
Makes sense, and you're right: we did encounter issues when changing platforms back when we were using a self-rolled, janky version of this! Thanks
You can download source packages instead of wheels, but then you need to make sure you have all the requisite compilers and libraries. This isn't an issue for Python-only dependencies, but it can be difficult for dependencies with lots of native code like numpy/pandas, where you need a C toolchain and a Fortran toolchain installed (and possibly other libs).
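Forcing sdists looks something like this (assuming a requirements.txt):

    # fetch source distributions only, no wheels
    pip download --no-binary :all: -d sdists/ -r requirements.txt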
If you're using something like Docker/containers, you can download the dependencies inside the container and be reasonably sure you get the right wheels. This becomes trickier when you have different setups like developers on Windows and production on Linux.
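A minimal sketch of that, assuming production runs on an image along the lines of python:3.11-slim (the tag is just an example):

    # run pip download inside a container matching the target platform,
    # so the wheels pulled are the ones production will actually use
    docker run --rm -v "$PWD:/work" -w /work python:3.11-slim \
        pip download -d wheelhouse/ -r requirements.txt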
Came here to say this. I run private pypi repositories for this use case and it works fine. I've had to thumbdrive over all of our dependencies as wheels etc. A single bash script runs all the checks, downloads everything, and zips it for the offline environment; then you pip install like normal with the login creds for your offline pypi registry.
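A stripped-down sketch of that kind of script (the registry URL and paths here are placeholders, not our actual setup):

    #!/usr/bin/env bash
    set -euo pipefail

    # online side: grab everything pip install would need, then bundle it
    pip download -d dist/ -r requirements.txt
    zip -r offline-deps.zip dist/

    # offline side, after carrying the zip over:
    #   unzip offline-deps.zip
    #   twine upload --repository-url https://pypi.internal.example/ dist/*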