What does this tool provide over an in-project .venv/ directory created by venv and populated by pip from an etc/pip/requirement.txt file containing all the packages by version?
There seems to be a lot of activity in this area for a seemingly solved problem. Having solved it, people appear to want to expand the problem into something bigger (multiple interpreter versions, for example).
I believe the most common case to be:

1. Choose a Python version and install it on the operating system.
2. Create a .venv/ directory.
3. Install packages from the package system (pip).
4. Freeze versions to a local file.
5. Use the installed .venv/ and do work.
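In shell terms that whole workflow is roughly the following (the package names and script name here are illustrative, not any tool's convention):

    # 1-2: pick an interpreter and create the in-project venv
    python3 -m venv .venv
    # 3: install the top-level packages you actually want
    .venv/bin/pip install requests flask
    # 4: freeze the exact resolved versions to a local file
    .venv/bin/pip freeze > requirements.txt
    # 5: use the installed .venv/ and do work
    .venv/bin/python my_script.py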
I've got some scripts I use for the above [1] -- must give it some TLC, and I don't understand why people furiously want more than this.
It is a solved problem for other languages. Pip has soooo many issues.
The biggest IMO is pip doesn't provide a way to pin indirect dependencies, so there is no good way to ensure that other people working on a project are also using the same dependency versions. You can do it with extra tooling of course - and that's what poetry and pipenv do.
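To make the gap concrete: a hand-written requirements file typically pins only the direct dependencies, so the transitive ones float between installs unless you capture a full freeze yourself (package names and versions below are illustrative):

    # requirements.txt written by hand: direct dependencies only
    requests==2.31.0

    # installing this also pulls in unpinned transitive deps
    # (e.g. urllib3, certifi, idna) at whatever versions resolve today;
    # pinning them means recording the whole resolved set:
    pip freeze > requirements-lock.txt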
Even worse, there's no simple way for pip to print out the full dependency tree (including transitive dependencies). This makes it a nightmare when pip complains about some transitive dependency version conflict and you have no idea what's causing it to be pulled in.
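For what it's worth, the third-party pipdeptree tool (extra tooling, not pip itself) can reconstruct that tree after the fact:

    pip install pipdeptree
    pipdeptree                                # tree of everything installed in the venv
    pipdeptree --reverse --packages urllib3   # who pulls urllib3 in? (package name illustrative)
    pip show urllib3                          # Requires/Required-by show the immediate edges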
What do you mean? If you run pip freeze inside a venv it will print every user-installed module along with their dependencies. When you install that output into a new venv as a requirements file, you end up with the exact same set of modules.
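Concretely, the reproduce side of that (fresh checkout or new machine) is just:

    python3 -m venv .venv
    .venv/bin/pip install -r requirements.txt   # same pinned set as the original venv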
Agree. Works fine for me exactly as you describe. I don't understand why people furiously want more than this for the common case. (Choose python version, create .venv/ directory, install packages, use installed .venv/ and do work.)
Sometimes it works, sometimes it doesn't. This is indeed a problem, but one for which there isn't an elegant solution.
When multiple versions of the same library are allowed to coexist in different dependency trees there is less incentive to solve these conflicts and versions proliferate (see npm).
I'd much rather solve the occasional conflict (usually by settling on some older version of both parent dependencies and upgrading later when possible, or by trying to fix it and sending a patch upstream).
Issues start occurring when you, for example, delete a dependency: with pip freeze alone you cannot ensure that you have also deleted its transitive dependencies. The most common solution to this (the one poetry and pipenv use) is a lockfile that tracks transitive dependencies and their versions; without one (short of manually curating your dependencies) you can't guarantee a reproducible environment.
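A concrete illustration of the deletion problem (package names illustrative): uninstalling a package removes only that package, not what it dragged in, and the next freeze silently keeps the orphans.

    pip install requests        # also pulls in urllib3, certifi, idna, ...
    pip uninstall -y requests   # removes only requests itself
    pip freeze                  # urllib3, certifi, idna are still listed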
OK, but it does manage the versions of transitive dependencies, and there's nothing in that process stopping deterministic builds.
Adding/removing top level dependencies over time does require the use of two files (the top level requirements and the frozen/locked requirements which lists everything). Or you can list the top level requirements in setup.py and let requirements.txt be the lockfile. It would be nice if pip managed this lockfile automatically, but I'm not really interested in adding any of these newer tools to my toolchain just to manage a lockfile.
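A low-tech version of that two-file setup, with file names chosen here purely for illustration:

    # requirements-top.txt: hand-edited, top-level deps only, loosely pinned
    #     flask>=2.0
    #     requests
    # requirements.txt: the lockfile, regenerated from a clean venv
    python3 -m venv .venv
    .venv/bin/pip install -r requirements-top.txt
    .venv/bin/pip freeze > requirements.txt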
There are many packaging and distribution frustrations in Python; I don't think pip's management of dependency lists is one of them.
I just so wish that Python tooling were better; pip is in large part at fault here. If there were no conda I probably wouldn't even touch Python, as things look very bleak, especially if you need to run anything across different operating systems.
The consensus among non-Python developers is that that is not good enough.
The problems are:
requirements.txt is way too flexible (see the specifier examples after this list), so users want a lock file to freeze stuff in place.
You can use pip freeze to generate a sort-of lockfile, but it's more manual than tools like Cargo/npm/bundler, which do it automatically, and more ad hoc as to what you call the file. Combined with needing to manage venvs, sourcing, etc., people want a standard script that just figures out its context from the now-standard "explicit requirements"/lockfile setup.
It doesn't handle versioning Python itself, which is expected of it for some reason, even though nobody cares that npm/Cargo/bundler don't version their languages.
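On the "too flexible" point: all of the following are legal requirement specifiers, and only the last one nails down an exact build (version numbers illustrative):

    requests                 # any version at all
    requests>=2.0            # anything from 2.0 onwards
    requests~=2.31           # compatible-release range (>=2.31, <3.0)
    requests==2.31.0         # exact pin, what a lockfile would record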
I felt the same way about only needing requirements.txt until I managed a project that needed to be compatible across many versions of Python, and some of its dependencies were renamed from one version to another. I highly recommend you take a look at the hypermodern python guide https://cjolowicz.github.io/posts/hypermodern-python-01-setu...
Pipenv and poetry help, but, even with them, it's still not good enough. setup.py is a real challenge. Packages being able to execute arbitrary code at install time can make it very difficult to get a truly reproducible python build. Details of how pip (which pipenv and poetry still use) handles packages means that doing anything that even hints of a monorepo is a delicate subject on the best of days.
And the solution to the multiple-interpreter-versions problem should be orthogonal to venv, like pyenv [1], instead of overlapping with venv, like pipenv.
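For example (the version number is illustrative), pyenv only chooses which interpreter "python" resolves to, and venv then does its usual job on top of that:

    pyenv install 3.11.9      # put the interpreter on the machine
    pyenv local 3.11.9        # pin it for this project directory (writes .python-version)
    python -m venv .venv      # plain venv, built from the pyenv-selected interpreter
    .venv/bin/pip install -r requirements.txt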
[1] https://github.com/gjvc/python-template-project