The Gemfile.lock (and, I assume, the Node equivalent) is a reactive thing, in the sense that it's generated from the final configuration (much like the output of pip freeze).
I'm not really against tracking dependencies recursively, just not for the end purpose of building one grand requirements file (which people will end up using and tweaking). I'm more interested in whether there are conflicts between different 3rd-party libs (e.g. one needs package==1.0.0 and the other needs package==1.2.0) and whether the versions involved are actually backwards compatible (e.g. v1.4 vs v1.6 of Django).
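For illustration, here's a rough sketch of the kind of conflict check I mean, using the packaging library. The collected input data is made up, and the pin check is deliberately crude:

```python
from collections import defaultdict

from packaging.requirements import Requirement
from packaging.specifiers import SpecifierSet

# Made-up input: requirement strings collected recursively, keyed by
# the top-level 3rd-party lib that declared them.
collected = {
    "lib-a": ["package==1.0.0", "django>=1.4,<1.6"],
    "lib-b": ["package==1.2.0", "django>=1.6"],
}

# Merge every constraint that applies to the same package.
merged = defaultdict(SpecifierSet)
declared_by = defaultdict(list)
for parent, lines in collected.items():
    for line in lines:
        req = Requirement(line)
        merged[req.name] &= req.specifier
        declared_by[req.name].append(f"{parent}: {line}")

# Crude check: a pin ("==x.y.z") that fails the merged constraints
# means no single version can satisfy everyone. (Range-vs-range
# conflicts, like the django one above, would additionally need the
# list of versions actually available on PyPI.)
for name, spec in merged.items():
    pins = [s.version for s in spec if s.operator == "=="]
    if any(v not in spec for v in pins):
        print(f"conflict on {name!r}, declared by:")
        for entry in declared_by[name]:
            print(f"  {entry}")
```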
Now, if you want to change the version of a 3rd-party lib which, in turn, has altered dependencies, you'd need to clean up the venv anyway to be on the safe side. Currently I just end up rebuilding the venv and running pip install. At distribution time, you could build the equivalent of a Gemfile.lock via pip freeze (again). Hence, no real benefit to a pip-compile. Still, a pip diff (e.g. between two requirements.txt files, tracking dependencies recursively) would be of more value imho, particularly if the devs could agree on a pattern/best practice.
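To make that concrete, here's a rough sketch of what I picture such a diff doing. The script name and interface are made up; nothing like this ships with pip:

```python
"""pipdiff.py -- made-up sketch of a "pip diff" between two
pip-freeze-style requirements files.

Usage: python pipdiff.py old-requirements.txt new-requirements.txt
"""
import sys


def parse_freeze(path):
    """Map package name -> pinned version from a pip freeze dump."""
    pins = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "==" in line:
                name, version = line.split("==", 1)
                pins[name.lower()] = version
    return pins


old = parse_freeze(sys.argv[1])
new = parse_freeze(sys.argv[2])

for name in sorted(old.keys() | new.keys()):
    if name not in new:
        print(f"- {name}=={old[name]}")  # removed
    elif name not in old:
        print(f"+ {name}=={new[name]}")  # added
    elif old[name] != new[name]:
        print(f"~ {name}: {old[name]} -> {new[name]}")  # version changed
```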
There's also the matter of adding yet another file (requirements.in) to the setup framework (setup.py, metafiles...) and somewhat changing the meaning of requirements.txt.
Okay, that's a fair point. It's just that Ruby and Node have successfully adopted the use of two separate files for dependency tracking, so it seems like the same approach would work for Python as well.
I guess what you're suggesting is that requirements.txt should be the equivalent of a Gemfile and there should be a separate "requirements.lock" (or whatever) which tracks the output of pip freeze.
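Something like this, where the top-level file stays loose and hand-edited while the lock file is the exact pip freeze output (names and versions purely illustrative):

```
# requirements.txt -- loose, hand-maintained, Gemfile-style
django>=1.6,<1.7
requests

# requirements.lock -- exact pip freeze output, never hand-edited
Django==1.6.2
requests==2.2.1
```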
On top of that, I understand that you're suggesting that the generation of this lock file should be part of the process of setting up a virtualenv as opposed to a separate tool such as pip-compile.
If my interpretation is correct, then I totally agree. Although I still think pip-compile may be useful until such a tool exists :)
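For what it's worth, here's roughly what I imagine such a built-in step doing, as a pure sketch (the .venv and requirements.lock names are assumptions, and the bin/ path is POSIX-only):

```python
"""Sketch of a venv-setup step that writes a lock file as a side effect."""
import subprocess
import venv

ENV_DIR = ".venv"
PIP = f"{ENV_DIR}/bin/pip"  # Scripts\pip.exe on Windows

# 1. (Re)create the environment from scratch, as discussed above.
venv.create(ENV_DIR, clear=True, with_pip=True)

# 2. Install the loose, hand-maintained requirements.
subprocess.check_call([PIP, "install", "-r", "requirements.txt"])

# 3. Freeze the resolved set into the lock file, Gemfile.lock-style.
frozen = subprocess.check_output([PIP, "freeze"], text=True)
with open("requirements.lock", "w") as f:
    f.write(frozen)
```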