I've had a good experience with pip-tools (https://github.com/jazzband/pip-tools/), which takes a requirements.in with loosely-pinned dependencies and writes your requirements.txt with the exact versions, including transitive dependencies.
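Roughly, for anyone who hasn't tried it (the package names here are just examples):

```
# requirements.in — only your direct deps, loosely pinned
requests>=2.28
click

$ pip-compile requirements.in
# writes requirements.txt with every transitive dep pinned, annotated
# with where each one came from, e.g. "certifi==...  # via requests"
```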
Same here. In my team we had the immediate dependencies defined in setup.cfg; when a PR was merged, pip-compile was run to generate requirements.txt, which we stored in a central database (in our case Consul, because that was the easiest thing to get without involving ops).
pip-sync was then called to install it in the given environment. Any promotion from devint -> qa -> staging -> prod was just copying the requirements.txt from the previous environment and calling pip-sync.
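Roughly what the CI job did (the consul key and paths here are made up for illustration; reading setup.cfg directly needs a reasonably recent pip-tools):

```
# on PR merge:
pip-compile setup.cfg -o requirements.txt        # resolve from setup.cfg's deps
consul kv put deploys/myapp/requirements.txt @requirements.txt

# promoting to the next environment:
consul kv get deploys/myapp/requirements.txt > requirements.txt
pip-sync requirements.txt    # installs exactly these pins, removes anything else
```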
Take my upvote. This has helped us a ton. So nice that it resolves dependencies. The only issue we're running into is that we don't use it to manage the dependencies of our internal packages (we only use it at the application level). I've been advocating that we change this so that we simply read the generated requirements.txt/requirements-dev.txt in setup.py, as sketched below.
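For reference, the change I'm proposing looks roughly like this (untested sketch; assumes the compiled files sit next to setup.py and weren't generated with --generate-hashes):

```python
from pathlib import Path
from setuptools import find_packages, setup

def reqs(name):
    # keep only requirement lines; skip comments, blanks, and pip options like -e/--hash
    lines = Path(__file__).with_name(name).read_text().splitlines()
    return [l for l in lines if l.strip() and not l.strip().startswith(("#", "-"))]

setup(
    name="myapp",  # placeholder
    packages=find_packages(),
    install_requires=reqs("requirements.txt"),
    extras_require={"dev": reqs("requirements-dev.txt")},
)
```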
Late to the party, but `pip-tools` also has a useful flag for its `pip-compile` command: `--generate-hashes`. It generates SHA256 hashes that `pip install` checks.
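i.e.:

```
$ pip-compile --generate-hashes requirements.in
```

Each pin in the resulting requirements.txt carries `--hash=sha256:...` entries, and pip goes into hash-checking mode when installing from it, refusing anything that doesn't match.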