
Show HN: Update your requirements.txt file with Pur in Python - welder
https://github.com/alanhamlett/pip-update-requirements
======
tedmiston
Why would you want to update requirements.txt before testing against the new
package version first?

My dependency upgrade workflow for a Python web app is usually:

    
    
        run pip list --outdated
        upgrade one package
        run tests
            if success
                pip freeze to update this package (and its dependencies)
                commit
            else
                investigate or revert and make a note for later (if many breaking changes, etc.)
    

Perhaps this could be useful if you're using tox to test against many versions
of Python, e.g. for a library.

(Or if you only freeze top-level dependencies, even though that goes against
best practices.)
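For what it's worth, that loop can be sketched in shell. This is a dry-run version: the package names are placeholders, the real commands are only echoed so it's safe to run as-is, and it assumes a pytest test suite; swap the echoes for the actual commands to use it.

```shell
# Dry-run sketch of the one-package-at-a-time upgrade workflow above.
# 'flask requests' are placeholder names; in practice collect them with:
#   pip list --outdated | tail -n +3 | awk '{print $1}'
outdated="flask requests"
for pkg in $outdated; do
    # Echo instead of executing, so the sketch has no side effects.
    echo "pip install -U $pkg"
    echo "pytest && pip freeze > requirements.txt && git commit -am 'Upgrade $pkg'"
done
```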

~~~
welder
Some of my projects have a double-digit number of dependencies in their
requirements.txt file, so upgrading one at a time, even with pip list
--outdated, would take too long.

I prefer to upgrade everything and fix things that break. I've always been
able to easily figure out which dependency was causing the exception after
upgrading all packages, so no point in upgrading them individually for me.

Also, I don't manually update my packages. My build scripts do that for me
from my requirements.txt file, so it makes sense to update requirements.txt
first.

~~~
tedmiston
My main project has ~30, but I still prefer to go one at a time, especially
with a major version bump, for the sake of understanding exactly what breaks.
Perhaps it's just a personal preference.

~~~
welder
Yes, perhaps :) I still read the changelogs of all updated packages before
running the upgrades.

------
di
Seems similar to the features offered by pip-tools [1] and pip-review [2].

If you want to automate updating your requirements file, take a look at
requires.io [3].

[1] [https://github.com/nvie/pip-tools](https://github.com/nvie/pip-tools)

[2] [https://github.com/jgonggrijp/pip-review](https://github.com/jgonggrijp/pip-review)

[3] [http://requires.io/](http://requires.io/)

~~~
welder
pip-tools and pip-review just look at your installed dependencies and upgrade
them; they don't update your requirements.txt file.

Upgrading packages doesn't work for me because I need my build scripts to do
the upgrade. For example, when installing some packages which have C bindings
they fail unless certain flags are set correctly.

Requires.io is cool; I personally use Gemnasium [1] for the same thing.
However, neither tool updates your requirements.txt file for you; they just
notify you when you need to update it.

[1] [https://gemnasium.com/features](https://gemnasium.com/features)

~~~
di
If you use requires.io with Github, it can optionally make a PR to update your
requirements.txt for you.

~~~
welder
Requires.io costs money to open a PR against private repos, right?

------
nicois
My approach: keep my "raw" requirements.*.txt files in a ".requirements"
directory in the project root. The packages listed there are mostly bare,
without a version qualifier. I run the script at
[https://gist.github.com/nicois/b49deb1f92e9f504cd93715a78440...](https://gist.github.com/nicois/b49deb1f92e9f504cd93715a78440b9b).
This creates matching requirements.* files in the project root, with specific
Python package versions for everything. Both sets of files are checked into
the repository. Periodically, I re-run the gist, which updates the
requirements in the repo.

IMO this is the best of both worlds: in the "bare" requirements I only put
what I want, and the derived requirements file is a fully-defined list of what
I need.

This means each commit in my repo is not susceptible to third-party changes:
when my tests pass on a given commit, they will pass every time in the future
too. I can decide when I want to refresh the requirements, and it's a simple
one-liner.
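A minimal sketch of this two-tier layout (the file names are hypothetical, and the pip steps the gist automates are left as comments so the sketch stays side-effect-free):

```shell
# Two-tier layout: bare requirements in .requirements/, pinned ones in the root.
mkdir -p /tmp/req-demo/.requirements && cd /tmp/req-demo
printf 'flask\nrequests\n' > .requirements/requirements.txt   # bare, no versions
# Refreshing the pinned file (what the gist automates) boils down to:
#   pip install -U -r .requirements/requirements.txt
#   pip freeze > requirements.txt
# Both the bare and the pinned file get committed to the repo.
cat .requirements/requirements.txt
```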

~~~
kennell
There is a popular tool for this approach:

[https://github.com/nvie/pip-tools](https://github.com/nvie/pip-tools)

------
michaelmcmillan
There is a free service that does this for you. It automatically sends a pull
request whenever one of your dependencies is outdated, so you can easily check
that all your tests pass before merging.

[https://doppins.com](https://doppins.com)

~~~
welder
That's a very new service, and only for open-source projects on GitHub...
command line FTW; it works for projects hosted anywhere.

~~~
michaelmcmillan
Yes, it is a new service, but I don't see how that holds up as a general
argument against using it. Your second point is false: it supports private
repositories.

------
SilasX
Wait, wouldn't you get the same result from running -U on the contents of the
requirements file (after stripping the == out)? i.e.

    
    
        sed 's/==.*//g' requirements.txt \
        | tr '\n' ' ' \
        | (read var; pip install -U $var) \
        && pip freeze > requirements.txt

~~~
welder
What about comments, packages installed via git, recursive packages?
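To illustrate (with hypothetical file contents): pip's requirements format allows comment lines, editable/VCS installs, and recursive `-r` includes, none of which come out of the strip-the-pins pipeline as plain package names.

```shell
# A requirements file using pip features beyond plain 'name==version' lines.
cat > /tmp/requirements-demo.txt <<'EOF'
# comment line
flask==0.12
-e git+https://github.com/alanhamlett/pip-update-requirements.git#egg=pur
-r more-requirements.txt
EOF
# Stripping the '==version' pins leaves lines that aren't bare package
# names, which 'pip install -U $names' would choke on or misinterpret:
sed 's/==.*//' /tmp/requirements-demo.txt
```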

------
vanesa-
Finally! I've been needing this for a long time. Great that it doesn't force
package installs on you...

~~~
welder
Same here, I've needed this for a long time. I wasn't going to keep a
duplicate requirements-to-freeze.txt file around [1], so I created this tool.

[1] [http://www.kennethreitz.org/essays/a-better-pip-workflow](http://www.kennethreitz.org/essays/a-better-pip-workflow)

------
kennell
I love Python, but managing dependencies and dealing with pip really is a mess
compared to the Ruby gem tooling and ecosystem.

Tools like pur, pip-compile etc. should have been included a long time ago.

------
Chris2048
Maybe use with
[https://news.ycombinator.com/item?id=8004479](https://news.ycombinator.com/item?id=8004479)
?

------
peterbe
hashin does something similar but supports cryptographic hashes that pip 8
understands.
[https://pypi.python.org/pypi/hashin](https://pypi.python.org/pypi/hashin)

