
This is the first time I've seen a nicely documented requirements.txt and I like it! https://github.com/lyft/confidant/blob/master/requirements.t...



No more "pip freeze | sort -f > requirements.txt", explicit dependencies are way more maintainable!


It looks like most of that could be generated. Now I want a tool that looks up that info and outputs an annotated requirements file.
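A minimal sketch of such a tool, assuming exact name==version pins and the PyPI JSON API (specifically the summary field it returns); the script name is hypothetical:

  # annotate_requirements.py: rough sketch, assumptions as above.
  import json
  import urllib.request

  def annotate(path="requirements.txt"):
      for line in open(path):
          line = line.strip()
          if not line or line.startswith("#"):
              continue
          name = line.split("==")[0]
          url = "https://pypi.org/pypi/%s/json" % name
          with urllib.request.urlopen(url) as resp:
              info = json.load(resp)["info"]
          # Print the package summary as a comment above the pin.
          print("# %s" % (info.get("summary") or "no summary on PyPI"))
          print(line)

  if __name__ == "__main__":
      annotate()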


Shameless plug: I wrote this small being-bored-at-a-conference tool to show you the licenses of packages installed in your Python environment: https://pypi.python.org/pypi/license-info/0.8.7

It's pretty rough, but it gets the job done for separating out the free and open-source requirements.
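The same idea can be sketched in a few lines with importlib.metadata (Python 3.8+); this is just an illustration, not how license-info itself works:

  # list_licenses.py: print name, version, and declared license
  # for every distribution in the current environment.
  from importlib import metadata

  for dist in sorted(metadata.distributions(),
                     key=lambda d: (d.metadata["Name"] or "").lower()):
      license_ = dist.metadata.get("License") or "UNKNOWN"
      print("%s %s: %s" % (dist.metadata["Name"], dist.version, license_))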


And then writes the code, including tests to show that all the requirements are met.


Great observation. One of the best docs on dependencies I've seen.


+ for readability, - for precise versions. Good luck receiving critical updates this way...


We watch CVEs and update accordingly. Assuming you're on the latest stable release of the Docker image or of Confidant itself (we're still working on making releases through GitHub), you should be using a version with secure dependencies.

Using >= doesn't ensure security, but it does guarantee less stability, and part of security is availability.


Pinning precise versions is the best practice for requirements.txt, as opposed to setup.py in a package (http://nvie.com/posts/pin-your-packages/). Since we have no guarantee that every dependency uses, e.g., semantic versioning, it's the safest way to get reproducible builds across machines.
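For illustration (hypothetical versions), the difference between an exact pin and a range:

  # Exact pin: every machine installs exactly this version.
  Django==1.8.5

  # Range: two builds a week apart may resolve to different versions.
  Django>=1.8,<1.9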

You can also now list outdated packages with pip if you want to upgrade them yourself or test compatibility of new versions. Example from a side project I have lying around:

  $ pip list --outdated
  Django (Current: 1.8.5 Latest: 1.8.6)

  $ pip install --upgrade django
  ...


From a packager's point of view: enforcing specific versions results in me having to patch and repackage the service for deployment just because one of its dependencies changed. I understand that keeping track of which project uses what kind of versioning strategy (to apply a proper range) is hard - but it's either that or tracking every single point release they make.

I fully agree that deployment should be reproducible and stick to tested versions. But requirements.txt is how you build the software, not how you deploy it. (Unless it is, but then no sympathy from me ;) )


> + for readability, - for precise versions. Good luck receiving critical updates this way...

A real lesson: authentication failed and there was no clue why the app couldn't start. A developer spent almost two days before realizing a new gem version had been released, and the new version was not compatible with our current release. To quote the developer: "fool me once, shame on you; fool me twice, shame on me."

Imagine auto-scaling (with no baked image) or installing packages during Docker container initialization, in production: the word "fuck" will fill up your mailbox/IRC/Slack/HipChat/ChatBot from your dear SRE / DevOps folks because of the lack of regression testing.

The same argument goes for automatic system updates in Ubuntu. To receive critical updates safely, you'd better have a process to review them and to warn about them. One change in a system API can screw up your entire system.

If you want the fail-hard model, which is really useful, the best way is to have two sets of requirements.txt: one for random Saturday testing (just launch a Docker instance running the tests with the latest versions of all packages installed), and one for development all the way through production.
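A minimal sketch of that split, with hypothetical file names and versions:

  # requirements.txt: exact pins, used from development through production
  Django==1.8.5
  requests==2.8.1

  # requirements-latest.txt: unpinned, used only by the scheduled canary job
  Django
  requests

The canary job just does pip install -r requirements-latest.txt in a throwaway container and runs the test suite; a failure there warns you about upcoming breakage without ever touching production.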


But any critical updates should be API compatible and promoted ASAP. Sure, they need to be verified, but if someone breaks their API with a critical update, that's a separate issue with their release model. If that happens, system packages sometimes fix such incompatibilities (you'll get a new package with a patched version instead of a completely new release).


From some brief exposure to Plone (and, through Plone, Zope), I think basing your deployment strategy on "they should" is a bad idea.

Zope originated much of Python packaging, through necessity (said tooling has since continued to evolve). Plone still (AFAIK) has massive lists of "known good sets" of dependencies, often down to the minor version, because sometimes what is a bugfix for most consumers of a library triggers latent bugs in other (sets of) libraries.

Yes, you want to fix all the bugs. But sometimes you can't - and then you might need to be explicit in upgrading from 3.2.1 to 3.2.1-bugfix9 rather than making the "leap" to 3.2.2.
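In requirements terms, that explicitness might look like this (hypothetical package name, with the bugfix expressed as a PEP 440 post-release):

  # Stay on the 3.2.1 line, picking up only the backported fix...
  somelib==3.2.1.post9
  # ...instead of leaping to 3.2.2 and whatever else changed there.
  # somelib==3.2.2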


> Sure, they need to be verified, but if someone breaks their API with a critical update, that's a separate issue with their release model.

Actually, it goes both ways, and if your production is down, it is your problem. You can't expect everyone to be nice to you, and there are no guarantees. It is a shared responsibility.


I mean, aren't specified versions crucial for compatibility? What am I missing?

On second thought, I guess you're suggesting to specify the major version with a flexible minor? But doesn't that require that no package introduces a breaking change in a minor version? Not sure you can rely on that assumption.


Yes, and yes. It's best practice to have flexible sub-versions of dependencies (or dependencies of dependencies) when you're shipping a package with setup.py (http://python-packaging-user-guide.readthedocs.org/en/latest...). However, in requirements.txt for an app that runs by itself, their approach is the best practice.
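To make the distinction concrete (hypothetical names and versions): a reusable library declares ranges in setup.py so it composes with other packages in the same environment, while the app deploying it pins exactly in requirements.txt:

  # setup.py for a reusable library: flexible ranges.
  from setuptools import setup

  setup(
      name="mylib",
      version="1.0.0",
      install_requires=["requests>=2.8,<3.0"],
  )

  # The deployed app's requirements.txt, by contrast, would pin:
  # requests==2.8.1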



