
Pipfile for Python - shakna
https://github.com/pypa/pipfile
======
yladiz
This is great, and a definite step up from the current requirements.txt. Why
are the requirements like function calls though? Why can't they just keep a
simple text format that's more human readable and editable, rather than
something that just looks like a bunch of code? I think it's better in Ruby
because you can omit parentheses and it looks cleaner, but my preference above
both is package.json. Cargo is also really good. So in my eyes, moving to
something like this is somewhat of a step backwards from requirements.txt, but
taken together it looks like it will be better overall.

There are arguments for and against, but in general I think a configuration
file should be human readable and editable, as well as easily understood by an
IDE without having to run an interpreter. So something like YAML or TOML, or
even a simple INI would be better than the function calls in the Pipfile.
However, the lock file isn't meant to be edited by hand or really ever looked
at, so it being in JSON or something less human readable is fine.

Also, why doesn't pip just make it, by default, look for the requirements.txt
or the Pipfile? It's silly having to type _pip install -r requirements.txt_
and I will also find it silly having to type _pip -p_. Looking for the
manifest by default is how NPM, Bundler, and many other package managers work;
why do I need to pass a flag to install requirements?
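For what it's worth, a declarative Pipfile along the lines this comment suggests might look something like the following. This is purely illustrative; the section names and packages are invented here, not taken from the actual proposal:

```toml
# Hypothetical declarative Pipfile (illustrative only, not the proposed syntax)
[source]
url = "https://pypi.python.org/simple"

[packages]
requests = "*"
records = ">0.5.0"

[dev-packages]
nose = "*"
```

An IDE or linter could parse a file like this with any TOML library, no interpreter required.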

~~~
bjt
The function calls in the Pipfile make me suspect it's going to be read and
eval-ed within Python. And that makes me shudder.

Many packages have abused the executable nature of setup.py files by importing
obscure packages or adding otherwise fragile logic. I would hate to see
Pipfiles go the same way.

Strong +1 to making it a declarative format like YAML or TOML instead of
something executable.

~~~
xapata
> eval ... makes me shudder

Why do you prefer using ``pip`` despite ``pip install`` running the package's
``setup.py`` (which is therefore an eval)?

Further, if you're using a package manager, I expect you won't be combing
through the source to check for malicious code anyway.

~~~
quodlibetor
For the non-install case, like when you're running a package index, having to
eval the dependency specification is horrible.

This is also why wheels (the new Python package format) use a static file
instead of setup.py. The Python ecosystem has been trying to get off of "just
eval setup.py" for years.

~~~
xapata
I'm not sure why avoiding arbitrary code execution during install is important
when you're going to do arbitrary code execution shortly after. What else is
the purpose of installing a package?

~~~
theptip
> For the non-install case, like when you're running a package index

You have just ignored the point that your parent comment was making. It's not
the install case that they were complaining about.

~~~
xapata
True. It's surprising to optimize for such a rare kind of user.

------
holografix
Can't we just leave it as it is? I love the fact that requirements.txt is
stupid simple. No over-engineered JSON shenanigans. Wanna group things into
prod, dev etc.? Create 2 files.

~~~
urda
Agreed, I'll take a pass on this project and stick with the accepted standards
which are exactly that.

Accepted standards.

~~~
djm_
pypa, the Python Packaging Authority, is a working group that maintains many
of the Python packaging projects (e.g. pip), and therefore Pipfile will likely
become the new accepted standard in time.

------
fphilipe
This is great news. Coming from Ruby and being used to Bundler, doing anything
in Python or JS always was a huge pain. Countless times I deleted the current
virtual environment or did an `rm -rf node_modules` to start fresh. So I'm
excited to see Yarn for JS show up and now this.

The main problem with requirements.txt, as I see it, is that you don't get
exact versions unless you pin them in your requirements.txt. So you'd have to
have a loose requirements.txt and then generate a second requirements file
after having done `pip install -r requirements.txt` to get the exact versions
that were installed.

Further, if you happen to "accidentally" `pip install some-package` in your
virtual environment, your app might now be using different packages locally
without you noticing. With Pipfile the need for virtual environments is pretty
much gone, assuming that at runtime it will automatically load the version of
a package specified in the lockfile, which is not clear to me yet from the
README.

~~~
genofon
Maybe I misunderstood your comment, but it's pretty straightforward to get the
exact versions; if you do

    
    
      pip freeze > requirements.txt
    

it will have all the packages' versions specified

~~~
fphilipe
That's what I meant by

 _generate a second requirements file after having done `pip install -r
requirements.txt` to get the exact versions that were installed_

So you'd need to have a `requirements.txt` with loose versions suitable for
upgrading your app's deps, run `pip install -r requirements.txt`, and then `pip
freeze > requirements.locked.txt`. Then everyone should be using `pip install
-r requirements.locked.txt`, as well as during your build. But that's
cumbersome and error prone, and doesn't free you from having the wrong version
of a dep in case you `pip install some-package` later on.
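Concretely, the two-file dance described above might look like this (package names and versions here are made up for illustration):

```text
# requirements.txt -- loose ranges, edited by hand
requests>=2.0,<3.0
some-package            # deliberately unpinned

# requirements.locked.txt -- written by `pip freeze`, never hand-edited
requests==2.12.1
some-package==1.4.2
```

A Pipfile/Pipfile.lock pair folds this convention into the tool itself.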

~~~
corney91
I'm not sure why you'd need two requirements.txt. You'd normally create a
virtualenv, pip install what you need, then lock the versions with "pip freeze
> requirements.txt". You don't need an initial requirements.txt to install new
packages.

~~~
jurip
One common reason is avoiding hard-pinning the versions of your transitive
dependencies. In my current Django project I have 19 declared dependencies and
26 transitive dependencies. We have one file for the declared ones and another
that we generate with pip freeze. This way the transitive dependencies can
evolve on their own without us having to keep track of them.

Pipfile looks like a definite improvement over the pip install, pip freeze
workflow.
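As a toy sketch of the bookkeeping this split saves you (all package names and versions below are invented, not taken from the project described):

```python
# Hand-maintained declared deps (what would live in the first file)
declared = {"django", "celery"}

# Everything `pip freeze` reported after an install (the generated file)
frozen = {
    "django": "1.10.4",
    "celery": "4.0.0",
    "pytz": "2016.10",      # transitive: pulled in automatically
    "billiard": "3.5.0.2",  # transitive: pulled in automatically
}

# Transitive deps are exactly the frozen entries that were never declared;
# they can move on their own the next time the frozen file is regenerated.
transitive = sorted(name for name in frozen if name not in declared)
print(transitive)  # ['billiard', 'pytz']
```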

~~~
Senji
>This way the transitive dependencies can evolve on their own without us
having to keep track of them.

This sounds like exactly the opposite of what I'd want. I don't want someone
to slip a Guy Fieri into my dependency chain without me noticing.

~~~
dasil003
You're misunderstanding. The whole point is that nothing slips in, but at the
same time, you don't have to force a specific version of something in order to
achieve that. The killer feature of Bundler for long-term maintenance is the
ability to upgrade a _single_ requirement in a minimal fashion.

So you start with a Gemfile that is your minimum requirements with no versions
specified. The first time you `bundle install`, it generates a Gemfile.lock,
which is then sticky. Over time your requirements are completely frozen until
you decide to update, which you can do piecemeal via `bundle update gem1 gem2
etc...`. If you have a reason to avoid a newer library, then put a soft
restriction in the Gemfile, preferably with a comment as to why that
restriction is there, and you have a very powerful long-term system for
managing versions over time.

Just freezing and forgetting is a recipe for disaster when you have to update
months or years later and the transitive dependency updates are overwhelming
and conflicting. Similarly, exact pinned versions make it fiddly to upgrade
and hard to tell if there were reasons behind specific versions.
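For readers who haven't seen it, the Gemfile side of that workflow looks roughly like this (the gem names, version, and comment are illustrative):

```ruby
# Gemfile: declare what you need, mostly without versions
gem 'rails'
gem 'sidekiq'
# Soft restriction, with the reason recorded next to it:
gem 'nokogiri', '< 1.7'   # hypothetical incompatibility, documented inline
```

`bundle install` resolves these and writes exact versions to Gemfile.lock; a later `bundle update nokogiri` touches only that entry and its dependencies.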

------
examancer
This is almost identical to how bundler in Ruby works, right down to the
language native dependency DSL, named groups, file name conventions (Pipfile =
Gemfile, Pipfile.lock = Gemfile.lock), and deterministic builds.

It's identical because Bundler mostly got it right, and dependency management
in Ruby, while still not great/perfect, is better than just about everywhere
else.

Kudos to Python for moving forward.

~~~
INTPenis
Wow, I'm amazed at how skewed my world view was. The first two comments I read
are praising Ruby dependency management and scorning Python's requirements.

I'm sitting here thinking, "what the hell is wrong?". I've honestly only
experienced trouble with Ruby, while being completely satisfied with how
Python virtualenv works.

I guess if anything this proves that it's about habit. Habitual use of
something makes it the easiest product for the habitual user. Ruby is
something I force myself through when I want to try a product while Python is
something I develop my own products in.

~~~
examancer
I used to think PHP was amazing and build tools were weird and unnecessary
when I built things mostly in PHP. It's hard to see the point of many tools if
you aren't working with them all the time.

I think this is the product of people who got to know both Python and Ruby
very well and found Python lacking here. There are lots of things Ruby
developers were gifted by people who also knew Python and found Ruby lacking.
Python is generally something I force myself through, so I'm not one of those
people, but I'm so glad they exist.

BTW, Ruby has tools similar to virtualenv: chruby, rbenv, and rvm all do
basically the same thing.

~~~
mypalmike
My experience has been quite the opposite wrt Ruby and Python. I spent 2 years
with Ruby as my primary language, and the regularity with which I would end up
with a subtly (or drastically) broken Ruby environment was astounding. With
Python, I've rarely if ever run into such problems.

~~~
regularfry
The problem in Ruby is that the most frequently recommended tools are
overcomplicated and break in horrible ways (but I repeat myself). You only
need to set a couple of environment variables to define a working Ruby
environment. I regularly have people ask me "do you use rvm or rbenv?" and be
surprised when I say "neither, they're both horrible."

------
blr246
As rickycook mentions in another comment, pip-tools is a great solution for
maintaining the distinction between allowed version ranges and locked, fully
qualified versions for the given environment.

I've been using pip-tools with tox for a couple of years now. I maintain a
requirements.in and requirements.testing.in, and then I can run

    
    
      $ tox -e pip-compile
    

to generate my fully qualified requirements. The pip-compile command is
handled by a tox.ini section.

    
    
      [testenv:pip-compile]
      commands =
          pip-compile {posargs}
          pip-compile -o requirements.testing.txt {posargs} requirements.testing.in
      deps =
          pip
          pip-tools
    

The remaining nasty part is the automated extraction of requirements for
setup.py's install_requires and dependency_links. I wrote a function to handle
VCS links and other complicated syntax that I keep copying around to all of my
projects. Otherwise, pip-tools has been a great solution.

~~~
StavrosK
Exactly, this is a much better way of doing pinning, if only because it's much
more human readable and easily parsable. I've been using it for a while as
well, and find it very convenient.

It seems to me that that's the direction we should be heading.

------
tlocke
I'm afraid I never saw what was wrong with specifying dependencies in
setup.py. For me, having requirements.txt and setup.py is confusing. Can't we
just stick with setup.py? (and yes, I've read Donald Stufft's post
[https://caremad.io/posts/2013/07/setup-vs-requirement/](https://caremad.io/posts/2013/07/setup-vs-requirement/)
and remain unconvinced).

~~~
Sean1708
And since this Pipfile is just a bunch of function calls, why can't it just be
put in setup.py? Or is the idea to eventually get rid of setup.py?

~~~
nchammas
> And since this Pipfile is just a bunch of function calls

That hasn't been settled yet. It's being actively discussed [0].

> Or is the idea to eventually get rid of setup.py?

Yes, a replacement for setup.py has already been agreed on in PEP 518 [1].
It's called pyproject.toml.

[0]
[https://github.com/pypa/pipfile/issues/10](https://github.com/pypa/pipfile/issues/10)

[1]
[https://www.python.org/dev/peps/pep-0518/](https://www.python.org/dev/peps/pep-0518/)
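The PEP 518 file is deliberately small and declarative; at its core it is a single TOML table naming what is needed to build the package:

```toml
# pyproject.toml, per PEP 518: build requirements stated as data,
# readable without executing any Python
[build-system]
requires = ["setuptools", "wheel"]
```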

~~~
theptip
That PEP leads down an interesting path -- I hadn't realized there was a
completely new packaging tool called "flit" that aims to be much more
lightweight than setuptools.

[https://pypi.python.org/pypi/flit](https://pypi.python.org/pypi/flit)

Looks quite nice, a flat .ini file for specifying deps, though it only builds
wheels.

------
shadowmint
First of all, this is great, and I'm hugely in favour of something like this
going forward. Great work!

...however, I strongly disagree on the benefit of making the `Pipfile`
executable python. Just read this gist:
[https://gist.github.com/kennethreitz/4745d35e57108f5b766b8f6...](https://gist.github.com/kennethreitz/4745d35e57108f5b766b8f6ff396de85)

    
    
        - This file will be "compiled" down to json.
    

Then why does it exist?

We know it'll be abused; we should have learnt our lesson from scons and
setup.py that using Python code itself as a declarative DSL wasn't a great
idea before, and it still isn't. Just use a standard hierarchical file format
(JSON, TOML, XML, whatever).

Features for introspecting and editing `Pipfile.lock` should be rolled into
pip and exported as a core Python module; an API for editing Pipfile.lock _is_
a good idea, but executing a `Pipfile` is not.

------
misterbowfinger
About time. The Ruby, Elixir, and Rust communities are far, far along in their
package management tools. Working with pip feels like going back in time these
days.

~~~
techman9
Even npm (despite its many flaws) is in far better shape than pip.

~~~
efrafa
I might be the only one, but I think npm is even better than bundler.

~~~
morgante
I'm with you there too. npm is by far the best package manager I've ever used.

~~~
LukaD
For me npm has always been the worst dependency manager I've used. What bugs
me the most is that it installs all packages to node_modules by default. It is
possible to specify another location, but then your application will probably
break because it has to know where your node_modules directory is. Then
there's the whole non-determinism thing:
[https://docs.npmjs.com/how-npm-works/npm3-nondet](https://docs.npmjs.com/how-npm-works/npm3-nondet)

Also check out
[https://github.com/npm/npm/issues/10999](https://github.com/npm/npm/issues/10999)
This is just insane!

~~~
derimagia
There are some issues with it, but think more about its concepts. I agree at
the very least that npm is better than pip, but I feel like pip is really
outdated. I'm surprised so many people are saying it's fine and that setup.py
is fine... I think it's just that they're more familiar with it.

Check out yarn as well; some of its practices are really awesome, taking
inspiration from bundler and cargo.

This is as someone who doesn't use either of these languages that much. Even
composer is better than pip IMO.

------
grincho
Don't forget that pip 8 and above have hash verification built in if that's
all you need:
[https://pip.readthedocs.io/en/stable/reference/pip_install/#hash-checking-mode](https://pip.readthedocs.io/en/stable/reference/pip_install/#hash-checking-mode).
A hash-happy `pip freeze` equivalent is
[https://pypi.python.org/pypi/hashin](https://pypi.python.org/pypi/hashin).
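In hash-checking mode, each pinned requirement carries its expected digest(s); a requirements.txt entry looks like this (the hash is truncated here and purely illustrative):

```text
# pip refuses to install if the downloaded archive doesn't match the hash
requests==2.12.1 --hash=sha256:2109ecea94fe70ad23ed72f2...
```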

------
Animats
If you need old-version retrieval on a routine basis, library regression
testing has a problem. This gives package developers an excuse to break
backwards compatibility. Historically, Python has avoided that, except for the
Python 3 debacle.

~~~
Groxx
I'm a bit confused as well, unless you're referring to "installing libs at any
version except HEAD", in which case I vehemently disagree with you.
Deterministic builds are a must-have for production systems.

~~~
icebraining
The question is whether that should be included in the project, or as a
configuration of the deployment tool.

~~~
yxhuvud
There is no question. If it isn't included in the project, then any errors you
get are not repeatable.

~~~
icebraining
Why can't you get the versions in question from the origin of the error
(deployment config if it's an internal error, or asking the user if it's an
external report)?

~~~
yxhuvud
It is certainly possible, but adding a manual step for something that has no
reason not to be automatic doesn't make sense. Yes, you can certainly choose
to do more manual work for no reason, but why would you?

And if you don't want that (for example if you distribute a library), you can
still distribute the project without the lockfile, and then have the users
distribute the lockfile back to you to get a repeatable error.

------
mattlong
As someone who has been dealing with managing Python requirements a lot
lately, I was excited to see what Pipfile is all about. After reading the post
and all the comments here, it's still not really clear to me what the value
add is over existing solutions.

There are a lot of mentions of deterministic builds, but that is already very
achievable with pip-compile (part of pip-tools) or just pip freeze.

The grouping functionality allows you to have just one requirements file
instead of one per environment (e.g. production, test, development), which is
mostly just a personal preference IMO. This isn't particularly compelling to
me, but that could be because I'm already used to the traditional "pythonic"
way of having one file per environment. Using the -r directive within a
requirements file allows one to recursively include other requirements files
to avoid duplication of common dependencies across environments.
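For reference, the -r inclusion mentioned above looks like this in practice (the file names are just a common convention):

```text
# base.txt -- shared across environments
Django==1.10.4

# dev.txt -- pulls in base, then adds dev-only tools
-r base.txt
ipython==5.1.0
```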

The difference in syntax between traditional requirements files and Pipfiles
is indeed pretty large. The Pipfile syntax is quite a bit more verbose which
I'm personally not a fan of, but this will come down to personal preference
and what one is familiar with.

It's unclear if the Pipfile as proposed here is meant to include the
dependency resolution functionality of the pip-compile command provided by
pip-tools. That is a very critical step, as vanilla pip makes no guarantees
about respecting version pins of nested dependencies; only that _some_ version
of a nested dependency will be present, but not necessarily the one intended.

Another big unknown that others have asked about as well is how Pipfiles can
be used to manage requirements for a library in a way that allows other
libraries/apps that do not use Pipfiles themselves to still list said library
in their requirements.txt.

Apologies if my comments come across as overly negative or dismissive; I
applaud any effort to improve the tooling around Python dependencies. But as
someone already familiar with the Python/pip ecosystem, it's not clear how
this would improve or simplify the solutions that are already out there.

------
tuco86
I have not used Ruby extensively, but a lot of the times I had to install
something with it, I got stuck with version conflicts between Ruby itself and
package deps.

In my Python workflow everything lives within a project virtualenv.
Dependencies are defined in setup.py with install_requires, extras_require and
tests_require. I build against latest; version constraints are added mostly
when the latest version of a package has a problem.

Now when I commit to the dev, stage or prod branch, our CI generates a
version-pinned requirements.txt, which is used to install the virtualenv on
stage/prod.

I don't really see the improvements in Pipfile, feels like
[https://xkcd.com/927/](https://xkcd.com/927/).

~~~
teaearlgraycold
>but a lot of times i had to install something with it i got stuck with
version conflicts of ruby itself and package deps.

>In my python workflow everything lives within a project virtualenv

Were you using rbenv or rvm with Ruby?

~~~
tuco86
No, I don't remember why that wasn't an option; maybe I was not even aware of
it. I am not really experienced with the Ruby toolset, just trying to use
tools written in Ruby.

~~~
Groxx
Yeah, sadly Ruby has the same environment issues as Python. Python has
virtualenv, Ruby has rbenv - they're essentially required, sometimes even for
global binaries.

For Python, there's `pipsi` for automatically creating unique virtualenvs for
binaries - that's the right approach, and it's worth adopting ASAP for future
sanity. It even lets you mix python 2 and 3 binaries without any issues. I
don't know what the equivalent would be for Ruby.

------
ftxdri
My initial reaction to reading this was "finally". Coming from a Ruby
background, I found Python's package manager to be seriously lacking in ease
of use. Kudos to the pypa team!

------
fermigier
I've been using pip-tools for the last year or so. It does a similar job so
far. I'd be happy to switch to something that's supported out of the box by
pip, though.

------
nicois
This is what I have used for the past few years. I put my unversioned
requirements in a subdir and run this when I want to bump versions.

    
    
      #!/bin/bash -x
      WHEELHOUSE="/usr/local/wheelhouse"
      [ -d "$WHEELHOUSE" ] || ( sudo mkdir -p /usr/local/wheelhouse/ ; sudo chmod -R 0777 /usr/local/wheelhouse/ )
      deactivate
      set -e
      cd .requirements
      for reqfile in requirements*txt ; do
          TEMPDIR="$(mktemp -d)"
          virtualenv -ppython3 "$TEMPDIR"
          . "$TEMPDIR"/bin/activate
          pip install -U pip
          pip install -U wheel
          pip wheel --find-links="$WHEELHOUSE" --wheel-dir="$WHEELHOUSE" -r $reqfile
          pip install --find-links="$WHEELHOUSE" -r $reqfile
          pip freeze | grep -v "pkg-resources" | sort > "../$reqfile"
          rm -rf "$TEMPDIR"
      done
      wait
    

------
falsedan
This feels like putting the cart before the horse: proposing a tool to solve a
problem without first opening the problem to discussion, gathering
requirements/desired features, and comparing it to similar tools in other
languages.

I welcome any improvement to python packaging, but I'd start with improving
pip itself & making it into a library that tools can wrap around.

------
GlennS
My main problem with Pip is that, if you include multiple requirements.txt
files which declare exactly the same dependencies, then that's an error.

If you don't control all the projects involved, this needs some quite
convoluted inheritance chains to work around.

I can't tell if this change will fix that.

------
nubela
I don't understand the need for this. The main benefit of the Pipfile seems
minute, and can already be achieved with requirements.txt.

But I'm sure I'm missing something. Please feel free to convince me I'm wrong
:)

~~~
examancer
The main reason this is better is explained in the third paragraph of the
README: deterministic builds. You can specify target versions in your Pipfile,
while your Pipfile.lock will contain the actual exact versions pip installed
(even the exact git commit), so if your app is built on another machine you
know the libraries are exactly the same, preventing slight version differences
from causing you problems without requiring you to be ultra-specific in your
dependencies.

Another benefit is named groups, which allow you to more succinctly specify
dependencies for various environments (dev/test/production/etc.). Along with
this you get the benefits of the lock file, so you can be assured the subset
of libraries you use in production will be the exact tested libraries you use
in your larger dev/test environment.

~~~
witten
How is the Pipfile.lock distributed? Is it intended to be checked into source
control alongside the Pipfile? If so, how would that help someone pip install
my Python project (say from pypi) using that Pipfile.lock and get the benefits
of a deterministic build?

~~~
examancer
Yes, source control.

Installing the project would cause it to be built from the lock file (assuming
the Pipfile hasn't changed). This will mean your users won't get "some
version" of a library you depend on between version 1.0 and 2.0 or whatever
range you specified. They'll get the exact package you last successfully used
and checked in yourself, right down to the git commit if applicable.

Once you modify the Pipfile, pip will resolve your dependencies and try to
make the specified changes by adding/removing/upgrading packages. If Pypa
continues following bundler conventions, this will be done by making the
fewest changes possible from the existing versions in your lock file. You'll
also have an upgrade mode where pip will rebuild your project from the
Pipfile, looking for the most recent versions of all libraries (or of a
specified library) within your specified version ranges. When done and your
app/tests are working, check in the new version and you can ensure all your
users/environments will be able to upgrade cleanly.

------
witten
This seems like a huge step backwards to me. Why would you want to go from a
parse-able, machine-readable, data-driven syntax to something that can't
easily be introspected, isn't machine-readable without firing up a full Python
interpreter, is as flexible as actual code and is thus subject to all the
abuse you can introduce with actual code, etc.?

From the original prototype's comments:
[https://gist.github.com/dstufft/2904d2e663461f010bbf](https://gist.github.com/dstufft/2904d2e663461f010bbf)

"\- If there's a corner case not thought of, file is still Python and allows
people to easily extend"

... and ...

"\- Using Python file might cause the same problems as setup.py"

Ugh.

Also, how is the "lock file" actually distributed? Unless you can pip install
a wheel and have it include an embedded lock file, then you've still got to
have some out-of-band mechanism for copying the lock file around like you
would a requirements.txt or even a fully-fledged virtual environment.

~~~
examancer
Your lock file goes into version control. This ensures all your developers and
environments are using the exact same version of every library. You generate a
new lock file when upgrading/adding libraries.

Don't be so scared. Ruby developers have been doing this for over 7 years now.
Trust us, this is much better. Your Python friends at Pypa are copying Ruby
because dependencies are less painful there than just about anywhere. They are
still painful, but this will help.

~~~
witten
I'm not sure age is a good indication of goodness in this regard. Python has
been doing setup.py for much longer than 7 years. It's a similar format, and
ripe for abuse because you have full access to a general purpose programming
language in it. I've seen setup.pys call out to Inkscape. I don't want a
general purpose programming language in my requirements.txt (or equivalent). I
want a dumb declarative data file.

Also, I'm not sure how a lock file in source control will help me when someone
who is not checking out source wants to pip install my project using said lock
file.

~~~
examancer
Age is a good indication: if there were serious issues with this approach, we
would have found them by now.

There seem to be a lot of strange fears from Python developers about this
approach, and my comment on age was an attempt to assuage those fears. Since
Ruby developers are quite happy with bundler (at least in comparison to other
communities), and have been for some time, it's a reasonable point, and not
the only one I made.

I'm not sure how pip plans to use the lock file for projects distributed via
pip. They may very well have a solution planned for that.

However, for projects that are distributed by source (which are many), the
lock file ensures deterministic builds, which requirements.txt generally
doesn't without being pedantic about your versions.

~~~
scarygliders
> There seem to be a lot of strange fears from python developers

There you go again.

This isn't fear. It's asking the question of: What. Is. The Damn. Point. Of.
This?

I have yet to see a simple, cogent explanation of why this is better than a
list of requirements specifying;

pyside==x.y.z

pillow==x.y.z

As opposed to this proposed Thing.

You know what this reminds of?

It reminds me of replacing super simple .INI text files with configuration
files stored as XML - change for the sake of it, because Change!

Now, I'd love to see a very simple explanation and justification on why we
should all move to this new Thing you're making.

Something like "This new Thing is better because..." , followed by practical
/examples/ , because right now, all I see is More Complicated Stuff For The
Sake Of It.

~~~
examancer
There were clear fears/risks/concerns stated by witten directly above; I
didn't invent the idea of fears being stated out of whole cloth.

> subject to all the abuse you can introduce

> might cause [...] problems

I've said a few things about the potential benefits, though I am not involved
in the project. The actual authors offer the best list of benefits:
[https://github.com/pypa/pipfile#the-
concept](https://github.com/pypa/pipfile#the-concept)

They also point out that this will eventually be built into pip, so it will
have the benefits of all the existing workarounds like pip-tools without any
setup. It does seem like a lot more than change for the sake of change, and
since there is so much change anyway, why not take the opportunity to adopt a
flexible DSL built for future extension, so there is less real change in the
future.

------
SonOfLilit
I have utmost respect for the author, so much that I blindly trust the result
will be great. (He made the requests package, among other things.)

Just one request: please, please add a --save option to pip like npm's.

------
sleek
All I ask is that if you add a freeze command, make sure that it remembers a
git-style install of a package so that it yields the right directive later on.
(the full git path and ref, NOT the egg name)

------
cx1000
While we're on the topic... the name requirements.txt is so generic and
undescriptive. It breaks the principle of least astonishment. Something like
packages.pip would make more sense.

~~~
msane
It contains requirements. The format is custom and it's text.

~~~
cx1000
> The format is custom

What does that even mean?

A .txt file is meant for human-readable text, yet this file is meant to be
consumed by pip. You can only "customize" it within the confines of pip
freeze. If you were new to programming, would you suspect that
requirements.txt was somehow linked to Python? Probably not.

Why can't Ruby or JavaScript or any other language make use of
requirements.txt? They could, and make the same claim you just made, which
would break pip install -r.

~~~
dismantlethesun
Text simply means text format, not that "humans should read this". The human
editability of text is a nice-to-have, but not a requirement of the format
structure.

For comparison, NPM has a package.json which is hand-editable, and lots of
people do customize it by hand to get their NPM scripts running, but:

a) If you're new to programming, you might be confused about which language it
belongs to, just like with a requirements.txt

b) If you peek into it, you may not think it's generated, consumed, or edited
by machines, because the JSON format is so readable

~~~
cx1000
I never said humans _should_ read it, just that it's human readable. And of
course requirements.txt is technically human readable, but the problem is that
it breaks scripts if you do anything other than pip freeze to it. This is
obvious in the npm world (you better put JSON in a .json file!).

My point is that the .txt extension doesn't give any hints to its use or that
it's breakable, and that the name "requirements" doesn't describe the intended
domain of the file (Python only). Even naming it "python_requirements.txt"
would be an improvement.

------
orblivion
I wonder if they're using the term "deterministic build" in the same way that,
say, Debian uses it, and whether they should choose something else to describe
this.

------
joss82
What is wrong with requirements.txt seriously?

I've used it in countless projects and it just works. It's also simple to
understand.

~~~
somecallitblues
Installing from GitHub is a pain and if you upgrade packages and freeze into
requirements.txt you can't tell which versions you've overwritten. You can
easily stuff this up with a deploy script.

~~~
kenneth_reitz
The problem is that it can be used in both a deterministic and
non-deterministic way (no version specifiers, not all sub-dependencies
listed). People want the best of both worlds.

This facilitates that.

------
dimino
Very happy about this, the Pypa crew are saving Python from itself.

------
breerly
Wth happened to pip-tools? The Python ecosystem is a mess...

------
SFJulie
JSON + a code of conduct? ... Oh dear, Python is on its way to the Ruby
ghetto, with predictable security holes to come.

