
Announcing Pipenv - imkevinxu
https://www.kennethreitz.org/essays/announcing-pipenv
======
cderwin
This is great, but sometimes I think that Python needs a new package manager
built from scratch instead of more tools trying to mix and match a bunch of
flawed tools in a way that's palatable to most of us. Python packaging
sucks, the whole lot of it. Maybe I'm just spoiled by Rust and Elixir, but
setuptools, distutils, pip, easy_install, all of it is really subpar. But of
course everything uses PyPI and pip now, so it's not like any of it can
actually be replaced. The state of package management in Python makes me sad.
I wish there were a good solution, but I just don't see it.

Edit: I don't mean to disparage projects like this and pipfile. Both are great
efforts to bring the packaging interface in line with what's available in
other languages, and might be the only way up and out of the current state of
affairs.

~~~
renesd
I think python packaging has gotten LOTS better in the last few years. I find
it quite pleasurable to use these days.

From binary wheels (including for different Linux architectures), to things
like local caching of packages (taking LOTS of load off the main servers), to
the PyPA organisation on GitHub [0], to `python -m venv` working.

Also lots of work around standardising things in PEPs, and writing
documentation for people.

I would like to applaud all the hard work people have done over the years on
python packaging. It really is quite nice these days, and I look forward to
all the improvements coming up (like pipenv!).

I'd suggest people check out fades [1] (for running scripts and automatically
downloading dependencies into a venv), as well as conda [2], the alternative
package manager.

[0] [https://github.com/pypa/](https://github.com/pypa/)

[1]
[https://fades.readthedocs.io/en/release-5/readme.html#what-does-it-do](https://fades.readthedocs.io/en/release-5/readme.html#what-does-it-do)

[2]
[http://conda.pydata.org/docs/intro.html](http://conda.pydata.org/docs/intro.html)

~~~
sametmax
+1. Relative to what we had before, it's so much better. But compared to
the JS/Rust ecosystem, we are behind.

Now, it's hard to compete with JS on some things: it's the only language on
the most popular dev platform (the web), and it has one implicit, standardized
async model by default.

It's hard to compete with Rust on some things: it's compiled and fast, can
easily produce standalone binaries, and has a checker that can prevent many
bugs.

But this. The package manager. We can compete. And yet we are late.

It's partially my fault since it's a project I had in mind for years and never
took the time to work on. It's partially everybody's fault I guess :)

~~~
HowDoesItWork
Wow, Javascript, really? I am guessing you don't actually work with NPM a lot.

~~~
jessaustin
One suspects it's you who hasn't distributed or installed many modules on
either Python or Node. _So many_ of the problems that Python has simply don't
exist for Node, because it finds modules in a bottom-up, hierarchical fashion.
That allows a single app or module to use modules that in turn use different
versions of other modules, and not to worry about what other modules are
doing, or how other modules are installed, or how node is installed, or what
version of node is installed. This prevents the traditional "dependency hell"
that has plagued devs for decades. Thanks to tools like browserify and
webpack, the browser may also benefit from this organization.
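Node's lookup order can be sketched in a few lines of Python. This is a toy illustration of the bottom-up search, not Node's actual resolver (which also handles core modules, package.json "main" fields, and so on):

```python
import os

def node_modules_paths(start_dir):
    """Yield candidate node_modules directories, innermost first,
    walking up toward the filesystem root -- a simplified sketch of
    Node's bottom-up resolution order."""
    d = os.path.abspath(start_dir)
    while True:
        yield os.path.join(d, "node_modules")
        parent = os.path.dirname(d)
        if parent == d:  # reached the filesystem root
            break
        d = parent

# A module nested under another module's node_modules looks in its
# own node_modules first, so it can use its own version of a dep
# without caring what anyone else installed.
paths = list(node_modules_paths("/app/node_modules/foo"))
print(paths)
# ['/app/node_modules/foo/node_modules', '/app/node_modules/node_modules',
#  '/app/node_modules', '/node_modules']
```

Because the innermost match wins, two modules in the same app can each depend on different versions of the same package, which is the "no dependency hell" property described above.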

On top of all that, npm itself just does so many things right. It's quite
happy to install from npm repos, from dvcs repos, from regular directories, or
from anything that looks like a directory. It just needs to find a single file
called "package.json". It requires no build step to prepare a module for
upload to an npm repo, but it easily allows for one if that's necessary.
package.json itself is basically declarative, but provides scripting hooks for
imperative actions if necessary. At every opportunity, npm allows devs to do
what they need to do, the easy way.

In a sense, node and npm are victims of their own quality. The types of
"issues" (e.g. too many deps, too many layers of deps, too many versions of a
particular dep, deps that are too trivial, etc.) about which anal code
puritans complain with respect to node simply couldn't arise on other
platforms, because dependency hell would cause the tower of module
dependencies to collapse first. node happily chugs along, blithely ignoring
the "problems".

Personally, I used to be able to build python packages for distribution, but
since I've been spoiled by node and npm for several years I've found I simply
can't do that for python anymore. It is so much harder.

~~~
philsnow
npm has its own special problems. disclaimer: what I'm talking about in this
post is at least six months old, which in node/npm/js world is ancient
history.

> it finds modules in a bottom-up hierarchical fashion. That allows a single
> app or module to use modules that in turn use different versions of other
> modules, and not to worry about what other modules are doing

To my understanding, if your app transitively depends on package foo-1.2 in
thirty different places [0], there will be thirty copies of foo-1.2 on disk
under node_modules/ . Each package reads its very own copy of foo-1.2 when it
require()s foo.
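The duplication is easy to demonstrate with a toy sketch (a fake tree built by hand, not a real npm install): give three packages their own vendored copy of foo and count the copies on disk.

```python
import os
import tempfile

# Build a fake dependency tree where three packages each vendor
# their own copy of "foo" under a private node_modules/.
root = tempfile.mkdtemp()
for pkg in ("a", "b", "c"):
    foo = os.path.join(root, "node_modules", pkg, "node_modules", "foo")
    os.makedirs(foo)
    with open(os.path.join(foo, "index.js"), "w") as f:
        f.write("module.exports = 'foo-1.2';\n")

# Count distinct on-disk copies of foo -- each one costs its own
# inodes and its own reads, exactly the problem described above.
copies = sum(dirs.count("foo") for _, dirs, _ in os.walk(root))
print(copies)  # 3 separate copies for one logical dependency
```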

On a large app, that adds up to a lot of inodes ("why does it say my
filesystem is full? there's only 10G of stuff on my 80G partition!" Because
it's used up all its inodes, not its bytes) and a _lot_ of unnecessary I/O.
The second through thirtieth copies of foo-1.2 don't come from the kernel's
block cache "for free", they come from spinning rust (or if you're lucky, the
dwindling number of IOps your SSD can choke out. Do you pay money for
provisioned IOps?).

[0] and thirty is a lowball number for some projects, especially given the
community's preference to require "leftpad" or whatever instead of writing a
couple lines in their own projects

~~~
jessaustin
_...what I 'm talking about in this post is at least six months old..._

Haha npm@3 was out June 2015. b^)

I agree that it would have been better, on balance, for previous versions to
have created hard links to already-installed modules. Actually that wouldn't
be a bad option to have even now, since debugging is often easier when one has
a deep directory structure to explore rather than hundreds of random names in
the top-level node_modules directory. That is, if I know the problem is in
module foo, I can pushd to node_modules/foo, find the problematic submodule
again, and repeat until I get all the way to the bottom. [EDIT: it occurs to
me that having all these hard links would make e.g. dependency version updates
easier, since un-updated dependencies wouldn't have to be recopied,
unix-stow-style.]

To me, the more amusing file-watch problem is caused by the module
"chokidar", which when used in a naive fashion tries to set up watches on all
360 files and directories created by itself and its own 55 dependencies. At
that point it's real easy to run out of file watches altogether. Some of the
utilities that call chokidar do so while ignoring node_modules, but many do
not.

------
shakna
> I wrote a new tool this weekend, called pipenv.

> It harnesses Pipfile, pip, and virtualenv into one single toolchain. It
> features very pretty terminal colors.

For a weekend project, this has some very nice things.

It removes the need for me to run my own project that does more or less the
same things... in a worse way.

It's everything I've come to expect from Reitz, and hopefully it'll gain some
decent ground like the author's other projects.

------
therealmarv
For people who want to do it right without using an additional tool, read this:
setup.py vs. requirements.txt by Donald Stufft
[https://caremad.io/posts/2013/07/setup-vs-requirement/](https://caremad.io/posts/2013/07/setup-vs-requirement/)

~~~
yeukhon
I gave up on populating requirements in setup.py. I just use multiple
requirements.txt files. This article has been debated for years already, and
there is no absolute right or wrong.

------
kalefranz
Hey everyone. I'm Kale, currently the lead developer on the conda project.
It's been mentioned a few times in this thread, and I just want to make sure
that any questions about it are answered accurately. Feel free to ask me
anything about conda. Thanks!

------
renesd
Neat. Now for questions and comments.

Often people have a requirements.live.txt, or other requirements files
depending on the environment. Is that handled somehow? Can we use different
files or sections? [ED: yes, different sections]

Still wondering to myself if this is worth the fragmentation for most people
using requirements.txt? Perhaps the different sections could have a "-r
requirements.txt" in there, like how requirements.dev.txt can have "-r
requirements.txt". [ED: the Pipfile idea seems to have quite some people
behind it, and pip will support it eventually. Seems it will be worth it to
standardise these things. requirements.txt is a less jargony name than
Pipfile though, and has a Windows/GUI-friendly extension.]

Other tools can set up an environment, download stuff, and run the script.
Will `pipenv --shell somescript.py` do what I want (run the script with the
requirements it needs)? ((I guess I could just try it.)) [ED: doesn't seem so]

Why Pipfile with caps? Seems sort of odd for a modern Python thing. It looks
like a .ini file? [ED: the standard is still in development, it seems. TOML
syntax.]

With a setup.py set up, all you need to do is `pip install -e .` to download
all the required packages. Or `pip install somepackage`. Lots of people make
the setup.py file read the requirements.txt. Do you have some command for
handling this integration, or does it need to be done manually? [ED: seems
there's no consideration of this / out of scope.]
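The setup.py-reads-requirements.txt pattern mentioned above usually looks something like the sketch below (a common convention, not anything pipenv provides; the Donald Stufft article linked elsewhere in this thread argues the two files serve different purposes):

```python
# Sketch of the common pattern: setup.py parses requirements.txt
# and feeds the result to setuptools' install_requires.

def parse_requirements(path):
    """Return non-empty, non-comment, non-include lines from a
    requirements file."""
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.startswith(("#", "-r"))]

# In a real setup.py you would then write:
#   from setuptools import setup
#   setup(name="myapp",
#         install_requires=parse_requirements("requirements.txt"))
```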

Is there a pep? [ED: too early it seems.]

~~~
kenneth_reitz
Check out [https://github.com/pypa/pipfile](https://github.com/pypa/pipfile)
for more info. The format is TOML. It is mainly intended for deployments
(e.g. web applications).

~~~
renesd
Thanks.

~~~
kenneth_reitz
Also `pipenv install -r requirements.txt` is supported, for importing.

~~~
renesd
That's nice.

Have you seen fades [0]?

It lets you do things like:

    fades --requirement requirements.txt myscript.py

You can also mark the dependencies in your myscript.py as comments. The use
case is quick one-off scripts which may require a bunch of different
requirements. The integration with IPython is nice for experimenting too. I
think pipenv could be useful for that use case of little one-off experiments
too.

One enhancement I'd like is for it to also run modules or entry points:

    fades --dependency pygame -m pygame.examples.aliens

Setuptools entry points, like those used for console_scripts, would be nice
to have work too:

    fades --dependency pipenv -m pipenv.cli:cli

Or best of all:

    fades -m pygame.examples.aliens


So it can create a virtualenv, see that it needs the package (pygame here),
and then install that package with pip, and then run the module or entry
point.

Anyway... just some dreaming. Can I haz pony? All this is exciting.

[0] [https://github.com/PyAr/fades](https://github.com/PyAr/fades)

------
conradev
I'm surprised that no one has mentioned pip-tools:
[https://github.com/nvie/pip-tools](https://github.com/nvie/pip-tools)

It's a very similar set of tools. I use pip-compile which allows me to put all
of my dependencies into a `requirements.in` file, and then "compile" them to a
`requirements.txt` file as a lockfile (so that it is compatible with pip as
currently exists).

This looks great, though; I'm excited to check it out!
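The compile step described above can be mimicked in a few lines. This is a toy illustration only: the real pip-compile resolves transitive dependencies against PyPI, whereas here the resolved versions are a hypothetical, hard-coded mapping.

```python
# Toy sketch of pip-compile's output step: pin loose specs from a
# requirements.in-style list to exact versions for a lockfile-style
# requirements.txt. Versions below are made up for illustration.
RESOLVED = {"requests": "2.12.4", "flask": "0.12"}

def compile_requirements(loose_specs):
    """Turn bare package names into ==-pinned lockfile lines."""
    return ["%s==%s" % (name, RESOLVED[name]) for name in loose_specs]

print(compile_requirements(["requests", "flask"]))
# ['requests==2.12.4', 'flask==0.12']
```

The point of the two-file split is that humans edit the loose specs while machines regenerate the pinned file, so the pins stay reproducible without being hand-maintained.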

~~~
FabioFleitas
We use pip-tools on all our Python projects and it works great. I believe the
requirements.in-compiled-to-requirements.txt approach is much saner and
less error-prone.

------
throw2016
I think an app should not expose end users to its dependencies. That leaves
the end user with a lot of pain figuring out versions of dependencies, and
god forbid you need to compile some dep: then you need a build environment
and its dependencies, any of which can fail in this chain, leaving a very
unpleasant and even hostile end-user experience.

Ruby and Node apps are particularly guilty of this, pulling in sometimes
hundreds of packages, some of which need compilation. Compare that to a Go
binary, which is download-and-use. These things can get very complicated very
fast, even for developers or systems folks, let alone end users who may not
be intimately familiar with that specific ecosystem.

------
choxi
Is this like Ruby's Bundler for Python? I've just been getting into Python and
am really glad to see this, thanks for creating it!

~~~
igravious
Very similar. I think Pipenv improves on Bundler by leveraging Virtualenv;
Ruby doesn't have a per-project equivalent to Virtualenv that I'm aware of.
You can set Bundler's path config variable so that project gems are not
placed in a central location, which I think is cleaner and which I now try to
remember to always do.

It would be _super_ interesting if the Python and Ruby communities got
together to harmonize every last detail of their packaging toolchain. Who is
in?

~~~
mhw
> Ruby doesn't have a per project equivalent to Virtualenv that I'm aware of.

The nearest equivalent is to place a file called '.ruby-version' in the top
level directory, containing the version number of the Ruby you want to use.
Version numbers come from
[https://github.com/rbenv/ruby-build/tree/master/share/ruby-build](https://github.com/rbenv/ruby-build/tree/master/share/ruby-build).
rbenv, chruby and rvm all support .ruby-version.

One difference from virtualenv is that the Ruby version managers share single
installations of each version of Ruby. My understanding from occasional use of
Virtualenv is that it copies the python installation into a new per-project
subdirectory, which seems a bit wasteful to me.

> You can set the path config variable of Bundler to not place the project
> Gems in a central location which I think is cleaner and try to remember to
> always do now.

Yes, this is what I do. It gives me a shared 'clean' Ruby installation of the
right version, plus a project-specific copy of all the gems the project
depends on. To me this provides the best trade off between project isolation
and not duplicating the whole world. You can set bundler up so this is done
automatically by creating '~/.bundle/config' containing

    
    
        ---
        BUNDLE_PATH: "vendor/bundle"
        BUNDLE_BIN: ".bundle/bin"
    

(The BUNDLE_PATH one is the important one; see 'bundle config --help' for
other options.)

~~~
pkd
> It gives me a shared 'clean' Ruby installation of the right version, plus a
> project-specific copy of all the gems the project depends on

You can also accomplish the same using gemsets which are provided by rvm.

~~~
igravious
Using RVM is not always an option and some might consider it an anti-pattern.

------
gourneau
Hey other Reitz fans. Make sure to check out his newish podcast series:
[https://www.kennethreitz.org/import-this/](https://www.kennethreitz.org/import-this/)

------
olejorgenb
Seems to be a Python-specific nix-shell-like tool?

With Nix[OS] you just run `nix-shell -p python[2,3] python[2,3]Packages.numpy
...` to get an environment with the required packages.

Of course this requires that the python library is packaged in nix, but in my
experience the coverage is quite good, and it's not very hard to write
packages once you get the hang of it.

It's also possible (but currently a bit clumsy in some ways) to set up named
and persistent environments.

------
command_tab
See also: [https://github.com/pypa/pipfile](https://github.com/pypa/pipfile)

I'm glad to see Python getting the same attention as other modern package
managers. This is all great work!

------
caconym_
I will definitely be trying this out. Python version and package management is
a dumpster fire that wastes gobs of my time on the regular. I'll try anything
that promises to end the pain.

------
istoica
Finally someone does it! I was using `pip install -t .pip` in my code,
avoiding virtualenv completely, but that was not enough and incomplete.

Since this is not cross-platform, and it would be nice to switch between
Linux/Windows while coding to maintain platform compatibility, can the
virtualenv envs be created with an OS platform & subsystem prefix? For
example, having multiple envs at once:

    - env/posix/bin/activate
    - env/nt/Scripts/activate.bat

------
zoul
I always wonder if this could be done once and for all languages, instead of
Ruby making bundler, Haskell Cabal sandboxes or stack, Perl Brew, etc. Is this
where Nix is going?

~~~
toyg
You make it one tool, and sysadmins will instantaneously lock it down. These
package managers, in most cases, are developer tools built to get around
system-wide locks on libraries; the more you centralize them, the more likely
it is they will get locked down, and then someone will build tools to get
around that, and so on and so forth.

~~~
olejorgenb
As one of the few package managers that can, nix can be used without root access.

------
jaybuff
See also, pex:
[https://www.youtube.com/watch?v=NmpnGhRwsu0](https://www.youtube.com/watch?v=NmpnGhRwsu0)

------
nejdetckenobi
Normally I use virtualenvwrapper, which makes a virtualenv directory for all
the virtualenvs you create with it. Before that, I always created my
projects' venvs inside my project hierarchy.

I had a dilemma about it. But after all, you cannot move your venv directory
unless you use the `--relocatable` option. So, does anyone have a strong
argument for creating venvs inside your project directory?

~~~
icebraining
I've always found the whole virtualenv stuff so superfluous. Do we really need
all the machinery with shell scripts and static paths?

We just use a directory where we keep our dependencies. It's a matter of:

    
    
        mkdir libs
        pip install -t libs <package>
        # then to run
        PYTHONPATH=libs python app.py
    

From what I can tell, this accomplishes everything a venv does (except
bringing the Python interpreter itself along) without requiring any extra
tools or conventions to learn.
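This approach works because Python prepends PYTHONPATH entries to sys.path. A self-contained check of the mechanism, using a throwaway module rather than a real pip-installed package:

```python
import os
import subprocess
import sys
import tempfile

# Create a "libs" directory holding one trivial module, then run a
# fresh interpreter with PYTHONPATH pointing at it -- the same
# mechanism as `PYTHONPATH=libs python app.py` above.
libs = tempfile.mkdtemp()
with open(os.path.join(libs, "mylib.py"), "w") as f:
    f.write("VALUE = 42\n")

env = dict(os.environ, PYTHONPATH=libs)
out = subprocess.check_output(
    [sys.executable, "-c", "import mylib; print(mylib.VALUE)"], env=env)
print(out.decode().strip())  # 42
```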

~~~
istoica
As you pasted it, it does not accomplish what venv does:

    - you still need to set PYTHONPATH so packages are found in `libs`
    - packages sometimes have bin scripts, which most likely will be needed by the project

A more proper command:

    pip install -t ./libs --install-option="--install-scripts=./bin"

But still, this does not solve the PYTHONPATH issue. It isn't easily
solvable, because all Python CLI tools and all scripts in ./bin must be aware
of a PYTHONPATH that includes ./libs.

This is what venv does and pip alone cannot easily solve: replicating an
entire Python environment that is aware of project-local pip packages.

~~~
icebraining
My commands do set PYTHONPATH when running the app.

As for the cli tools, fair enough, but the packages I use with such
tools/scripts are full applications, not dependencies to be included in
another project. Which kinds of packages do you see as both dependencies and
containing CLI tools?

------
sametmax
I was really not a fan of the last "made in Reitz" project, Maya. But this I
can really get along with.

The whole thing makes it way easier for a beginner to get started. No more
activate. No more wondering about virtualenv. Automatic lock files are great;
no project I know of uses them, because they are not well understood.

It's like node_modules (easy and obvious), but cleaner (no implicit magic).
Like.

------
georgeaf99
LinkedIn has a similar open-source project that is much more mature. It builds
on Gradle features to manage complex dependencies and build Python artifacts.
If you include the LinkedIn Gradle Plugin [1], you can automatically run tests
in a virtualenv and source a file to enter the project's virtualenv.

PyGradle [2]: "The PyGradle build system is a set of Gradle plugins that can
be used to build Python artifacts"

[1]
[https://github.com/linkedin/pygradle/blob/01d079e2b53bf9933aa786af1ec7dabcf964c669/docs/python.md](https://github.com/linkedin/pygradle/blob/01d079e2b53bf9933aa786af1ec7dabcf964c669/docs/python.md)

[2]
[https://github.com/linkedin/pygradle](https://github.com/linkedin/pygradle)

------
helb
_> \--three / \--two Use Python 3/2 when creating virtualenv._

I use Python 2.7, 3.4, and 3.5 on various projects. Is there a way to choose
between 3.4 and 3.5 using Pipenv? I'm using something like this with
virtualenv:

    
    
      $ virtualenv -p `which python3.5` .venv

~~~
iheredia
Just a comment. Your command is equivalent to

    
    
      $ virtualenv -p python3.5 .venv

~~~
emeraldd
Is that true? His command evaluates the path at the time `which` is run, but
yours seems to leave the evaluation to the time virtualenv is run. Subtle,
but potentially very different results.

~~~
kyrias
virtualenv resolves the interpreter when you run virtualenv, and if you
specify a different interpreter with `-p` it will resolve the path to the
interpreter and then run virtualenv again using it.
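The distinction here is about when the PATH lookup happens. Python's shutil.which performs the same search the shell backticks do; a small sketch (using sh rather than python3.5, since sh exists on effectively every Unix system):

```python
import shutil

# With backticks, the shell resolves the name against PATH before
# virtualenv ever starts, so virtualenv receives an absolute path.
# With a bare name, virtualenv performs the same PATH lookup itself,
# later, when it runs.
resolved = shutil.which("sh")  # the programmatic equivalent of `which sh`
print(resolved)

# The two only differ if PATH changes between the moment the shell
# would have resolved the name and the moment virtualenv resolves it,
# which is why the results are usually, but not always, identical.
```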

------
dvl
Why are people trying to complicate things? requirements.txt is much simpler
and better.

~~~
sametmax
First, you need to take care of the virtualenv manually; this tool avoids
that. You can completely ignore the virtualenv. You don't even have to
populate requirements or learn about pip freeze.

Secondly, this tool allows you to freeze your requirement list at specific
versions. So in your req file, you have the names of the packages you depend
on. But in the req lock file, you get all the pulled-in dependencies,
recursively, at the versions you are using right now. The first lets you dev
more easily; the second lets you deploy more easily.

All that, for less complexity. Win win.
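Concretely, the split looks roughly like this. A sketch of a Pipfile (the format was still being finalized at the time, so treat the section names as illustrative): loose, human-edited specs live in the Pipfile, while the generated lock file pins every pulled-in dependency at an exact version.

```toml
# Pipfile: loose specs you edit by hand
[packages]
requests = "*"

[dev-packages]
pytest = "*"

# The generated lock file (JSON in practice) then records requests
# and every transitive dependency at the exact versions in use.
```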

------
wyldfire
I haven't really used requirements.txt because I found that I could install
'extra' and 'test' specific content based on args to setup() in my setup.py.
It seems more like the Right Thing than requirements.txt, from what I can
tell.

At first glance, this doesn't seem to offer anything beyond what I already see
from setup(). What am I missing?

It's unfortunate that CPython gave us distutils and took a very long time to
converge on a built-in successor (setuptools?) that gives the right
composability.

------
noway421
This is very interesting! I had the exact same question about how to do this
in Python just a while ago!
[http://stackoverflow.com/questions/41427500/creating-a-virtualenv-with-preinstalled-packages-as-in-requirements-txt](http://stackoverflow.com/questions/41427500/creating-a-virtualenv-with-preinstalled-packages-as-in-requirements-txt)

Glad that someone thought about a similar thing and made a tool to solve it!

------
JeremyBanks
> Otherwise, whatever $ which python will be the default.

This is a bit strange, because the python binary is always supposed to be
Python 2; the Python 3 binary is supposed to be named python3. Some
distributions don't follow this, but they're the weird non-conformant ones;
it's not a behaviour that should really be relied on.

~~~
jacobmischka
_> always supposed to be Python 2_

This is not correct. It's a symlink to python2 on systems that rely on calls
to python to be python2. On modern systems, python is usually a symlink to
python3. This is the case on Arch Linux and I believe other recent distro
releases.

~~~
JeremyBanks
Arch is the oddball. I'm not aware of anybody else who's made the switch.

Given that they're not mutually compatible except in rare cases, it's a very
silly thing to do. You can upgrade GCC with the same name because you know it
will handle most of the same input. If you do that with Python you're breaking
tons of existing scripts for very close to zero benefit. Why would you do
that?

~~~
jacobmischka
It's not silly. One of Arch's primary draws is being the first to get updates
and integrate new technologies. Other distros will follow soon enough if there
really aren't any others.

It's not as if python2 suddenly doesn't exist, it's not terribly life-changing
to add a 2 in the places where the scripts are still on the last version.

I'm glad to be on a system where the default Python version isn't one that
will be officially unsupported in three years.

Edit: It's worth pointing out that this switch was made _6 years ago_ and the
world is still spinning.

[https://www.archlinux.org/news/python-is-now-python-3/](https://www.archlinux.org/news/python-is-now-python-3/)

~~~
JeremyBanks
They can adopt Python 3 and drop Python 2 without adding this link which
encourages a convention of /usr/bin/env python for Python 3 on their platform,
creating unnecessary incompatibility with other platforms.

As you said, it's old news now, and not the end of the world. But I'm hoping
others don't follow.

------
mikhuang
Uninstall removing everything by default seems a little scary. Otherwise it
looks really neat; looking forward to trying it.

~~~
bdukic
I wanted to comment on the same thing, IMHO something like `pip uninstall
--all` would be a better choice. Great work otherwise!

------
therealmarv
Currently using fish and virtualfish. This seems incompatible with non-bash
shells. Has anybody tested it?

~~~
kenneth_reitz
Fish support was just added!

------
btashton
Interesting. I have been leveraging tox to provide a lot of what this seems to
give you, but it certainly has been more of a hack than a solution.

------
natmaster
This is cool - it's like yarn for python! :)

------
zyxzkz
What took Python so long to get a tool like this?

~~~
sametmax
Did you work on it ?

No.

That's why.

We can't always wait for one famous guy to do it.

------
lukasm
Did anyone have a problem installing it on El Capitan? I managed to do it
with the --user flag, but the pipenv command doesn't exist.

------
zephyrfalcon
A bit off-topic, but what font is used in the video/animated GIF?

~~~
kenneth_reitz
Operator Mono!
[https://www.kennethreitz.org/essays/test-driving-a-200-programming-font-operator-mono](https://www.kennethreitz.org/essays/test-driving-a-200-programming-font-operator-mono)

------
auston
I can only express my gratitude, thank you Kenneth!

------
korijn
Windows support?

~~~
kenneth_reitz
coming soon, hopefully via pull request!

