
Pipenv: promises a lot, delivers very little - BerislavLopac
https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-lot-delivers-very-little/
======
legostormtroopr
Pipenv is a really interesting development for Python, and I'm glad that
someone was working to improve dependency locking for Python.

However, Kenneth abused his position with PyPA (and quickly bumped what was a
beta product to version 18) to imply Pipenv was more stable, more supported
and more official than it really was.

And worse still, for anyone saying "but it's open source, you get what you pay
for": Kenneth, as former Python Overlord at Heroku, encouraged Heroku to place
Pipenv above Pip as the default Python package manager in the Python
buildpack. This decision impacted paying customers, and the Python buildpack
used a broken version of Pipenv for a long time. So long that most people I
know just went back to Pip.

Then, lastly, when people complained he had a tizzy on Reddit and Twitter and
got PyPA to help backtrack and say "no, we didn't support it, nope, it's just
a thing that happened", all while the main Pipenv GitHub repository was held
under the PyPA GitHub org.

~~~
greysteil
Sometimes, improvements don't happen in a straight line.

There's been a lot of work on Pipenv over the last 6 months, predominantly
by Dan Ryan and Tzu-Ping Chung, and it's getting stronger and stronger with
each release.

If you've gone back to using pip I'd encourage you to give Pipenv another try.
Introducing a lockfile is a big step forward for Python dependency management,
and the team working on Pipenv are committed and doing a great job.

~~~
rendaw
Pipenv tries to upgrade everything in your lockfile whenever you add a new
package (not just the dependencies of that package), and there's currently no
way to disable this behavior. That's the tip of the iceberg.

~~~
greysteil
You might want to check out
[https://github.com/pypa/pipenv/pull/3304](https://github.com/pypa/pipenv/pull/3304),
which Dan is actively working on to solve that.

------
dfee
I was sorely disappointed with pipenv, and transitioned to poetry [1], with
which I’ve been very satisfied. There is also some commentary in the README on
the design decisions re: pipenv [2]. Contrary to the author's perspective on
poetry using poetry-specific sections of pyproject.toml, that's actually the
proper implementation (and expected usage) coming out of PEP-518.

I am also a big fan of pyenv [3], but that's of course for managing Python
versions (not environments).

[1] [https://github.com/sdispater/poetry](https://github.com/sdispater/poetry)

[2] [https://github.com/sdispater/poetry#what-about-pipenv](https://github.com/sdispater/poetry#what-about-pipenv)

[3] [https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)

~~~
epage
Poetry is one of the few things that gives me hope about the mess of python
packaging.

It is also great to see the author is very responsive.

My only concern is the lack of integrated "toolchain" management (what version
of Python to use, something like rustup) that is cross-platform.

~~~
RayDonnelly
I agree that providing toolchains is very important.

The only non-system package manager that provides Python and its own
toolchains (for Linux and macOS presently), which are used to compile every
C, C++ and Fortran package, including Python itself, is conda and the Anaconda
Distribution.

Not doing this leads to static linking, which is inefficient and insecure.

Disclaimer: I work for Anaconda Inc.

~~~
Nullabillity
> The only non-system package manager that provides Python and its own
> toolchains - for Linux and macOS presently - which are used to compile every
> C, C++ and Fortran package, including Python itself is conda and the
> Anaconda Distribution.

Nix[0] is also perfectly usable without NixOS, and provides all of that, but
has far more non-Python libraries and applications packaged. It's also not
constantly trying to sell you an enterprise version...

[0]: [https://nixos.org/nix/](https://nixos.org/nix/)

~~~
RayDonnelly
Great, I'll try it out. I always meant to but never got round to it. Does it
work on macOS or Windows yet? What's the oldest Linux distro it will run on?

Not sure we constantly try to sell our Enterprise product. You could look at
it less cynically as we sell an Enterprise product to allow us to provide the
Anaconda Distribution for free.

~~~
chriswarbo
A lot of Nix users seem to use Mac, based on the stuff that comes up on the
mailing list (discourse).

There's no "native" Windows support (yet), but I think it might work with some
of the UNIX emulations (cygwin, mingw, wsl, etc.)

Not sure what the oldest working Linux version would be. However, NixOS has
been around since 2003, so maybe quite old.

------
umvi
I hate python package/dependency/virtualenv management so much. Even
JavaScript is preferable.

I find npm very easy to wrap my head around. I do npm install ___ and it does
a lookup in its repository and downloads it and its dependencies to
node_modules. If I want to start fresh, I simply delete node_modules.
Everything else "just works" when I invoke node myapp.js. Period, end of
story.

pipenv masquerades as the same thing, but there is no python_modules folder to
be found. Instead it downloads everything to some obscure directory halfway
across my computer. If I want to start fresh, I guess I have to trust the
tool's uninstall function? And it's unclear whether I need sudo or not. Also,
I can no longer just run my program; I have to run it with pipenv now. Python
requires too much cognitive burden for module/dependency/virtualenv
management.

For me it's to the point that developing using a docker image with globally
installed python modules is easier to manage and wrap my brain around than
using pipenv/virtualenv/whatever.

~~~
wirrbel
Basically it seems you are missing only two shell commands.

    
    
        virtualenv venv
        . venv/bin/activate
    

Then everything works with pip

    
    
        pip install numpy
    
    

I actually prefer the pipenv way (or the virtualenvwrapper way) of putting
virtualenvs into a dedicated location, so that I can wipe them off the hard
disk if I want to free some space.

The real interesting and occasionally bothering point on managing your
virtualenv is resolution of dependencies and compatibility ranges (hard to
solve in general).

~~~
icebraining
At a previous job, we used an even simpler solution: just download everything
to a directory (called "deps") with

    
    
        pip install --target deps -r requirements.txt
    

Then running the application with

    
    
        PYTHONPATH=deps python myapp.py
    

This was also simple to integrate with service supervisors, since most make it
easy to configure an environment variable.

~~~
ProblemFactory
Virtualenv tip: you can skip the Bash magic of calling "activate" by providing
the path to the python executable.

    
    
        /hello/venvs/myproject/bin/python myscript.py
    

will run myscript.py exactly as if the virtualenv was active, which is easier
to configure for cron jobs and services.
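
For instance, a crontab entry (the script path here is hypothetical, matching the example above) needs no activate step at all:

```
# run the script nightly at 02:00 with the venv's own interpreter
0 2 * * * /hello/venvs/myproject/bin/python /hello/myproject/myscript.py
```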

------
coleifer
The problem, in my view, is not that pipenv is buggy or flawed or changes too
rapidly. The problem is that it was hyped-up with overblown promises and
marketing and sycophantic testimonials. The author of the project describes
this himself in his "letter to r/python":

> ...my fame, while certainly categorized under “cult of personality” is not
> necessarily accidental. It’s called marketing. I worked very hard at
> becoming well known within the Python community, and toiled away at it for
> years.

I see this as the real issue, which led to premature adoption of a tool that
wasn't stable, and the subsequent backlash.

Also, I have been using virtualenv for years and if I want to freeze my
dependencies, running `pip freeze > requirements.txt` is sufficient for me.

------
metasyn
We recently switched to using poetry over pip and virtual envs. All the
additional tooling poetry provides is amazing - actual dependency resolution,
script entry point, building, version bumping, publishing. The author is
awesome too. Also making python3 tooling possible on python 2/3 packages. I'd
highly recommend trying it out.

Pipenv, on the other hand, was not only slower but _failed_ to actually
resolve our dependencies correctly. And looking through the issues, I can echo
that the development team seems a bit defensive and dismissive.

------
grumps
I rarely comment here on hackernews. I've also seen several flame wars over
pipenv. I also believe that python packaging and pinning is nothing but a
mess. Recently I started using Pipenv and suddenly I've been having horrific
issues with python to the point of me almost giving up on the language itself.
I believe the issue is a mix of Pipenv, pip and Debian. I don't have a full
view of the issue yet, but without evidence I believe it comes down to how
Debian uses pip at the system level, and that in pip 9+ an API used by Debian
changed. Pipenv somehow greedily upgrades my pip, which hoses everything, and
I find myself reinstalling everything. All three are making me consider
changing the entire stack and setup that I've been using daily for the past 5
years on many, many systems.

In addition when/if I have time I'll further debug and attempt at PRs and
issues to help.

~~~
depressedpanda
Why are you having issues with Debian and pip?

\- First of all, _never_ `sudo pip install` anything.

\- Second, on a fresh Debian install, run `sudo apt install python3-pip &&
pip3 install --user --upgrade pip` to get the latest pip while still allowing
Debian to use its old outdated version. I would actually recommend you remove
the system pip to prevent you from accidentally using it (with `sudo apt purge
python3-pip`), unless you need it available at the system level for some
reason.

\- Add `~/.local/bin` to your $PATH.

\- If you need to deal with py2 packages for some reason, consider managing
multiple Python versions with pyenv[1].

I won't go into Pipenv because I dislike the tool.

[1]: [https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)

------
pselbert
Our organization uses four primary languages (Ruby, Python, JS and Elixir).
The package management situation for Python is by far the weakest.

We’ve been using Pipenv, but it is atrociously slow and flawed at dependency
resolution. An alternative is extremely welcome, e.g. Poetry, which was
mentioned above.

~~~
greysteil
FWIW, resolution in Python is significantly harder than for Ruby, JS or Elixir
because it requires cloning down each dependency. To really speed it up,
what's needed is a registry API that provides all of the details for
resolution. That's much harder for Python because of the legacy of setup.py,
which allows version resolution to depend on arbitrary system considerations.

~~~
wuliwong
I've never really delved deeply into package management. I've been using ruby
for a while and using rbenv has made it pretty painless for a while.

I've been using pipenv on my local machine but then switched to virtualenv in
my ec2 deployments (because all the tutorials used it).

What makes it easier for ruby? Is there just this "registry API" that has
gained enough traction that everyone uses it?

~~~
ProblemFactory
The fundamental problem with Python dependencies is that they are calculated
while executing setup.py, not declared statically.

For example, the popular scikit-learn package has the following in its
setup.py:

    
    
        if platform.python_implementation() == 'PyPy':
            SCIPY_MIN_VERSION = '1.1.0'
            NUMPY_MIN_VERSION = '1.14.0'
        else:
            SCIPY_MIN_VERSION = '0.13.3'
            NUMPY_MIN_VERSION = '1.8.2'
    

and these are used to request dependencies from the resolver/downloader. You
could have more dynamic dependencies that vary based on system libraries and
tools installed on the machine where you are running it, or even the current
weather.

Until this is replaced by static version numbers, and all popular packages
adopt them, a registry API cannot exist, since it would need to run code on
your machine to figure out the dependencies.
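
For what it's worth, PEP 508 environment markers can already express this particular case statically; a sketch of an equivalent declaration (setup.cfg style, version numbers taken from the snippet above) would be:

```
install_requires =
    scipy>=1.1.0; platform_python_implementation == "PyPy"
    numpy>=1.14.0; platform_python_implementation == "PyPy"
    scipy>=0.13.3; platform_python_implementation != "PyPy"
    numpy>=1.8.2; platform_python_implementation != "PyPy"
```

But markers can only express a fixed menu of conditions; truly dynamic dependencies (system libraries, the weather) have no static equivalent.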

------
Apreche
I've used virtualenv, pipenv, pyenv, venv, etc.

What the hell is the difference? All that I have used have worked pretty much
exactly the same as the others. All work just fine. I've never had a problem.
I just use virtualenv since it's the oldest. I see no more reason to switch or
try other options as long as it continues to deliver.

~~~
pmoriarty
_" What the hell is the difference?"_

One of the big selling points of pipenv is that you can pin the versions of
the packages you use and their dependencies.

None of the others do this, afaik.

~~~
coleifer
`pip freeze > requirements.txt`

~~~
chatmasta
This assumes semantic versioning and does not actually pin dependencies to the
hash of their bytes, like a lockfile does.

~~~
j88439h84
`pip-compile --generate-hashes` is the best way to manage python dependencies.

[https://github.com/jazzband/pip-tools](https://github.com/jazzband/pip-tools)
[https://gist.github.com/hynek/5e85706cee589a204251b333595853...](https://gist.github.com/hynek/5e85706cee589a204251b33359585392)

~~~
chatmasta
Could those hashes be different for the same package built on different
architectures?

~~~
hynek
Yes, that’s why there’s multiple hashes per package.
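
Concretely, each published artifact (the sdist plus each platform wheel) gets its own digest, and pip accepts a download if it matches any listed hash. A hash-pinned requirements.txt entry looks roughly like this (the digest values here are placeholders, not real hashes):

```
requests==2.20.0 \
    --hash=sha256:<digest-of-sdist> \
    --hash=sha256:<digest-of-each-wheel>
```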

------
rch
How did this get to be such a contentious issue? While I've been focused on
other languages this year, my Python side projects have all gradually moved
towards pipenv. I still have a few locally shared virtual environments, and
setup.py is still the default for distribution. I also sometimes embed into
uwsgi, but I don't know of anyone else that bothers with that. We're using pex
at work.

Overall the article seems hyperbolic, relative to the actual events cited.
Packaging is a work in progress everywhere.

------
scrollaway
I just _feel_ this article inside my bones. The headline is 100% accurate.

What bugs me is we have some excellent examples of good patterns… in the JS
community. Pipenv promised to be Yarn and ended up way off target :/

More context from my previous post:

[https://news.ycombinator.com/item?id=18247788](https://news.ycombinator.com/item?id=18247788)

------
EdwardDiego
> Another issue with this tagline was the Python.org and official parts. The
> thing that made it “official” was a short tutorial [1] on
> packaging.python.org, which is the PyPA’s packaging user guide. Also of note
> is the Python.org domain used. It makes it sound as if Pipenv was endorsed
> by the Python core team. PyPA (Python Packaging Authority) is a separate
> organization — they are responsible for the packaging parts (including
> pypi.org, setuptools, pip, wheel, virtualenv, etc.) of Python. This made the
> endorsement misleading.

Hmm I'll admit I only started using pipenv because of that... "endorsement".

------
hyperion2010
Allow me to reproduce a comment that I made in /r/programming last week on
this topic.

I use pipenv in production and testing to simplify deployment on systems that
don't natively support Python 3.6+. When it works it is great. When it fails,
or when the CLI options fight each other, try to be smart, but instead form a
circular firing squad, it is one of the most insanity-inducing pieces of
software I have ever used. Pipenv releases have repeatedly broken CI builds
for me for the past 3 months. I was so pissed with how bad it was about 9
months ago that I actually gave up trying to use it on my development machine
and learned how to write Gentoo ebuilds. On reflection it seems like the
perfect tool for Python: if you stay on the happy path and only use it in BDFL
APPROVED ways then it can be great, but woe to the fool who wanders from the
light into madness.

------
mlthoughts2018
People please stop engaging with pipenv and just use conda. Even critical
arguments are mistakenly acting like pipenv deserves a seat at the table. Just
don’t engage. It’s like the creationism of Python packaging.

~~~
malcolmgreaves
I don't understand why this comment isn't higher! Honestly, using `conda`
eliminates the pain of python dependency & environment management entirely.
Clone a repo, do `conda env create` (from the project's `environment.yml`),
then `source activate $ENV_NAME`. Straightforward. Easy. But most importantly,
reliable!

~~~
m_ke
It's great but can be a pain if you develop on a mac and try to replicate your
env on linux because the env files are not portable.

The CLI can be confusing at times too, I always have to google how to create a
new env or export it.

~~~
mlthoughts2018
The exact use case I use conda for at work is developing on Mac and
automatically exporting the same env to various Linux platforms.

Why do you suggest conda doesn't work for that? It's one of the things conda
specifically does. When recreating the same env on a different platform, it
will resolve the dependencies for that platform, so there are no
cross-platform issues with the underlying env unless the library being
installed simply doesn't support that platform, in which case _no_ environment
manager could possibly solve that specific problem.

------
Townley
Curious about one criticism in particular:

    
    
> I can run pipenv shell to get a new shell which runs the activate script by
> default, giving you the worst of both worlds when it comes to virtualenv
> activation: the unwieldiness of a new shell, and the activate script, which
> the proponents of the shell spawning dislike.
    

In a project that just uses pip + a virtual environment, I'm used to
activating the virtual environment when I want to run a command.

Why is `pipenv shell` any worse than that? What do "proponents of shell
spawning" dislike about the activate command?

~~~
heavenlyblue
What I don't really get is this: why can't you just prepend the virtualenv to
the PATH and be done with it?

That is only required, obviously, if you're using bash scripts which can't be
pointed at an actual python binary.

~~~
pseudalopex
"activate" prevents accidentally nesting environments. It also changes the
prompt to show which one is active. Running "deactivate" is easier than
changing the environment variables back yourself.

~~~
JNRowe
Unless you change state yourself in the meantime.

`deactivate` changes `$PATH` back to what it was when you ran `activate` for
example, which isn’t likely to be what you want if you’ve changed it since. At
least with a subshell you can know what state you’re returning to with <C-d>.

That is part of the problem with the `virtualenv` story in my eyes. It offers
the illusion of isolation, but falls down in quite a few ways which are
annoying when they do pop up.

------
bitfhacker
I never used pipenv, but I use virtualenv+pip. Pip has always resolved all
dependencies well for me, so... what are the advantages of pipenv over
virtualenv+pip?

~~~
tutuca
For me it was the promise of being able to keep my dependencies up to date
and avoid manually handling configuration files.

Working in a team, it is often useful to have your versions "pinned" down so
you have a reproducible environment regardless of upstream backwards
compatibility policy (just to name one example).

The Pipfile + Pipfile.lock combo seemed right for the task.

Creating the virtualenv was a nice plus.

~~~
Twirrim
What was wrong with requirements.txt? We heavily use virtualenvs and just pin
package versions there. I've yet to see it fail.

~~~
guitarbill
Updating requirements.txt is a pain, and development dependencies vs normal
ones usually result in two requirements files. Lock files are a good way to
solve this, so borrowing from npm et al. isn't a bad move.

(For what it's worth, I use pipenv and rather like it.)

------
j88439h84
The best way to manage python dependencies is with pip-compile

[https://gist.github.com/hynek/5e85706cee589a204251b333595853...](https://gist.github.com/hynek/5e85706cee589a204251b33359585392)

    
    
        update-deps:
        	pip-compile --upgrade --generate-hashes --output-file requirements/main.txt requirements/main.in
        	pip-compile --upgrade --generate-hashes --output-file requirements/dev.txt requirements/dev.in
        
        init:
        	pip install --editable .
        	pip install --upgrade -r requirements/main.txt  -r requirements/dev.txt
        	rm -rf .tox
        
        update: update-deps init    
    
        .PHONY: update-deps init update

------
zaro
Tried pipenv for the first time a few weeks ago. The first thing I did was
'pipenv install celery[redis]', fairly trivial, and I hit a critical bug. The
package installs fine, but the lockfile is now unusable because of the extra.
So every time I install something, I need to run sed to remove all extras
from the Pipfile.lock.

And then there is the speed. pipenv is pathetically slow installing packages.

And I am thinking to myself, having recently read an article about how bad npm
is, "Only people who never had to deal with Python's packaging mess think npm
is horrible."

------
dnlsrl
This article has been really informative and has given me context on what
people's problem with pipenv actually is, which I had been wondering about for
a while. I'll check out Poetry soon. Reading all the context, however, has
reminded me that people are mainly assholes, and now I feel I need to stop
using social media completely, and the Internet for that matter.

------
foxhop
Don't forget to pin pipenv itself; each dot release breaks something.

~~~
legostormtroopr
Oh, they don't do dot releases anymore, it's "CalVer" [1], a.k.a. Calendar
Versioning, a.k.a. release when we feel like it. And why did they do this?

> We just switched the project over to calver, with the explicit purpose of
> preventing [Kenneth Reitz] from making more than one release a day

[1] [https://calver.org/](https://calver.org/) [2]
[http://journal.kennethreitz.org/entry/r-python](http://journal.kennethreitz.org/entry/r-python)
(Ctrl-F 'calver')

~~~
toyg
Reitz has self-confessed manic episodes due to bipolar disorder:

[https://www.kennethreitz.org/essays/mentalhealtherror-an-exc...](https://www.kennethreitz.org/essays/mentalhealtherror-an-exception-occurred)

[https://journal.kennethreitz.org/entry/on-mania](https://journal.kennethreitz.org/entry/on-mania)

~~~
trymas
I will be down-voted to hell, but sometimes I feel that KR hides behind the
illness to avoid responsibility. KR said that he's just "good at marketing"
and has a "cult of personality" [0], and I do not see an edit saying that it
was "KR the maniac" speaking and not "KR the normal guy".

Personally, today I'd rather use `venv + pip` instead of `pipenv`, but
`pipenv` somehow was the "official" tool for package management for months,
until people started discussing it [1]. I would like to have better packaging
tools in Python, but `pipenv`'s approach seemed really strange, and now I know
why.

If you had a manic episode and made some mistakes, go and try to reverse
them. Revert/change the commits. Apologise for the comments you've made. But
no, let's get into the position of a victim when someone criticises you. I
believe that KR may have psychological problems and/or illnesses, but his
"normal self" seems to be rather egomaniacal too and makes bolder claims than
he's comfortable handling. If it were otherwise, there would be no drama, and
there would be no fake "official" tools. KR could just enjoy his fame from
`requests` and save the time he spends writing responses on his blog.

[0]
[http://journal.kennethreitz.org/entry/r-python](http://journal.kennethreitz.org/entry/r-python)

[1]
[https://www.reddit.com/r/Python/comments/8jd6aq/why_is_pipen...](https://www.reddit.com/r/Python/comments/8jd6aq/why_is_pipenv_the_recommended_packaging_tool_by/dz0hbuj/)

------
neillyons
I recommend pip-tools along with pyenv and pyenv-virtualenv.

[https://github.com/jazzband/pip-tools](https://github.com/jazzband/pip-tools)

[https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)

[https://github.com/pyenv/pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)

------
ausjke
I've used pipenv for the past year; there are some hiccups, but overall I
found it works well for my cases. I only need to store the Pipfile in my git
repo, and after 'pipenv shell' things just work as expected.

------
rectalogic
I tried pipenv, then poetry, then pip-tools [1]. pip-tools worked best for me.
I control my own virtualenvs, and can compose the requirements files pip-tools
compiles. It's vendored in pipenv, so it's basically the dependency engine for
pipenv.

[1] [https://github.com/jazzband/pip-tools](https://github.com/jazzband/pip-tools)

------
luhn
The one thing about pipenv that prevents me from using it in more projects is
that it refuses to _not_ upgrade a dependency. If I install or upgrade a
dependency, _every_ dependency is upgraded. (I'm not sure what
`--keep-outdated` is supposed to do, because it definitely does not keep
outdated dependencies.) First of all, it's surprising and unpredictable. (Why
would `pipenv upgrade requests` upgrade boto3?) It also prevents me from
controlling change (and therefore controlling risk). I'm responsible for
everything that gets deployed, so when I upgrade a dependency it is
deliberate, isolated, and well-tested. I can't do that with pipenv.

I understand that they want to encourage keeping dependencies up-to-date, but
I think the proper approach to this is npm's, where it informs me about
outdated packages, but lets me do what I will with that information.

So, for the moment, I'm still in the dark ages of manually pinning everything
into `requirements.txt`.

~~~
greysteil
This is about to get better - check out
[https://github.com/pypa/pipenv/pull/3304](https://github.com/pypa/pipenv/pull/3304).

------
moonsun
I don't write libraries but pipenv works well for a simple flask application
because I don't ask much of it.

All I want is a simpler file to look at compared to requirements.txt

Here is my guide for newbies like me

1\. Use Pipenv in development.

2\. Create a requirements.txt each time you make any changes to your pipenv.
Commit all three files: Pipfile, Pipfile.lock, and requirements.txt to git.

3\. Use this requirements.txt in production.

4\. ???

5\. Profit

Now here is my only gripe:

I believe pipenv is meant for simple people like me. I have to say I want to
use Python 3; there is no way to say I accept anything 3.5+.

My understanding is that pipenv is not meant for people who actually know
python inside out. My use case is that it lets me keep track of what
dependencies I installed as opposed to what dependencies my dependencies
installed. This should have been the ONLY problem that pipenv fixes but like
the old saying goes... no project is complete until it is able to send mail
(sorry I probably said it wrong).

~~~
pseudalopex
It sounds like pip-tools[1] is what you want Pipenv to be.

1\. Put your dependencies in requirements.in.

2\. Run pip-compile to generate requirements.txt.

3\. Commit both files.

4\. Use requirements.txt in production.

You can also use setup.py or setup.cfg instead of requirements.in. This lets
you build packages and specify dependencies like "Python 3.5+".
requirements.in is simpler, though.

[1] [https://pypi.org/project/pip-tools/](https://pypi.org/project/pip-tools/)

~~~
austinpray
+1 for piptools. Piptools makes a lot of sense in a docker based workflow
(virtualenv not needed).

------
cellularmitosis
[https://github.com/pypa/pipenv/issues/1382](https://github.com/pypa/pipenv/issues/1382)

^ This was just such a glaringly terrible design decision, and the fact that
they won't even acknowledge it as a mistake is really frustrating.

npm already found the best solution (if ./node_modules exists, it is
automatically used; no need to run any sort of shell commands). All they had
to do was copy that behavior.

Oh well, maybe the _next_ python virtual environment tool will get it right...
:(

Python is a great technology, but it is really a shame that it is hampered by
bad decisions (e.g. the 2->3 fiasco).

~~~
joshuamorton
The behavior you describe is great until you run `py.test myproject` and it
proceeds to invoke all of the unit tests for tensorflow or whatever.

~~~
scrollaway
That's absolutely the sort of thing that can be fixed in pytest's test
discovery algorithm. It's not like this isn't exactly the same fix as in the
js world (= ignore node_modules in test discovery).
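
pytest already exposes a knob for exactly this kind of exclusion: the `norecursedirs` ini option (whose default in recent versions skips `node_modules`, `build`, and similar directories). A hypothetical `python_modules` directory could be ignored the same way:

```
# pytest.ini
[pytest]
norecursedirs = python_modules .git *.egg build dist
```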

~~~
joshuamorton
Sure, but then you have a chicken-and-egg problem: pytest isn't the only
testing library that does test discovery (standard library unittest does too,
among tons of other third-party ones). Does everyone follow pipenv's
convention? Do you write a PEP? What's the solution? (Maybe a PEP-like
process is the right one, to be honest.) But it's not an immediate change,
and it makes people less likely to use your tool in the interim.

Kenneth used a growth hack.

------
avip
I've long freed myself from the exhausting pipenv/virtualenv/pyenv/conda rat
race by using docker. The _only_ downside is that some annoying IDEs refuse
to acknowledge the existence of an interpreter in a container.

~~~
holografix
Can you get VS Code completion to use a python interpreter in a Docker
container?

~~~
avip
I can't! In fact that was my main motivation in posting that, maybe a wiser
HNer will enlighten me.

~~~
holografix
Damn, I was hoping something had changed since I last had a look 6 months ago!
:-)

------
president
Honest/naive question - why has it been so hard to create a good package
management system for Python? Couldn't someone do a straight port of one that
has been proven to work well from a different language?

~~~
cwp
The problem is that (just about) all python packages express their
dependencies with a script called setup.py. To figure out the full transitive
closure of dependencies you have to walk the graph, downloading and executing
setup.py scripts. First of all, that's slow.

But worse, setup.py is a python script with access to the full power of
python. It can crash. It can be slow. It can loop forever. It can have
dependencies of its own. It can require specific versions of python. It can
download stuff from the internet. It can do anything.

If you wanted to implement something like npm for python, you'd have to
convince all the python package maintainers to write and test package.json
files for all their packages. Even if they were willing and enthusiastic about
that, you'd have a chicken and egg problem because nobody could test a
package.json until all their dependencies had them.

Python has such a plethora of packaging tools because people keep trying to
solve the problem by writing better code. But the real problem is a lack of
metadata. Python will always have a lousy packaging ecosystem as long as it
relies on setup.py.
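
With static metadata the graph walk itself would be trivial; the pain described above is that each edge is only discoverable by downloading and executing setup.py. A minimal sketch of the walk, over a made-up static dependency table:

```python
from collections import deque

# Made-up static dependency metadata. In real Python packaging these edges
# are only discoverable by downloading and *executing* each setup.py.
DEPS = {
    "myapp": ["requests", "flask"],
    "requests": ["urllib3", "idna"],
    "flask": ["jinja2"],
    "urllib3": [], "idna": [], "jinja2": [],
}

def transitive_closure(root):
    """Breadth-first walk collecting every transitive dependency of root."""
    seen, queue = set(), deque(DEPS[root])
    while queue:
        pkg = queue.popleft()
        if pkg not in seen:
            seen.add(pkg)
            queue.extend(DEPS[pkg])
    return sorted(seen)

print(transitive_closure("myapp"))
```

With real packages, every `DEPS[pkg]` lookup is a download plus an arbitrary-code execution, which is why resolvers are slow and fragile.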

------
craigds
pip-tools (pip-compile, mostly) does everything I wanted out of pipenv (a sane
lockfile) and is more explicit. I don't have any reason to use pipenv. No idea
why pip-tools isn't super popular, honestly.

------
lvh
The article doesn't mention pyenv, which I've used to great effect. It does
pretty much everything I want, including having multiple environments active
at once (so that tools that want to do their own virtualenv management, like
tox, can find a python2.7, python3.5, python3.6, python3.7, pypy2... in PATH)
and automatically activating a virtual environment when I go into a particular
directory.

For a project to be easy for me to use, it doesn't need to do anything fancy
to accommodate me: 1) Python 2 or 3 or both? 2) a setup.py or a
requirements.txt. I basically always `pyenv virtualenv 3.7.1 $PROJECT && pyenv
local $PROJECT` and then more or less never worry about this problem again.

When I need to _produce_ a pinned set of dependencies, it's all just pip +
virtualenv so pip freeze works just fine.

The killer feature of pipenv is allegedly pinning, but this has never made
sense to me. Maybe I just don't understand it? But pyenv makes highly
segregated environments easy, so as long as each project has its own isolated
set of environments, everything is A-OK.

~~~
black-tea
I use pyenv on some distros and it is good, but on Gentoo there is no point
because Python is slotted which means you can install multiple versions and
have them all available on PATH. I then use virtualenvwrapper and specify the
version manually (e.g. -p python3.7). Alternatively, you can always do
python3.7 -m venv ...

The annoying thing with pip freeze is it doesn't have a concept of a "world"
file like emerge (Gentoo). With emerge when you install a package it gets
entered into your world file but the dependencies do not. That way you always
know what you actually want, rather than just incidental dependencies. I wish
pip freeze did that.
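For what it's worth, the metadata needed to approximate a world file is
already present in an installed environment. Here's a rough sketch (not a pip
feature, just an illustration using the standard library) that treats any
installed distribution that nothing else depends on as part of the "world":

```python
# Rough sketch: recover a Gentoo-style "world" set from an installed
# environment, i.e. distributions no other installed distribution requires.
import re
from importlib import metadata


def world_set():
    installed = set()
    required = set()
    for dist in metadata.distributions():
        name = (dist.metadata["Name"] or "").lower()
        if not name:
            continue
        installed.add(name)
        for req in dist.requires or []:
            # Requirement strings look like "urllib3 (<1.27,>=1.21.1)" or
            # "idna<4,>=2.5 ; extra == 'x'"; keep only the distribution name.
            required.add(re.split(r"[ ;<>=!~\[(]", req, maxsplit=1)[0].lower())
    return installed - required
```

It's only an approximation (it can't distinguish a package you wanted from an
orphaned dependency), which is exactly why recording intent up front, the way
emerge does, is the better design.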

~~~
lvh
I prefer to keep locally installed Pythons for regular use everywhere, though
I guess some distros get the same effect by patching pip to make `--user` the
default (and hence pip install xyzzy goes to my home directory instead of
messing with something that belongs to the OS).

In particular, one way this has bitten me is when the OS expects a certain
Python with a certain set of dependencies to be available for systems it's
responsible for. Isn't Portage itself written in Python, for example?

~~~
black-tea
I used to install tools using --user, but now I install all tools using my
package manager (emerge) so I don't have to worry about maintaining a ~/bin
etc. So I basically never run pip outside of a virtualenv.

Portage is written in Python, yes, so you can't just change your system
Python at will, but that's not really a problem.

------
luord
Meanwhile, I've recently started using the _setup.py_ file and its
_install_requires_ and _extras_require_ fields for production requirements and
dev requirements, respectively. I don't even use _requirements.txt_ , as
thanks to _setup.py_ I only have to install the folder itself.

Am I weird? Am I not supposed to do that? Is it bad practice?

~~~
lalaland1125
The main issue with this is that it doesn't version lock the dependencies of
your dependencies.
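For the record, the pattern described above looks something like this (the
project and dependency names are hypothetical). Note that nothing here records
which exact versions of requests' own dependencies get installed; that is the
gap a lockfile fills:

```python
# setup.py -- hypothetical sketch of the install_requires/extras_require
# pattern described above.
from setuptools import find_packages, setup

setup(
    name="myapp",                # hypothetical project name
    version="0.1",
    packages=find_packages(),
    install_requires=[
        "requests>=2.20",        # direct dependencies, loosely pinned
    ],
    extras_require={
        # dev-only tools, installed with `pip install -e .[dev]`
        "dev": ["pytest", "flake8"],
    },
)
```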

------
philwelch
I've dealt with Python package management for a while, haven't used Pipenv a
lot (my company has its own homegrown duct-tape-pip-and-virtualenv-together
solution that it started building before Pipenv was available), and I've
noticed a few things.

The root problem is that a system like pip to manage your Python dependencies
is woefully insufficient on its own: it installs dependencies globally, you
might need different versions in different projects, and you might be trying
to link in some native code, whose behavior will change based on what you
already have installed or what your OS is. To make matters worse, you also
have to manage your dependency _on Python itself_ , since there are multiple
mutually incompatible versions.

So you can use virtualenv to isolate one Python project from another, and
pyenv to let different Python projects use different versions of Python. Then
you need a system like pipenv to tie them all together. Except pipenv is
itself written in Python, so there's a bootstrapping issue: you're using a
tool written in Python, with a dependency on a Python interpreter, to
indirectly manage your dependency on a Python interpreter. Sometimes if you do
something ridiculous like "using a Mac where you haven't specifically
installed the right version of Python via pyenv or Homebrew yet", things will
break in confusing ways.

One way around this whole mess is to put everything in a Docker container: you
get to stipulate that everything runs on a particular Linux distro with a
particular set of dependencies from the ground up, there's no possibility of
anything else on your or any other machine ever polluting that dependency
chain, and even if you're on a Mac it'll just install dependencies and run
code inside a VM. The isolation you're trying to accomplish with virtualenv or
even pyenv, whether by hand or using a tool like pipenv, is pretty much just a
poor man's Docker container anyway. But this adds complexity of its own while
still punting the real work off to a tool like pip.

------
cyberpanther
pipenv is not perfect, but it is definitely headed in the right direction. It
is a lot easier for beginners to get started with, which is one of the best
things about it.

To address one of the problems, running scripts: the basic complaint is that
you always start a new shell and have to prefix all commands.

Prefixing commands sucks, but you can get around it pretty easily with an
alias:

    alias pr="pipenv run"
    alias dm="pipenv run python manage.py"

now 'pipenv run python manage.py startapp foobanizer'

becomes 'dm startapp foobanizer'

or 'pr django-admin startapp foobanizer'

As for the shells: I really like this feature because it's a lot cleaner. With
activate and deactivate you are essentially mutating the current shell, which
can be dangerous. Also, because it loads your environment variables on every
run, it's easier to switch them out.

------
DonHopkins
"free (as in freedom)"

Isn't that a circular definition, not the official Free Software Foundation
line? Is it just a sneaky way of not actually saying "free as in speech"?

~~~
flomble
It's not circular because "freedom" can't really be used to refer to price
(e.g. "freedom of beer"), so it actually disambiguates. I wouldn't personally
default to assuming a political motivation for the wording in this case.

~~~
DonHopkins
Sure it can be used to refer to price: "freedom of cost", thus "free as in
beer" -vs- "free as in speech", a common expression you may have heard before,
if you are aware of the Free Software Foundation.

[https://jamesdixon.wordpress.com/2008/06/04/what-does-
free-a...](https://jamesdixon.wordpress.com/2008/06/04/what-does-free-as-in-
beer-mean-what-does-free-as-in-speech-mean/)

It's not that it demonstrates a political motivation, just an ignorance of the
actual free software culture they're trying to associate themselves with. It's
like saying "Make America USA Again". Tautologically circular, close, but no
cigar.

------
skocznymroczny
What's the recommended Python solution for deploying Python
scripts/applications off-line? I am deploying several scripts on machines with
no access to the internet, so I can't exactly pip install -r requirements.txt.
My current solution is to cache all pip packages in a packages directory and
then export PYTHONUSERBASE=$PWD/packages, but it feels like a hack.
Virtualenvs are no solution because they aren't portable (hardcoded paths).

------
sametmax
Same experience with pipenv. One problem is that it creates the virtualenv
even if you are in a subdirectory. It's easy to make a mistake.

------
_pmf_
Despite disliking Clojure now, I find its (rather: Leiningen's) approach to
dependency management the best possible approach: clojure.jar (language core +
standard library) is a regular dependency like any other dependency. (Of
course this only works because the "real" core is the JVM.)

------
paultopia
I like and use pipenv, but the lockfile slowness thing is truly a nightmare.
It's especially bad in the data science stack. On my fast MBP, I can `pipenv
install numpy pandas matplotlib` and then go get lunch, and there's a fairly
good chance it won't be done when I get back.

------
yellowbuilding
_“Not as good as pip, but it’s more reasonable than Pipenv. Also, the codebase
and its layout are rather convoluted. Poetry produces packages instead of just
managing dependencies, so it’s generally more useful than Pipenv.”_

Can anybody confirm which codebase is being described as convoluted here?

------
dec0dedab0de
I really like pipenv, but I understand these complaints, especially the time
spent calculating the hashes for the locks. Why is it so much slower than
npm!??! But pipenv sync and pipenv --venv have made my deployments much
easier.

------
Bogdanp
It's unfortunate that more people don't know about pip-tools (it isn't even
mentioned in the OP). It + virtualenv hits a sweet spot for speed, usability
and utility for me.

------
esoterae
Stop trying to make pipenv happen; it's not going to happen.

------
awake
What pipenv needs more than anything is a proper backtracking dependency
resolver. With that, it would instantly become my package installer of choice
for Python.

------
guyskk
Applications and libraries are very different; it's unrealistic for one tool
to solve 99% of library problems. If you disagree, please check:

1\. Can it support different Python versions and virtual environments?

2\. Can it support building packages for Windows, OSX, Ubuntu, CentOS...?

3\. Can it support different build tools, for example wheel, Cython...?

4\. Can it manage dynamic dependency versions?

5\. Can it manage versions, like bumpversion?

------
gjvc
What does pipenv deliver that cannot be done with pip and virtualenv?

------
INTPenis
Skimming the comments in this thread I see firstly a lot of hate towards
pipenv that I don't understand. I mean on a technical level, I don't
understand what you're talking about.

Secondly, interspersed between the hate I could count at least 6 alternatives
to pipenv.

So as a python user I don't know who to trust.

I stumbled across pipenv at a time when I was managing a "global" directory of
virtualenvs instead of putting them inside each project dir.

So I could do source ~/.venvs/project/bin/activate because I was tired of
having different .venvs inside project dirs.

Pipenv seemed like a welcome change and I especially liked doing pipenv run
commands. I still run my dev servers with pipenv run. Keeps my environment
clean.

Sourcing a virtualenv before would lock that shell to only working with one
project, one environment.

I have noted some issues with pipenv, the first was in ansible deployment but
it wasn't a show stopper.

The second issue I can't remember so it must have been fleeting.

As someone else pointed out, maybe it's for simple users like me who don't
need to understand the advanced internals of Python.

Edit: Right! The 2nd issue was actually that I work with some services that
are spread across three different git repos/projects. Maintaining a central
virtualenv for multiple projects needed figuring out but I think I've resolved
it by having a parent dir for the service where the virtualenv is created and
then pipenv commands in the sub-dirs (git repos) use the parent virtualenv.

I've also noted some confusion in pipenv on whether it's using python3 or 2. A
pipenv --two project might try to install packages using pip3 for some reason.

Lastly, a lot of the hate towards pipenv seems to be directed towards how it
was launched and marketed. Which in my opinion has little to do with the tool
and how it might help users with Python.

Open source has always been and will always be a wild ecosystem where the best
tool floats to the top by word of mouth alone. So why be mad that someone used
python.org to promote a tool, when you could contribute your time and
knowledge to promoting your own favorite?

I just don't like when there are too many options to choose from and I don't
know which one is right for me. I guess time will tell. Also I don't really do
CI/CD pipelines yet so maybe a lot of issues are unknown to me.

------
sigmonsays
This article is just a reminder that it may be impossible to fix the nightmare
that is Python's packaging system. I'll save the Python rant, but this kind of
software just further fragments and damages Python's reputation.

~~~
erikb
Honestly, it's a nightmare in every language. And in comparison to most other
languages I've worked with (Java, Ruby, PHP, Golang), I actually feel Python
does it quite well.

It's a super complex topic. It needs leadership support, so that a reasonable
percentage of the community drills down on one solution instead of having 4-5
competing ones, but it also needs a lot of in-depth insight, which is kind of
at odds with leadership's top-down view.

~~~
strzibny
I always felt like RubyGems is mostly done right. And not just packaging
itself: it's also easy to host your own gems and combine sources with
upstream, for example. What is it that pipenv does better than RubyGems for
you?

