
Use pew, not virtualenvwrapper, for Python virtualenvs - pmoriarty
http://planspace.org/20150120-use_pew_not_virtualenvwrapper_for_python_virtualenvs/
======
jackschultz
Wow, all these comments seem to be about the different environment tools --
new suggestions, and arguments about why the others fall short.

For those just starting Python, I'd say just go with virtualenv. It works,
it's common, and there are tons of tutorials about it. Don't feel bad about
using the old mediocre libraries. And in general, learning basics isn't
embarrassing.

------
abhirag
The discussion here has quickly gotten out of hand, and at the risk of causing
more confusion I would like to point out that
pipenv ([http://docs.pipenv.org/en/latest/](http://docs.pipenv.org/en/latest/))
seems to be the officially recommended Python packaging tool now [1], though
venv and pip work too.

[1] -- [https://packaging.python.org/new-tutorials/installing-and-using-packages/](https://packaging.python.org/new-tutorials/installing-and-using-packages/)

~~~
smegel
We are talking about packaging tools now?

~~~
abhirag
Kindly don't shoot the messenger (i.e. me), but pipenv does the work of pip +
virtualenv + requirements.txt, so "packaging tool" seems like a fair name for
it. Virtual environments aren't a lot of fun without local packages and
reproducibility, I assume.

~~~
wink
Last time I tried pipenv (a month ago?) there was a breaking unresolved bug on
Ubuntu 16.04.

I'm not one to hold grudges, but when such a critical low-level tool manages
to break on what I see as one of the most likely server environments to run
on... I'm holding off on converting the (annoying, but working)
virtualenv(wrapper) stuff. It's just not worth it, now that I've spent weeks
of my life handling the annoyance that is Python packaging and deployment.

------
sillysaurus3
What I really want is to be able to type "<magic> foo.py", where <magic>
creates a list of all necessary dependencies to run foo.py. Then "<magic> run
<that-list> foo.py" would automatically do whatever voodoo needs to be done to
run foo.py regardless of where you run that command.

The idea is that we all have global pip installations, which is great for
hacking. I have a ton of Python scripts that depend on my pip installs. All my
scripts run successfully on my box. The goal is to be able to send you one of
my scripts along with <that-list>, then you can run "<magic> run <that-list>
foo.py" on your box, and everything "just works."

That way there is literally zero configuration. I don't have to tell any tool
anything, and it's effortless for you to run it with all the benefits of a
virtualenv.
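
No tool in this thread does exactly that, but the first half of "<magic>
foo.py" -- discovering what a script depends on from the script itself -- can
be sketched with the stdlib `ast` module. Mapping import names to pip package
names (e.g. `bs4` vs. `beautifulsoup4`) is the genuinely hard part and is
omitted here; this just reports top-level import names.

```python
# Sketch of the dependency-discovery step of "<magic> foo.py": collect the
# top-level module names a Python script imports.
import ast


def imported_modules(source: str) -> set[str]:
    """Return the top-level module names imported by Python source code."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            # "import numpy.linalg as la" -> "numpy"
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            # "from os import path" -> "os"; skips relative imports
            names.add(node.module.split(".")[0])
    return names

# e.g. imported_modules(open("foo.py").read())
```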

~~~
rhaps0dy
Sounds almost like `pip freeze > requirements.txt`. This creates a list of all
your currently-installed packages, which you can install with `pip install -r
requirements.txt`.

The biggest problem with this, though, is that it makes the other person
install _all_ your installed packages, rather than just the ones the
particular script needs.

------
doozy
The preferred environment for Python development and deployment changes more
than the latest trend in JavaScript frameworks.

~~~
gjjrfcbugxbhf
Nope, it's been virtualenv for at least 3 years.

~~~
lgunsch
I've used it continuously for 6 years.

~~~
gjjrfcbugxbhf
Yeah, I've done so for three -- I'm happy enough with it, and confident enough
to recommend it, that I didn't bother to look up how long it's been around.

------
mmerickel
I'll take the chance to plug `vrun`
([https://pypi.org/project/vrun/](https://pypi.org/project/vrun/)) here as
well, which is a very lightweight wrapper around activate that only applies to
the current command. You can use it to start a subshell if you wish.

The fundamental problem is that virtualenv scripts are not actually bound to
their virtualenv. If you forget to activate, and a script shells out in a
subprocess to execute another script, it will not find the version installed
into the virtualenv. This should be fixed upstream, but so far it doesn't look
like it will be.

    $ python3 -m venv env
    $ env/bin/pip install vrun
    $ env/bin/pip install foo   # assume foo depends on bar and tries to start bar via subprocess
    $ env/bin/foo               # fails because bar is not on the path
    $ env/bin/vrun foo          # works

`foo` is now executed in a process in which the virtualenv is fully activated
and will find appropriate dependencies on the path if it tries to run a
subprocess, etc.
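
For subprocess purposes, "activation" boils down to two environment changes:
set `VIRTUAL_ENV` and prepend the venv's `bin/` to `PATH`, so anything the
process execs resolves the venv's scripts first. A sketch of that idea (not
vrun's actual implementation):

```python
# Rough sketch of what activating a venv does to the environment a
# subprocess would inherit.
import os


def activated_env(venv_dir: str, base_env: dict[str, str]) -> dict[str, str]:
    """Return a copy of base_env with the venv's bin/ first on PATH."""
    env = dict(base_env)
    env["VIRTUAL_ENV"] = venv_dir
    bindir = os.path.join(venv_dir, "bin")  # "Scripts" on Windows
    env["PATH"] = bindir + os.pathsep + env.get("PATH", "")
    return env

# A wrapper like vrun would then run:
#   subprocess.call(cmd, env=activated_env(venv_dir, dict(os.environ)))
```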

~~~
ofek
When you get a chance, try out Hatch and its `use` command i.e. `hatch use
myenv pip install ...`

[https://github.com/ofek/hatch/blob/master/COMMANDS.rst#use](https://github.com/ofek/hatch/blob/master/COMMANDS.rst#use)

------
scaryclam
Why not just plain virtualenv?

~~~
mason55
There's a link in the article which explains why

[https://gist.github.com/datagrok/2199506](https://gist.github.com/datagrok/2199506)

~~~
bdarnell
You can use virtualenv without `bin/activate`. Just refer to `bin/python` (and
other scripts in `bin/`) explicitly. Don't bring in the magic until you need
it. And when you need it, consider what kind of magic you need. If you're
typing `bin/` too much, do you need something that adds `bin/` to your path,
or do you need to write a script for this task that saves you from having to
type both `bin/` and a bunch of other command-line flags?

(I do add my virtualenv's `bin/` directory to my path, but mainly so my editor
can find the right tools -- and I do it from elisp rather than bin/activate in
a shell.)
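
The task-script suggestion above might look something like this -- a wrapper
that always invokes the project venv's interpreter explicitly and bundles the
flags you'd otherwise retype. `mytool`, `dev.ini`, and the `env/` layout are
made-up examples:

```python
# Hypothetical task script: no activation needed, because the venv's
# interpreter is addressed by its full path.
import os


def build_command(project_dir: str, extra_args: list[str]) -> list[str]:
    """Full argv for running the project's tool via its venv interpreter."""
    python = os.path.join(project_dir, "env", "bin", "python")
    return [python, "-m", "mytool", "--config", "dev.ini", *extra_args]

# A real script would end with:
#   cmd = build_command(os.path.dirname(os.path.abspath(__file__)), sys.argv[1:])
#   os.execv(cmd[0], cmd)
```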

~~~
mason55
How do you manage dependencies? One of the best parts about virtualenv (to me
at least) is that I can do pip install without infecting the rest of the
system.

I guess I've never looked -- does virtualenv provide a bin/pip which already
knows the path of the virtualenv? Does the bin/python in there come with a
path that sees the project's pip installs?

~~~
lgunsch
There is a bin/pip you could call directly and have it install packages to the
correct path. Even commands installed from packages work that way.

I would only use 'bin/python' directly from scripts called from cron, or other
similar places.

------
rye-neat
As someone who is finally getting into Python development (via Django) after
following tons of tutorials over several years: should I be using pew,
virtualenvwrapper, or something else? This article is from 2015, how relevant
is it today?

~~~
lgunsch
Most people I've met use virtualenvwrapper. It doesn't work on other shells
like fish, though. Either way, I'd simply focus on learning how virtualenv
itself works, and use whichever management tool you like on top of it.

~~~
BerislavLopac
Virtualfish is amazing:
[https://github.com/adambrenecki/virtualfish](https://github.com/adambrenecki/virtualfish)

~~~
lgunsch
Yeah, it works really well. That is what I use for my fish shell.

------
yagyu
I still use vex and am very happy with that

[https://pypi.python.org/pypi/vex](https://pypi.python.org/pypi/vex)

[https://news.ycombinator.com/item?id=7987259](https://news.ycombinator.com/item?id=7987259)

------
shtsh
Why not use pyenv + pyenv-virtualenv ?

[https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)

[https://github.com/pyenv/pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)

~~~
geoelectric
I personally found that the way pyenv conflates Python versions and
environments produced confusing behavior around `pyenv local` settings and
other features. At the time I tried it, there was also some opinionated
behavior around shell prompt modification that kept it from playing nicely
with my prompt theming.

Ultimately, I decided I didn't really want pyenv determining how I used
virtualenv; I just wanted virtualenv to work well under pyenv.

The opinionated dynamic may have changed since then, but for me the happy
medium has been pyenv-virtualenvwrapper. Instead of pyenv trying to take
responsibility for my virtualenv behavior, it just takes responsibility for
implicitly enabling virtualenv+wrapper on any given python version on first
use. That's the right split for me.

------
ris
Use neither. Use Nix.

------
ofek
I'd say [https://github.com/ofek/hatch](https://github.com/ofek/hatch) is the
more robust (and easier) choice for virtual env management.

FD: I'm the author

------
sivvy
Why not use conda?

~~~
jdiez17
Because conda (and by extension, Anaconda) harms the Python ecosystem with
their non-standard package format. If Continuum Analytics cared about making
scientific packages like numpy and scipy more accessible, they would make
binary wheels for Linux/Windows/OSX with the MKL. There is no technical reason
that prevents them from doing this. Instead they lock you in to their
package/environment manager and create confusion when you have pip and conda
packages installed at the same time.

~~~
RayDonnelly
Conda is a package manager for more than Python. It manages libraries that are
shared between Python and R and other tools. This isn't possible with a purely
Python-based solution.

It also does envs better than most other solutions, since it prefers
hardlinks, which gives you environment isolation at a very small cost in disk
space.

Full disclosure: I work for Anaconda Inc. where I am trying my best to make
package management and scientific computing easier.

~~~
jdiez17
Shared libraries were invented at a time when disk space was very scarce. In
my opinion, the fact that we still use dynamic linking today is largely an
accident of history. Literally the only advantage over static linking (or
distributing any shared libraries along with your application) is that you
save some space, at the cost of requiring a complex package management system.
Also, a sufficiently smart filesystem can deduplicate this transparently.

And of course, it makes distribution of any application that uses shared
libraries a whole lot more difficult.

Disk space is the cheapest and most abundant computing resource in 2017. If
you want to make these packages more widely available, just create
wheels/whatever the R equivalent is. Good, well understood, and interoperable
tooling for these already exists.

~~~
kalefranz
Dynamic linking is hugely helpful when, for example, you want to update to the
latest openssl without updating half the binaries on your system. Packages
statically including openssl in wheels, and those wheel versions then being
explicitly pinned in projects, introduces some juicy attack vectors.

------
ythn
Why not docker? One docker image = one virtualenv

~~~
fujiters
This is my thought too. Bonus: it's like virtualenv for _every_
language/platform.

