It's actually not, it's part of setuptools/distribute, though some Python distributions (actually just brew that I know of) include distribute alongside Python.
Also, while the quick skim of the rest of this looks mostly good, there's some unnecessary advice which complicates things.
virtualenv is not necessary if the crux of the advice is "don't install packages globally" (which is fantastic advice, second probably to the more important advice "don't install stuff with sudo").
What beginners (and even some more experienced programmers) need to learn about is --user. You install packages with `pip install --user thing`, and they become per-user installed (be sure to add `~/.local/bin` to your `$PATH`). This is enough to not require sudo and for 99% of cases is actually sufficient.
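In practice that's just something like this (the package name is only an example, and the exact per-user bin path varies a bit by platform):

    pip install --user requests
    export PATH="$HOME/.local/bin:$PATH"    # e.g. in ~/.bashrc, so per-user scripts are on PATH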
There is a rare case where you actually have packages that depend on different, conflicting versions of another dependency. This has happened ~1 time to me.
Don't get me wrong, I like virtualenv for other reasons (relating to workflow and maintenance), but if we're teaching people about packaging, there's no particularly great reason to dedicate most of a tutorial on it.
Is this relevant advice for packages needed by your web server account, e.g. www-data?
I've eagerly read pieces like this but haven't yet found out the reason this solution is problematic or that I'm doing it wrong. Just that no one else seems to be recommending it. Anyone have an idea?
Btw, one of the best discussions of the various deployment options I've seen is from the Pylons book: http://pylonsbook.com/en/1.1/deployment.html#choosing-or-set...
One other thing I'm not happy about regarding packaging best practices (and PyPI) is that security updates can't be automated, which leaves you running vulnerable packages.
Go is looking more attractive day by day.....
"Well, stop doing it then..."
Jests aside - Go is appealing for many reasons but your self-inflicted pain is not necessarily one of them. ;-)
Uninstall everything and start over?
pip freeze > pip_list_with_sudo.txt
sudo pip uninstall -r pip_list_with_sudo.txt
Lots of benefits, but trading 'cd path/to/my/project && source env/bin/activate' for 'workon project_env' (with autocomplete) is alone easily worth the five seconds it takes to check it out.
And/or use the virtualenvwrapper plug-in for oh-my-zsh. Automatically activates virtualenv when you cd to the working directory.
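For anyone who hasn't tried it, the day-to-day flow is roughly this (assuming virtualenvwrapper is installed and sourced in your shell):

    mkvirtualenv project_env    # creates ~/.virtualenvs/project_env and activates it
    workon project_env          # later, from any directory, with tab completion
    deactivate                  # drop back to the system Python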
Either way, nice article. Now if only most packages weren't still for Python 2... PyPI says it has 30099 packages total, but only around 2104 of them are for Python 3 (according to the number of entries on the "Python 3 Packages" page).
Edit: perhaps you changed your .npmrc or set the option via npm and forgot about it? I just checked on a fresh user, and 'npm install -g' definitely tries to install to /usr, just like pip.
No, why would I do something like that on a development box? (Or run web stuff on Windows servers for that matter.) And pretty much the only things I install with -g are useful CLI tools - any code I write will have its dependencies installed locally and listed in package.json for 'npm install'.
As for why you would run stuff on Windows, perhaps you were writing an AJAX gateway to a legacy system and it made more sense to run the Node server on the same machine as the legacy system?
(To be clear, I would pity you if that was the case, but you never know ;-)
The primary use case of npm is quite different. No one installs system-wide npm packages.
Virtualenv solves a different problem (creating a complete Python environment inside a directory) so you can replicate the various production setups on your machine and develop against them. It's not a way to avoid needing admin privileges to install system software; for that you can just pip install --user, use Homebrew, whatever.
Based on the article I'd say the main reason to use it is so that you can have what amounts to local packages instead of having to rely on global packages (be they system-wide or user-specific). This is what npm does by default - packages are installed locally to node_modules.
And for replicating production setups I'd rather take it a step further and use something like Vagrant instead of replicating just one part of the setup (Python).
pip install --user XXX
Next one is happening next Thursday and there are still a couple of tickets:
Unless you are using Windows, as pip doesn't support binary packages.
In general what are the best practices for using virtualenv with version control?
Generally you should strongly avoid putting generated artefacts into version control. This leads to complete pain if ever you find yourself trying to diff or merge when they inevitably change. The problem is that you end up with conflicts which are completely unnecessary - you should always be able to just regenerate the virtualenv at any time.
This is especially true for non-relocatable artefacts (as others have mentioned) such as virtualenvs or compiled binaries.
Another thing is that these generated artefacts can be costly in terms of space consumed in the repository - maybe not so much for a virtualenv with one package in it, but for binaries or larger virtualenvs, these things can become quite large. In addition they're often not so friendly for git's delta compression which is better suited for textual data. You can end up unnecessarily increasing the size of your repository significantly, which is another thing best avoided.
I keep my virtualenvs in ~/.virtualenvs/, away from the project.
If your project is a package itself (i.e. it has a setup.py file), then use that file to specify dependencies. On a new machine I check out a copy, create a virtual env and activate it. Then in the local copy I run "pip install -e .". This installs all the requirements from setup.py in the virtualenv, and links the local copy of my project to it as well. Now your package is available in the virtual env, but fully editable.
If your python project is not a package, you can install its dependencies in a virtual env with pip. Then run "pip freeze" to generate a list of all installed packages. Save that to a text file in your repository, e.g. ``requirements.txt``. On a different machine, or a fresh venv, you can then do "pip install -r requirements.txt" to set everything up in one go.
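So on a fresh clone the whole dance is roughly (the directory name 'env' is just a convention):

    virtualenv env
    source env/bin/activate
    pip install -e .                   # project with a setup.py: editable install plus its dependencies
    pip install -r requirements.txt    # or, for a project without a setup.py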
Document it in setup.py:
import sys
if sys.version_info < (2, 6, 0):
    sys.stderr.write("Foo requires Python 2.6 or newer.\n")
    sys.exit(1)
Just add the env directory to your .gitignore
Also, IIRC the environment will contain a symlink to the Python executable and companion files. That symlink will change depending on the environment.
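Something along these lines in .gitignore covers it (assuming the env lives in ./env):

    env/
    *.pyc    # compiled bytecode is another generated artefact you don't want in the repo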
When you create the virtualenv, the current package you're working on doesn't get added to site-packages, so you're forced to be at the repository root to import the package.
The best approach is to have a proper setup.py file so you can do `python setup.py develop`, which will link the package you're working on into the virtualenv's site-packages. This way it acts as if it's installed and you can import it any way you like.
If you define your requirements on the setup.py (I think you should), you can even skip the `pip install -r requirements.txt` step.
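A bare-bones setup.py for that looks something like this (the name and dependencies are placeholders):

    from setuptools import setup, find_packages

    setup(
        name="myproject",
        version="0.1.0",
        packages=find_packages(),
        install_requires=[
            "requests>=1.2",    # example dependency
        ],
    )

With that in place, `python setup.py develop` (or `pip install -e .`) pulls in install_requires for you.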
I've cooked up a package template that can help get this working:
How do you verify package integrity? Do you simply pray that PyPI isn't compromised at the moment, or do you download your packages from Github instead, because the main repositories have more eyeballs on them?
How do you do security updates with pip?
I'm using apt-get at the moment which gives me security updates AFAIK, but my need is growing for more recent versions and certain packages that aren't accessible with apt.
You might also like to check out wheel, which allows you to compile signed binary distributions that you can install using pip.
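Very roughly (this assumes the wheel package is installed; older pip versions also want --use-wheel, and I'm leaving the signing step out):

    pip install wheel
    python setup.py bdist_wheel    # builds a .whl into dist/
    pip install dist/*.whl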
Python actually has another, more primitive, package manager called easy_install, which is installed automatically when you install Python itself. pip is vastly superior to easy_install for lots of reasons, and so should generally be used instead. You can use easy_install to install pip as follows:
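    easy_install pip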
Which covers a lot of the same ground, but with buildout.
(1) specifically state that `pip freeze` is how to create a requirements file (as folks have said in the comments already)
(2) add "further reading" link on VirtualEnvWrapper, as it adds some convenience methods to ease use of VirtualEnv
(3) the "yolk" package shows you what's currently installed; it can be helpful to `pip install yolk` then run `yolk -l` to view the different packages installed in each of your envs.
(4) when installing a package, you can specify the version, e.g. `pip install mod==1.1`, whereas `pip install mod` will simply install the latest version
In the real world, a typical pip requirements.txt file will have a mix of package names (which pip looks up and downloads from an index like PyPI), git repo URLs (eggs installed directly from a git server, e.g. from GitHub), and bleeding-edge editable `-e git-repo#egg=eggname` URLs that track the latest changes. Being able to switch between these with ease is important, e.g. to switch to using your fork of some package rather than the last official release.
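To illustrate, such a requirements.txt might look something like this (all names and URLs are made up):

    Django==1.5.1
    requests>=1.2
    git+https://github.com/someuser/somepackage.git#egg=somepackage
    -e git+https://github.com/someuser/otherpackage.git#egg=otherpackage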
I'd never have believed a day would come where something like Homebrew would ever gain the traction it has.
Think of virtualenv as a way to package a complete Python environment together with the package you need or are working on. You can then run it from:
* the bash command line
* or a daemon manager like supervisord
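Either way, you can point straight at the env's interpreter (paths here are made up), no activate step needed:

    /srv/myapp/env/bin/python run.py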
Do you just hand the user an automatic virtualenv script? (Outside of using one of the binary builders out there, obviously.)
However, I was very surprised that the author didn't mention venv (http://docs.python.org/3.3/library/venv.html) at all since it is basically virtualenv but part of the Python standard library.
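For reference, that's just the following on 3.3+ (note pip itself isn't bundled until 3.4):

    python3 -m venv env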
That said, dependency hell is always tricky, and I've had to deal with some far uglier solutions in other platforms.
You might call it a workaround because it's not the most elegant solution possible, but it still just works really well, so why complain?
Dealing with dependencies is always step 2 for me when learning a new language, and dependency hell seems like a universal problem. I could be wrong though.
Do I package virtualenvs in an RPM or DEB?
Unix fragmentation of where the bin and lib directories should reside, i.e. /bin, /usr/bin, /usr/local/bin, ~/bin, ...
Windows doesn't have symlinks and the different packaging tools have tried to implement the functionality in various different ways.
Python doesn't add the path of the "main" executed file to the module lookup path. (edit: actually, I think this is wrong. I meant to say "Python module import lookup is complicated.")