I thought this was pretty common knowledge. I have a couple 'pet' virtual environments that aren't particular to what I'm working on, and I just use whichever one I want as necessary, either by directly calling the python binary in them or (more often) by activating the one I want.
> As long as you run your program using the virtual environment's Python, it gets to use all the modules you installed in the venv. It doesn't matter where the program is and you don't have to move it from its current location, you just have to change what 'python' it uses.
This is common knowledge. People just need to understand that by activating the virtualenv, your path precedence starts with the virtual environment. You can store your environments anywhere you want. You could even have a dev folder with a stack of all your usable virtualenv configurations, which you activate as necessary.
The same holds for conda, although with a small caveat: any pip installation that you do with conda may, by default, end up in the global pip cache. Virtual environments, however, create their own separate, isolated pip caches, which makes them pretty useful.
tldr: virtualenvs work anywhere and have a completely isolated set of python packages. The same doesn't hold for conda (unless you're doing conda install instead of pip install).
I think it isn't quite common knowledge that you don't "need" to activate a virtual environment to use it. For years, I thought that activating a virtualenv changed PATH (so I would get the right binaries in my shell) and another environment variable like PYTHONHOME/PYTHONPATH (so that python would know where to load modules from). It turns out it _only_ sets PATH, which is non-obvious.
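That's easy to check from a shell. A quick sketch, assuming a POSIX shell and `python3` on PATH (activate also sets `VIRTUAL_ENV` and tweaks the prompt, but notably not PYTHONHOME or PYTHONPATH):

```shell
python3 -m venv /tmp/demo-venv    # the venv can live anywhere
. /tmp/demo-venv/bin/activate     # prepends /tmp/demo-venv/bin to PATH
command -v python                 # now resolves to /tmp/demo-venv/bin/python
echo "PYTHONHOME=${PYTHONHOME:-<unset>} PYTHONPATH=${PYTHONPATH:-<unset>}"
deactivate                        # restores the old PATH
```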
As far as I can tell, it turns out that there's some additional magic in the python binary so that it "knows" when it's executed from a symlink to try and add ../lib/pythonX.X/site-packages/ to sys.path, but for the life of me I can't find where this is documented or how exactly it works.
PEP 405 [1], which added virtual environment support to the Python standard library, documents how the python binary attempts to determine sys.prefix, which is the key thing that has to be set to the virtual environment directory in a virtual environment. See the "Specification" section.
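Concretely, the marker PEP 405 specifies is a `pyvenv.cfg` file sitting next to the interpreter: when the binary finds one, it sets sys.prefix to the venv directory while sys.base_prefix keeps pointing at the base installation. A small sketch that demonstrates this with a throwaway venv (assumes Python ≥ 3.3 with the stdlib venv module):

```python
import os
import subprocess
import tempfile
import venv


def venv_prefixes():
    """Create a throwaway venv and report (sys.prefix, sys.base_prefix)
    as seen by *that venv's* interpreter."""
    with tempfile.TemporaryDirectory() as tmp:
        env_dir = os.path.join(tmp, "env")
        venv.create(env_dir, with_pip=False)
        # pyvenv.cfg is the marker file PEP 405 specifies
        assert os.path.exists(os.path.join(env_dir, "pyvenv.cfg"))
        exe = os.path.join(env_dir, "Scripts" if os.name == "nt" else "bin", "python")
        out = subprocess.check_output(
            [exe, "-c", "import sys; print(sys.prefix); print(sys.base_prefix)"],
            text=True,
        ).splitlines()
        return out[0], out[1]


prefix, base_prefix = venv_prefixes()
# the venv's python reports the venv as sys.prefix, with no activation involved
print(prefix != base_prefix)
```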
So, does this mean my understanding of "activate" setting PYTHONHOME (and thus being required for using a virtualenv) was true before this PEP was accepted? Looks like python3.3?
> does this mean my understanding of "activate" setting PYTHONHOME (and thus being required for using a virtualenv) was true before this PEP was accepted?
I believe that's what the activate scripts for third-party virtualenv tools did, yes.
> Looks like python3.3?
Yes, that's the version for which the PEP was accepted.
My convention for about the last ten years has been that every Python project I work on gets a `ve` directory where I install the virtualenv. Then `./ve/bin/pip install -r requirements.txt` and `./ve/bin/python myscript` are just kind of hard-wired into my muscle memory. I do a lot of Django and on Django projects almost everything is done via `manage.py`, so I just change the shebang line in it to `#!ve/bin/python` and then I can do `./manage.py whatever` and it always works. I never have to remember to "activate" virtualenvs and I never accidentally run a script with the wrong virtualenv activated. When I pair with other Python developers, I'm always amazed at how much time and focus they have to put into keeping track of that stuff and how much confusion it causes.
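The whole workflow fits in a few commands. A sketch, where `ve`, `requirements.txt`, and `myscript.py` are just the names from the comment above:

```shell
python3 -m venv ve                        # one venv per project, in ./ve
./ve/bin/pip install -r requirements.txt  # installs into ./ve only
./ve/bin/python myscript.py               # runs with ./ve's packages, no activate needed
```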
This comment made me wonder if there's a tool out there akin to ~~autoenv~~ direnv [1], but that automatically activates a virtual environment when you cd into a directory with e.g., a poetry.lock.
Surprisingly, it seems like the answer today is no, but there's a little bash snippet that wraps cd to give us a start [2]. Just dropping `poetry shell` in an `.envrc` for direnv is better than nothing too.
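For the direnv route, a minimal `.envrc` sketch, assuming poetry is installed and the project's venv already exists (`poetry env info --path` prints that venv's location):

```shell
# .envrc -- direnv runs this on cd (after a one-time 'direnv allow')
export VIRTUAL_ENV="$(poetry env info --path)"
export PATH="$VIRTUAL_ENV/bin:$PATH"
```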
My virtual environments all live in a directory structure akin to
This lets me both segment environments by name (aka by project) and have distinct versions of these environments if I need to use different python versions for whatever reason.

This is _entirely_ separate from the source code of my projects, which might live somewhere like the following:
Another fun trick is using python entry points (https://packaging.python.org/en/latest/specifications/entry-...) which wrap the actual execution of a python script with something akin to:

This has the effect of always using that virtual environment's python version, even if I have not "activated" the environment or messed with my path in any way.
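For reference, a console-script entry point is declared in the packaging metadata; in a modern pyproject.toml it looks like the fragment below (`mytool`, `mypackage`, and `main` are hypothetical names). When pip installs the package into a venv, the generated `mytool` script gets a shebang pointing at that venv's python, which is why it keeps working without activation:

```toml
[project.scripts]
mytool = "mypackage.cli:main"  # hypothetical: running 'mytool' calls mypackage.cli.main()
```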