
My Python Development Environment, 2018 Edition - craigkerstiens
https://jacobian.org/writing/python-environment-2018/
======
nickjj
I'm a big fan of using Docker because for real-world web app development,
setting up your app usually involves more than just Python and a virtualenv.

Earlier this week I wrote about the pains of setting up a Python development
environment without Docker, and compared it to the Docker workflow.

If anyone is curious, that's located at
[https://nickjanetakis.com/blog/setting-up-a-python-developme...](https://nickjanetakis.com/blog/setting-up-a-python-development-environment-with-and-without-docker).

By the way, I would say Docker is anything but slow. I get near instant
development feedback on my Flask applications, even when running things
through Docker for Windows / WSL.

These are pretty big Flask apps too, which have thousands of lines of code,
dozens of packages, tons of assets and require running Celery, Postgres,
Redis, etc.

~~~
Spiritus
My problem with using Docker (only) is that it doesn't translate well to
editors. Like, using jedi-vim[1] with a virtualenv constructed by a Docker
container doesn't work at all. Unless I actually run vim itself inside said
container.

So unless your dependencies build on macOS (as in my case), everything goes
out the window.

[1] [https://github.com/davidhalter/jedi-vim](https://github.com/davidhalter/jedi-vim)

~~~
_drFaust
We've got 30+ backends in Python, all wrapped in Docker containers. The
majority of the team was pure vim before I joined, and they're slowly
converting to PyCharm after seeing how nicely you can set up a remote
interpreter against a Docker container. And it has vim bindings, so you don't
have to re-learn new hotkeys.

I've also been following this VS Code issue on adding remote Docker support
for Python: [https://github.com/Microsoft/vscode-python/issues/79#issueco...](https://github.com/Microsoft/vscode-python/issues/79#issuecomment-366000230)

However, if you're a hardcore vim guy then I doubt these IDEs are gonna
satiate your current flow.

~~~
wirrbel
I remember that on my laptop with PyCharm and Docker (and the virtual machine
Docker lived in), the RAM usage was just excessive. I am by no means a
minimalist, but data-science work was nearly impossible with 16 GB of RAM.

Also, I strongly dislike PyCharm. I am a vim guy at heart, but I am generally
not against IDEs. VS Code is okayish. For C++ development, I really loved
Visual Studio. But PyCharm just feels wrong: bloated, slow, and baroque.

------
MarcScott
Since switching to NixOS, my Python development environment couldn't be more
satisfying.

I use a default.nix file and a requirements.txt file and then with a single
command I'm into a shell and virtual environment with all dependencies and
packages installed, that I can easily transfer between machines.

That is unless I want to use PyQt5.

~~~
nextos
Nix is one of the most amazing things created lately. IMHO, it doesn't get the
attention it deserves as it provides great solutions to really tough problems
and it's ready for prime time.

A purely functional package manager, distro, and devops toolchain, and pretty
soon, home-directory management too. Maintaining servers or making aggressive
changes becomes very easy. There's even a Darwin (macOS) implementation now,
so you can manage most of your Mac functionally (with heavy use of `defaults`
under the hood).

It's still lacking a bit in usability, as some things are not as intuitive as
in a simple imperative distribution such as Arch or Alpine, especially if you
need to run some prepackaged software that assumes the FHS and binds
dynamically to pre-existing dependencies.

But to me, if you are a moderately advanced user, right now I don't see the
point in running distros that are stuck in the middle (imperative, complex,
lots of defaults). It's either one extreme (simple and imperative, e.g. Arch)
or the other (functional, NixOS).

~~~
amasad
Excuse me if this question is too basic but how does a purely functional
package manager work with a side-effectful package installation?

In Python (or basically any other language package manager) you can run
arbitrary scripts (post-install etc) and so this doesn't lend itself nicely to
a reproducible, functional approach.

~~~
nextos
A quick and overly simplistic explanation is that all inputs (source code,
package dependencies or post-install scripts) used to build a package are
employed to compute a unique hash.

Then the result of building a particular package is installed in
/nix/store/hash-packagename. And this package links to other packages in the
nix store using precise hashes. There is no dynamic linking. So the result is
referentially transparent. A particular hash is guaranteed to correspond to
the same package version, built in the same way and linked to the same
dependencies. Furthermore, installing new or modified versions of a package
won't overwrite old ones, as their hashes differ.

The same concept applies to a whole system setup, which is identified by a
hash computed using all options that configure the system plus all packages
available in your environment.
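The core idea can be sketched in a few lines of Python. This is a toy illustration only; the `store_path` function below is hypothetical and nothing like Nix's real algorithm, but it shows why identical inputs always yield the same path and why a changed input never clobbers an old build:

```python
import hashlib

def store_path(name, version, source, deps, build_script):
    # Toy illustration of Nix-style content addressing: every input that
    # can influence the build result feeds into a single hash.
    h = hashlib.sha256()
    for part in [name, version, source, build_script, *sorted(deps)]:
        h.update(part.encode())
        h.update(b"\x00")  # separator so adjacent inputs can't run together
    return "/nix/store/%s-%s-%s" % (h.hexdigest()[:32], name, version)

# Identical inputs always map to the same store path...
a = store_path("flask", "0.12", "<src tarball>", ["<werkzeug hash>"], "build.sh")
b = store_path("flask", "0.12", "<src tarball>", ["<werkzeug hash>"], "build.sh")
# ...while changing any input (here, a dependency hash) yields a fresh
# path, so the previously built package is never overwritten.
c = store_path("flask", "0.12", "<src tarball>", ["<werkzeug hash 2>"], "build.sh")
```
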

------
lotyrin
I want a GUI for polyglot local development for cloud-native apps with a
variety of backing services that's as braindead and easy as *AMP apps were in
the bad old days when you'd be slinging PHP, (plaintext) FTPing up to a shared
host, working with guys for whom "server stuff" was pulling teeth and only
having one version of Apache, PHP and MySQL to worry about (old terrible
ones).

This finally seems within reach, due to where we are getting with containers
and orchestration. A bloated Electron app plus kube in a VM, maybe? Even that
much RAM is still cheaper than my time.

Dropdowns for picking out what runtime, what language, what database, what
cache, etc. picking where my code lives, then it boots everything up, solves a
local hosts entry + ssl cert and keeps my code synced.

I should be able to huck my macbook in a wood chipper to protect my private
keys from terrorists, unbox a new one, install my Jetbrains Toolbox, install
this thing, and be back up and running fixing responsive text wrapping "bugs"
on my marketing landing pages in minutes.

I should be able to hand my git repo to a potato whose wish to become human
was granted by a fairy yesterday owing to their exemplary potato-like behavior
and expect they can get the thing booted up and start blowing up my test suite
and arguing with me about indentation in minutes.

I should be able to receive news of a cool new framework in a cool new
language with a cool new runtime and a cool new database written by angels,
etched into crystal tablets discovered in the martian polar ice accompanied
with proofs that they are both feature complete and error-free then get going
on using these gifts bestowed unto mankind in a brand new repo to write a
microservice for filling people's inboxes with unsolicited promotions for male
enhancement supplements in minutes.

------
natch
I’m using Anaconda because it was recommended in a step by step tutorial for
playing with deep learning.

What would be involved in removing it from my system and moving instead to
this set of tools?

Not necessarily looking for a step-by-step answer, just for general
suggestions.

My guess is: find out which python the deep learning tools are using, remove
Anaconda, and reinstall the python version needed, using the tools from this
post. I’ll need to read up on the tools too. Any pitfalls with this approach?

~~~
james_the_cat
Anaconda does most of the stuff mentioned, and also makes it much easier to
install packages based on C/C++ libraries (which most deep learning things
are). So you're better off staying with Anaconda. It's widely used in
commercial data science projects, so the idea that no one "takes it seriously",
as someone else suggests, is a bit silly. I assume they're thinking of a
different context than data science projects.

That said, Anaconda does have a whole variety of extremely annoying quirks,
like packages not being backwards compatible with old versions of conda, or
conda going crazy and reinstalling itself, or the way the conda-forge repo has
far more packages than the official conda repo. It's very far from perfect.
But for data science, I think it's basically the standard package manager in
Python land.

~~~
Rotareti
_> ... and also makes it much easier to install packages based on C/C++
libraries_

I hear this often, though I cannot remember ever running into a pip package
where this was an issue. Out of curiosity, could someone point me to a pip
package and its conda equivalent where this is the case?

~~~
_coveredInBees
Try pip installing SciPy or NumPy and you'll see the value of conda.

~~~
wirrbel
As if that still was a problem, now that we have wheels, and especially
manylinux wheels.

~~~
pwang
See Myth #6 here: [http://jakevdp.github.io/blog/2016/08/25/conda-myths-and-mis...](http://jakevdp.github.io/blog/2016/08/25/conda-myths-and-misconceptions/#Myth-#6:-Now-that-pip-uses-wheels,-conda-is-no-longer-necessary)

~~~
wirrbel
Valid points. Still, most of them are not issues in my daily work.

And by the way Peter, thank you for the amazing work you do.

------
foobarandgrill
>Why? pipenv handles dependency- and virtual-environment-management in a way
that’s very intuitive (to me), and fits perfectly with my desired workflow.

Why specifically do you use it instead of virtualenv (+virtualenvwrapper)?

~~~
Rotareti
Pipenv combines package management and virtualenv management in one tool. You
can create a new project as simply as this:

        $ mkdir myproj
        $ cd myproj
        $ pipenv --python 3.6         # This creates a virtualenv with Python 3.6 for your project.
        $ pipenv install flask        # This installs flask in your virtualenv.
        $ pipenv run flask            # This runs flask in your virtualenv.
        $ pipenv run python           # This runs a REPL with the interpreter of your venv.
        $ pipenv shell                # This opens a shell in your venv.

Pipenv replaces _requirements.txt_ with _Pipfile_ and _Pipfile.lock_ (similar
to what you get from _npm_ or _yarn_ in the JS world, and _cargo_ in Rust).

If you use pipenv for a project, you no longer need to care about _pip_,
_requirements.txt_ or _virtualenv_.

Pipenv/Pipfile is the new standard recommended by Python.org/PyPA:

[https://packaging.python.org/tutorials/managing-dependencies...](https://packaging.python.org/tutorials/managing-dependencies/)

[https://github.com/pypa/pipenv](https://github.com/pypa/pipenv)

[https://github.com/pypa/pipfile](https://github.com/pypa/pipfile)

~~~
wirrbel
Let me add that if you use a virtualenv-based workflow, you will end up with
some kind of requirements list (requirements.txt); then you will realize that
you also want to version the configuration of that virtualenv with all
dependencies resolved (think `pip freeze`). You'll start to separate
development dependencies like pytest from production requirements like `six`,
and by this time you will have written a few scripts to deal with all this.

This is where pipenv delivers. It is a distillation of best practices for
virtualenv configuration.

In your overview, I'd just add an example for a dev installation:

        pipenv install --dev pytest

------
pjmlp
I just have the latest Python installed and the nice Python support in Visual Studio.

[https://www.visualstudio.com/vs/python/](https://www.visualstudio.com/vs/python/)

~~~
amenod
This, me too.

I never understood the need for virtualenv and similar. Do people really
encounter trouble with conflicting packages that often? I try to write scripts
so they run on different versions of python anyway, unless there is a very
specific reason why that is not possible; and even then you can run python
versions in parallel on a Debian/Ubuntu box, with different pip installs for
each of them.

As for production, I usually ship things in a Docker container anyway, so
there is no chance of mismatched libraries.

I guess I just never saw a problem that virtualenv solves.

~~~
falcolas
I've found that library dependency issues appear in direct proportion to the
number of 3rd-party libraries you use. If you're mostly developing against
the standard library with a handful of other stable libraries, it's not an
issue. If you're developing with a requirements.txt (or Pipfile) with 10 or
more entries, you'll start running into conflicts on a regular basis.

It can also occur as projects age, especially with 3rd-party libraries that
don't provide API backwards compatibility (which I fully acknowledge is a PITA
to develop for, requiring more effort than is frequently justifiable).

Both of these are why I prefer to use the standard library when possible. It's
going to remain API stable for a very long time, and the occasional 2-3 lines
of boilerplate to do HTTPS requests (and other similar convenience functions)
is a cost I'm willing to bear for that stability.
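For a sense of scale, the standard-library boilerplate being referred to looks something like this sketch using `urllib.request` (the hypothetical `fetch_text` helper is mine, not from the comment):

```python
import urllib.request

def fetch_text(url, timeout=10):
    # The few lines of standard-library boilerplate that `requests`
    # replaces for a plain GET request: open, pick a charset, decode.
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset)

# e.g. html = fetch_text("https://example.com")
```

A few extra lines compared to `requests.get(url).text`, but with zero third-party dependencies to conflict.
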

~~~
speedplane
I think pretty much everyone would "prefer to use the standard library where
possible". But try building a modern web app with that. No one uses libraries
because they want to, they use b/c they need to.

~~~
falcolas
> No one uses libraries because they want to, they use b/c they need to.

I'll have to respectfully disagree with this one. Everyone pulls in requests
the moment they have to make any kind of http request for the sole reason that
it's more ergonomic, not because it's "needed".

And requests brings in 4 of its own dependencies. Right there you've created a
prime chance for everything to go sideways (and I've watched it happen
explicitly with the requests library as it bolts on more and more ergonomic
features).

For what it's worth, the last web app I built had a DB library from the OS
vendor, flask, and gunicorn. All of which, since they were quite stable, never
introduced library conflicts.

------
linuxftw
I prefer virtualenv + pip.

I primarily develop on Fedora which ships both Python 2.7 and Python 3.6.

pip and venv are built into Python 3, so there's no need to install anything.
Additionally, these tools + tox are commonly used for testing in popular
Python projects [0]. I have found that colleagues who use other tools struggle
with pip and virtualenv, and it puts them at a disadvantage when it comes to
working with the larger Python software community, IMO.

[0]
[https://github.com/pallets/flask/blob/master/.travis.yml](https://github.com/pallets/flask/blob/master/.travis.yml)

------
meddlepal
I have added PyInstaller
([http://www.pyinstaller.org/](http://www.pyinstaller.org/)) to my toolchain
recently for Python app distribution. It gets me pretty close to the Golang
single-distributable-executable ideal... the main issue is needing to build on
each target OS, which kind of sucks, but I can deal with it.

~~~
MereInterest
I've used PyInstaller on Wine in order to make Windows executables from Linux.
I haven't figured out how to cross-compile for OSX yet, but that at least
reduces the requirement from three major OSes to build on to two.

~~~
book_mentioned
Have you had a chance to evaluate [https://pybee.org](https://pybee.org) ?

src:
[https://news.ycombinator.com/item?id=16419628#16420689](https://news.ycombinator.com/item?id=16419628#16420689)

~~~
RightMillennial
Other than promoting Python and "diversity", what exactly does PyBee/BeeWare
do? It sounds like an SDK, but it appears to just point to a bunch of random
GitHub projects. It's not very clear to me.

------
zbentley
> I use multiple OSes: macOS at work, and Linux (well, Linux-ish - actually
> it’s WSL) at home.

This is pedantry, and the article is otherwise quite informative, but
describing WSL as "Linux-ish" is like describing a speedboat as "car-ish"
because they both happen to have steering wheels.

WSL is a bloody _marvel_ of engineering, but it is in no way an equivalent of
Linux. I'm mentioning this because WSL proponents _and_ detractors tend to
miss the fact that understanding those differences is critical to
understanding and using--even in trivial ways--WSL itself.

------
twexler
What seems to be missing in the comments here is the `--user` option to pip.
Lets you install modules on a per-user basis, doesn't mess with system python.
All you need to do is add the bin folder this creates to your path.

~~~
jonwayne
pipsi installs into isolated virtualenvs, then symlinks into `~/.local/bin`
just like `pip install --user`. Combining pipsi for cross-project tools (tox,
twine, nox, etc.) with pipenv for project-specific packages is all you need.
You don't need to `pip install --user` some package at that point.

------
raduhek
I haven't gotten around to trying pyenv, pipsi, or pipenv, but for me,
virtualenvwrapper and tmux work great together.

I'm running linux at work and usually keep it alive for days, so tmux is used
for session keeping and remote work. I've created a few bash scripts which run
some tmux commands for setting up the layout as I want it and also execute
"workon" for the specific virtualenv.

For working on a new project, I have another bash script which I run with the
repo URL; it automatically clones it, creates a virtualenv with the same name
as the project, installs dependencies, and starts a new tmux session with two
panes in the first window and a second window for other stuff.

The great thing about tmux for me is its low memory footprint, so I can have
10-15 sessions running at a time without worrying about the computer slowing
down. What takes a bit too long is setting it all up again after a restart.

~~~
foobarandgrill
>What takes a bit too long is setting it up again upon a restart.

Have you tried this tmux plugin?
[https://github.com/tmux-plugins/tmux-resurrect](https://github.com/tmux-plugins/tmux-resurrect)
I use it all the time at work and at home.

------
miketery
Is there a command needed in pipenv like the one needed in virtualenv? E.g.

      > source env/bin/activate

How does one activate one environment over another?

Why is pipsi a separate thing?

~~~
rovr138
Based on install.rst[0]

It seems it keeps a ledger somewhere (I haven't dug into it). Then to run
commands, instead of using `python main.py`, you now use `pipenv run python
main.py` and it automates things. It still depends on virtualenv.

As an alternative, pyenv + pyenv-virtualenv work by creating the environments
in a separate folder. You can then `cd` into a project's root folder and run
`pyenv local x`; every time you `cd` into the directory or a subdirectory,
pyenv looks up the tree until it finds a `.python-version` file. This
specifies the environment, which can be a Python version or a virtualenv, and
pyenv loads it.

[0]
[https://github.com/pypa/pipenv/blob/4f2295a1dbf7fe6fa36ef4ec...](https://github.com/pypa/pipenv/blob/4f2295a1dbf7fe6fa36ef4ec92b758acf60e1710/docs/install.rst)
[1]
[https://github.com/pypa/pipenv/blob/4f2295a1dbf7fe6fa36ef4ec...](https://github.com/pypa/pipenv/blob/4f2295a1dbf7fe6fa36ef4ec92b758acf60e1710/docs/install.rst#-installing-
packages-for-your-project)

------
collyw
This would have been useful if he compared them to virtualenv, which as far as
I am aware is still the "standard" way of managing python environments.

I tried virtualenvwrapper a while ago, and it was basically just another set
of commands to do the same thing as virtualenv. Having already learned the
virtualenv commands, that gave me no real advantage. Are any of the tools
mentioned here any different?

~~~
speedplane
Does virtualenv handle different python versions? My understanding was that it
just handled different sets of 3rd party packages.

~~~
vivan
Yes, you can make virtualenvs with different Python interpreters.

------
erokar
Every time I use Python I miss NPM and package.json.

~~~
timc3
Every time I use JavaScript/NPM/WebPack I miss my sanity.

~~~
jordic
For my last project I set up parceljs with yarn and it's so easyyyy.. :) nvm
is my day saver :)

------
purplezooey
Seems like you could get all this done with conda. Just a thought.

~~~
odonnellryan
I'm all for trying new things, but yeah I did have a similar thought. Conda is
a good general-purpose tool that seems to check all these boxes and I've never
had any serious issues with it.

------
goodoldboys
As someone who has to manage projects for multiple clients, I find docker +
docker-compose to be the best solution. The overhead of Docker is totally
worth it because the container separation makes life so much easier.

All I need to work on a client project is basically:

    cd /path/to/project
    docker-compose build   # just once
    docker-compose up

~~~
is0tope
Absolutely. Though I have yet to have the chance to use it at work (large
enterprise), it has totally been a game changer for me in terms of creating
side projects in dev and seamless deployment. There is definitely some
overhead and not everything plays nice all the time, but the thought of going
back to running 5+ terminal windows / installing databases globally makes me
shudder.

------
vagab0nd
Forget python development, why is it so damn hard to USE python scripts in a
version-correct way in the first place? I have python scripts in my system
which rely on different python versions. And since I have a default python
version, 2.7 (through symlink or env or whatever), all the python 3 scripts
fail until I switch the default version manually (due to library dependencies,
print format, etc.). Why don't these scripts add "#!/usr/bin/python3" at the
top? And why is it so hard to just have 2 versions co-exist? I MUST be doing
something wrong here?

EDIT: "python3 <script.py>" doesn't always work because some scripts are
written in bash and they call python within the script.
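One partial mitigation, besides scripts shipping a proper `#!/usr/bin/env python3` shebang, is a guard at the top of each version-sensitive script. This is just an illustrative sketch (the `require_python` helper is hypothetical, not a standard idiom):

```python
import sys

def require_python(minimum=(3, 0)):
    # Fail fast with a clear message instead of a confusing SyntaxError
    # when the script is run under the wrong interpreter.
    if sys.version_info < minimum:
        sys.exit("This script needs Python %d.%d+, but is running under %s"
                 % (minimum[0], minimum[1], sys.version.split()[0]))

require_python((3, 0))  # a no-op under any Python 3 interpreter
```

It doesn't fix the wrapping-bash-script problem, but at least the failure mode becomes obvious.
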

------
mmsme
Pyenv looks great! I'm sad I didn't know about this sooner, though I do have
the luxury of using mostly one version of Python and have only confused it
with the system Python once or twice.

~~~
symmitchry
Pyenv is great. On our Macs we've had zero issues installing older, specific
versions of Python, every time. Highly recommended. (Getting it working
properly with zsh was a bit frustrating, but that's my own fault.)

~~~
tinymollusk
Do you know if it's easy to start using pyenv with existing projects, or
should I wait until my next de novo project?

~~~
bobwaycott
It’s pretty easy if you have a set of requirements handy and have
`pyenv-virtualenv`:

        $ pyenv virtualenv 3.6 some-name
        $ echo some-name > /path/to/project/.python-version
        $ cd /path/to/project
        $ pip install -r requirements.txt

That’s it.

------
speedplane
The first requirement in the article:

> I need to develop against multiple Python versions, including 2.7, various
> Python 3 versions (3.5 and 3.6, mostly), and PyPy.

The article states that this is an "unusual" setup, but unfortunately it is
all too common. The Python 2/3 split has created such a large and sad schism
in the development community, has wasted countless developer hours, and has
held back the language itself incalculably. A travesty.

------
oh-kumudo
Virtualenv + PyCharm, 3 years, happy as ever. For ML/DL stuff, Conda is also
highly recommended.

~~~
jordic
Same here. I'm also quite happy about the level of integration between PyCharm
and frontend things (eslint, babel, react...); I never have to switch editors
again. I tend to have a bash script for sourcing node versions, setting the
correct path for node binaries, and activating the env, and a makefile for
building Docker images. I try to keep all the tooling reproducible in the
project source.

------
sswaner
Is it considered best practice or advisable to run production deployments in a
virtualenv? I have always considered it to be a tool for managing multiple
development efforts on the same machine, not for production environment
management.

~~~
mixmastamyk
Many prefer it, though it may be considered redundant in a container.

------
dangerboysteve
Would it be easier to use individual docker containers, each their own python
environment and then have your source directory mapped to a docker directory?

~~~
odonnellryan
I guess, if you are cool with sharing the images when you need to share the
environment. I think I just never liked the "bulk" that comes with Docker,
though it has gotten better.

I think Docker is cool in general but for other stuff than this specific use-
case.

~~~
jordic
I use Docker for all the related services (postgres, redis, solr, zeo...),
but I use virtualenv and pip. I also use setup.py for declaring entrypoints
and coordinating all the services (I don't much like the docker-compose
approach). I'm also using docker-py and boto3 for the devops part.

------
brailsafe
Thank you St. Patrick. I recently tried to install an ancillary python based
cli tool and realized my python dev environment on my mac is fucked.

------
ben_jones
I haven't had to write Python in a while, but are there reasons to use
virtualenv etc. instead of a Docker container?

~~~
oh-kumudo
Docker is SLOW to build the image. Virtualenv is great for development.

~~~
zeptomu
But you do not have to build the image while developing (in Python). Just map
your project into the container using `-v`.

------
nravic
Mine is comparably simpler: Emacs + Elpy, and I just use pip and pyenv for
managing packages. Granted, I don't deploy.

------
avip
I don't understand this at all. It's 2018. My dev. env. for <insert thing> is
some text editor that knows how to "jump to definition" and "find all usages"
(this is sometimes referred to as "IDE"), and a bunch of Dockerfiles to build
and run the tests.

~~~
odonnellryan
So you're using Docker instead of virtual environments? I feel there is much
more overhead with Docker, but I could be wrong. Maybe I need to try it out
again!

~~~
scaryclam
No, it's pretty much the same as you remember it. Using docker instead of just
a normal virtualenv is overkill.

~~~
falcolas
Overkill - Yes and No. Yes, because you are spinning up a full VM to run a
docker container locally. No, because the container is also _the_ container
run in prod, with no opportunity for some other process to come along and hose
your otherwise clean install.

The process and FS isolation also make sysadmin-me all tingly inside. That way
_you_ can't hose up anybody else's clean install either (even if you're
compromised).

As a side note, on mac, the VM that runs docker containers requires less ram
and CPU than Hipchat. Go figure.

~~~
odonnellryan
I'd never deploy more than 1 client to a machine anyway, so isolation in a
security sense does not make much sense to me if I'm being honest.

But I understand. If the workflow works for you and/or your team, well, what's
the problem? It's working!

------
GrumpyNl
I'm old school, I just install XAMPP and I'm OK.

------
mychael
This seems more like the 2008 Edition. You have to be pretty stubborn to not
adopt Docker in 2018.

~~~
throwawayfinal
If you're building a native python application, there's no reason to adopt yet
another technology.

A native python app in "Docker/Containers" is going to look like a dockerfile
that includes a copy of the app and runs three commands. Either you build that
on the server (what's the point) or use a registry (additional complexity for
little benefit).

------
cup-of-tea
I've been using a similar setup which I found online [0] when I was looking
for a way to have multiple Python versions including working Jupyter notebooks
etc.

It's been working great for me. Pyenv and virtualenvwrapper are really good.
I'm not sure why this one needs pipsi, though. You can install CLI tools for
both Python 2 and 3 using just Pyenv as demonstrated above.

[0] [https://medium.com/@henriquebastos/the-definitive-guide-to-s...](https://medium.com/@henriquebastos/the-definitive-guide-to-setup-my-python-workspace-628d68552e14)

------
mlevental
there are way too many python dep/env managers/things

pipenv

pyenv

mkvirtualenv

virtualenv

pipsi

venv

pew

conda

virtualenvwrapper

i'm sure i'm forgetting like 5.

this is like [https://xkcd.com/927/](https://xkcd.com/927/)

for the record i use pyenv and virtualenv (although playing with ML i'm using
conda)

~~~
yeukhon
As a Python developer, I just stay with pip and virtualenv, but I share the
smh/wtf sentiment here. The data science folks enjoy conda but I don't see
that being useful to me anywhere else. I feel like every other year someone
will invent a new _pip_ _env_.

That _pyvenv_ (see comment section, edited) is now deprecated and _venv_
(shipped as part of the Python 3.6 installation) is recommended instead is
another confusion. Let's not forget the confusion of distutils vs setuptools,
much like argparse vs optparse in the past (which are both horrible). The
experience of using pip and PyPI (now Warehouse) is much better than that of
Ruby and of NodeJS, but these "2-in-1" tools are just ridiculously "creative".

As always, pro-tip: consider using the following to ensure the environment is
loaded properly when deploying to production:

        /full_path/env/bin/python myapp.py --workers=3

over:

        source /full_path/env/bin/activate && python myapp.py --workers=3

The latter is fine when you are doing local development in your terminals.
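A quick way to verify which interpreter and environment your process actually got, with either invocation, is a few lines of introspection (just a diagnostic sketch):

```python
import sys

# With the full-path invocation, sys.executable points inside the
# virtualenv; with a bare `python`, it may silently be the system one.
print("interpreter:", sys.executable)
print("prefix:", sys.prefix)
# In a virtualenv/venv, sys.prefix differs from the base installation.
in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix)
print("inside a virtualenv/venv:", in_venv)
```
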

If you go on #python IRC channel, every year a group of helpers will
collectively recommend one of the above and then perhaps a different one the
following year, so please do yourself a favor, just stick to _pip_ and
_virtualenv_.

~~~
Cyph0n
Conda is especially useful if you want to use scientific packages on Windows.

~~~
yeukhon
Good point, thank you. I don't do much on Windows anymore. But a little
searching on Visual Studio (I know there are some core developers working for
MSFT) yields this:
[https://stackoverflow.com/questions/15185827/can-pip-be-used...](https://stackoverflow.com/questions/15185827/can-pip-be-used-with-python-tools-in-visual-studio)

Probably exciting for VS users.

------
yorby
> and Linux (well, Linux-ish - actually it’s WSL) at home.

Is that really considered Linux at all? I would not say that Wine is Windows,
for example.

