
Ask HN: How do you handle/maintain local Python environments? - PascLeRasc
I'm having some trouble figuring out how to handle my local Python. I'm not asking about 2 vs 3 - that ship has sailed - I'm confused about which binary to be using. The way I see it, there are at least 4 different Pythons I could be using:

1 - Python shipped with OS X/Ubuntu

2 - brew/apt install python

3 - Anaconda

4 - Getting Python from https://www.python.org/downloads/

And that's before getting into how you get numpy et al installed. What's the general consensus on which to use? It seems like the OS X default is compiled with Clang while brew's version is with GCC. I've been working through this book [1] and found this thread [2]. I really want to make sure I'm using fast/optimized linear algebra libraries - is there an easy way to make sure? I use Python for learning data science/bioinformatics, learning MicroPython for embedded, and general automation stuff - is it possible to have one environment that performs well for all of these?

[1] https://www.amazon.com/Python-Data-Analysis-Wrangling-IPython/dp/1449319793

[2] https://www.reddit.com/r/Python/comments/46r8u0/numpylinalgsolve_is_6x_faster_on_my_mac_than_on/
======
orf
Not a fan of Conda, the cli is terrible, their unofficial builds are opaque
and in my experience the very few packages that are not available on pypi are
rarely worth installing (usually the worst kind of "research code"[1]).

Simply using pipenv and pyenv is enough for me (brew install pipenv pyenv).
You don't ever have to think about this, or worry you're doing it wrong.

Every project has an isolated environment in any python version you want,
installed on demand. You get a lockfile for reproducibility (but this can be
skipped) and the scripts section of the pipfile is very useful for repetitive
commands. It's super simple to configure the environment to become active when
you move into a project directory.
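
For anyone unfamiliar, that workflow looks roughly like this (the project layout and script name below are made up, not from the parent):

    # one-time setup
    brew install pyenv pipenv

    # per project: create an env on a specific interpreter, installed on demand via pyenv
    pipenv --python 3.8
    pipenv install requests          # writes Pipfile + Pipfile.lock
    pipenv install --dev pytest

    # with a [scripts] entry in the Pipfile, e.g. test = "pytest -x"
    pipenv run test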

It's not all roses, pipenv has some downsides which I hope the new release
will fix.

When I need a temporary environment to mess around in, then I use virtualfish
by running `vf tmp`:
[https://virtualfish.readthedocs.io/en/latest/](https://virtualfish.readthedocs.io/en/latest/)

1\.
[https://github.com/menpo/menpo/blob/master/setup.py#L17](https://github.com/menpo/menpo/blob/master/setup.py#L17)

~~~
gchamonlive
+1 for pipenv

I develop using pipenv and then deploy installing the dependencies directly in
the folder. It works for AWS lambda and containers, although lambda layers are
a little more tricky to release:

`pipenv lock -r > requirements.txt`

`pipenv run pip install -r requirements.txt -t .`
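
For completeness, the packaging step that usually follows looks something like this (function name is a placeholder, and this assumes the plain AWS CLI rather than SAM):

    zip -r function.zip .                        # bundle code + vendored deps
    aws lambda update-function-code \
        --function-name my-function \
        --zip-file fileb://function.zip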

~~~
mcintyre1994
You might find AWS SAM CLI worth looking at if you're finding that awkward:
[https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html)

I used to have a Makefile to do the install dependencies in directory, test
locally, zip the directory up, deploy to AWS jobs - but SAM is a really good
replacement for that workflow. It uses docker containers too, and the other
big thing it makes trivial if you need it is running an API gateway locally so
you can test the Lambda as an HTTP API instead of just hacking events in as
standard input/function inputs.

------
cuchoi
This has worked for me:

\- brew python for my global environment.

\- Create a virtual environment for each project (python -m venv).

\- Use pip install after activating it (source venv/bin/activate)

If you need to work with different versions of python replace brew python with
pyenv.
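
A minimal sketch of that setup (the directory name is arbitrary):

    brew install python          # global interpreter
    cd myproject
    python3 -m venv venv         # per-project environment
    source venv/bin/activate
    pip install numpy pandas
    deactivate                   # when done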

~~~
patrickserrano
This is my setup too and if you're using VS Code, their virtual env support
makes it pretty seamless[1].

[1][https://code.visualstudio.com/docs/python/environments](https://code.visualstudio.com/docs/python/environments)

~~~
alaithea
I was surprised at how well it worked when I tried it again recently after a
hiatus. Now I just wish they would get Docker-based environments working.

~~~
punjabisingh
VSCode already has it as a feature but it's a little rough around the edges.
It's called "Remote Containers".

Docs:
[https://code.visualstudio.com/docs/remote/containers](https://code.visualstudio.com/docs/remote/containers)

------
wenc
I just use Anaconda. It's basically the final word in monolithic Python
distributions.

For data science/numerical computation, all batteries are included. It also has
fast optimized linear algebra (MKL) plus extras like dask (parallelization) and
numba out of the box. No fuss, no muss. No need to fiddle with anything.

Everything else is a "pip install" or "conda install" away. Virtual envs? Has
it. Run different Python versions on the same machine via conda environments?
Has it. Web dev with Django etc.? All there. Need to containerize? miniconda.

The only downside? It's quite big and takes a while to install. But it's a one
time cost.
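
A quick sketch of the basics (env name and packages are just examples):

    conda create -n sci python=3.8 numpy pandas
    conda activate sci
    conda install dask numba                        # or: pip install <package>
    python -c "import numpy; numpy.show_config()"   # shows which BLAS/MKL numpy is linked against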

~~~
blendo
I chose Anaconda for this use case:

1 Setup a production Python environment on Windows Server 2016

2 Write a README on how to set up a prod Python env on Windows Server 2016
(pretty short)

3 Tell my Win 10 coworkers to refer to the README to setup their own Python
env

4 Minimize the number of "conda install" commands needed to setup a production
Python env, which in our case are mostly the Oracle/cx_oracle and MS SQL
Server dependencies.

------
amanzi
After using many different Python tools (Poetry, Pipenv, virtualenv, etc) I've
found that the most reliable method is to just use the standard `venv` command
to create virtual environments. I have the following `cvenv` ('create venv')
alias set up in my .bashrc to create new virtual environments:

    
    
        alias cvenv='python -m venv .venv && source .venv/bin/activate && pip install --upgrade pip setuptools > /dev/null'
    

Also, as others have mentioned, if you're not in a virtual environment, only
ever install pip packages to your user location, i.e. `pip install --user
--upgrade package_name`

But the first thing to do before any of that is to make sure you know which
python binaries you're running:

    
    
        which python
        which python3
        which python3.7
        etc...
    

You can use the following command to set your default Python: `sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.7 9`

(that sets python3.7 as the default when you just type `python`)

------
potta_coffee
I've been using virtualenv + virtualenvwrapper for years. I hear complaints
but I'm used to it now and it's pretty nice. Usually create a new environment
for each project and pip install my packages into it.
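
For reference, the day-to-day virtualenvwrapper commands (project name is just an example):

    mkvirtualenv myproject           # create and activate
    pip install -r requirements.txt
    workon myproject                 # switch back to it later
    deactivate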

~~~
whalesalad
Yep. Python 3 installed via brew. Then every project has its own virtual
environment. You can specify the path to the python binary when creating a new
env so you can specify whether you want it to use Python 3 or Python 2.

It’s simple. It’s easy to debug. I genuinely don’t understand the point of all
this other nonsense like anaconda.

Unfortunately when it comes to data-science-Python it seems like the patients
are running the asylum. The APIs and developer ergonomics of tools like
tensorflow, pandas etc... are so bizarre to people who are running production,
vanilla Python.

~~~
jonnycomputer
pandas keeps turning me off, even if I feel like I ought to like it; I feel
like I never really know what is going on, and the syntax feels like a
domain-specific language tacked on top of python, and it just doesn't sit well (my
opinion!). Besides... some of the functions have an insane number of
parameters, e.g.

pandas.read_table(filepath_or_buffer: Union[str, pathlib.Path, IO[~AnyStr]],
sep='\t', delimiter=None, header='infer', names=None, index_col=None,
usecols=None, squeeze=False, prefix=None, mangle_dupe_cols=True, dtype=None,
engine=None, converters=None, true_values=None, false_values=None,
skipinitialspace=False, skiprows=None, skipfooter=0, nrows=None,
na_values=None, keep_default_na=True, na_filter=True, verbose=False,
skip_blank_lines=True, parse_dates=False, infer_datetime_format=False,
keep_date_col=False, date_parser=None, dayfirst=False, cache_dates=True,
iterator=False, chunksize=None, compression='infer', thousands=None,
decimal=b'.', lineterminator=None, quotechar='"', quoting=0, doublequote=True,
escapechar=None, comment=None, encoding=None, dialect=None,
error_bad_lines=True, warn_bad_lines=True, delim_whitespace=False,
low_memory=True, memory_map=False, float_precision=None)

I use R happily enough, and MATLAB too for that matter. Maybe, some-day, I'll
see the pandas-light.

~~~
prepend
Why do you not like so many parameters? They are optional, so typically a
pandas call only has one or two required parameters.

~~~
jonnycomputer
A callable requiring many parameters is often an indication that the code is
doing too much, perhaps because coder after coder tweaked it to handle their
particular use-case.

I am not a big fan of (the over-use of) optional arguments either: I prefer
code I'm reading to be more explicit, which is more pythonic imo.

~~~
cometsong
I agree on _over-use of_ optionals.

I disagree that the code is therefore doing too much, however.

At least in this specific case (and some others), the method shown there is a
wrapper for one major task: converting data from one format to another.

These optional, well-documented arguments show the defaults used, and are
modifiable in, as you said, specific use-cases.

------
gchamonlive
If on Linux, NEVER touch the system Python! If you can, you should use the
package manager of your distro to install the desired python version. If it's not
available, or you don't trust a third-party PPA, compile from source and
install using make altinstall:

[https://docs.python.org/3/using/unix.html#building-python](https://docs.python.org/3/using/unix.html#building-python)

Next, install pip for your user!

`curl [https://bootstrap.pypa.io/get-pip.py](https://bootstrap.pypa.io/get-pip.py) -o get-pip.py`

`python get-pip.py --user`

Now you should have `python` which is the system one and `python3.x` for
alternate versions. Installing modules from pip can also be harmful, so always
use `pip install --user` for tools written in python (like aws cli). Those
tools are installed in ~/.local/bin, so make sure to add that to your PATH.

Next, you can use virtual environments to prevent your .local/bin from
becoming cluttered if you are dealing with many dependencies from different
projects. A nice virtual environment management tool helps a lot. Take your
pick: virtualenv + wrapper, pyenv, pipenv, conda... whichever you choose,
stick to virtual environments for development!

In a nutshell

1\. never touch your system python!

2\. install tools from PyPI using pip's --user flag

3\. install different python versions using make altinstall, if building from
source

4\. virtual environments for development

This should be enough to keep your Linux installation healthy while messing
around with python.

Also, docker containers could simplify this whole process, although this is
another tech you are adding to your stack
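
A rough sketch of steps 1-3 above (the version number is only an example):

    # build an alternate interpreter without touching the system python
    curl -O https://www.python.org/ftp/python/3.8.1/Python-3.8.1.tgz
    tar xzf Python-3.8.1.tgz && cd Python-3.8.1
    ./configure --enable-optimizations
    make && sudo make altinstall      # installs python3.8, leaves `python` alone

    # user-level tool installs stay out of the system site-packages
    python3.8 -m pip install --user awscli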

~~~
Spivak
I wish Linux distros would take the system python out of the default path, put
it in /usr/libexec, and patch python dependent packages to use that path
manually for exactly that reason.

There desperately needs to be a separation between the Python that's intended
to support the OS and Python that's intended to be used by users. These should
have never been mixed and there's so much headache caused by it.

------
goerz
Either use Anaconda (if you’re using a lot of science libraries, or you also
want to install non-python stuff like git, pandoc, or nodejs), or pyenv, which
will install and manage multiple versions of Python for you in parallel. In
fact pyenv can also install Anaconda, so that’s another possibility. If you’re
just running Jupyter notebooks for doing some science or data analysis, I’d go
with Anaconda. If you do a lot of library development and testing, pyenv is
probably better.
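
For example, installing an Anaconda build through pyenv looks roughly like this (the version string is just an example; `pyenv install --list` shows what's actually available):

    pyenv install --list | grep anaconda
    pyenv install anaconda3-2019.10
    pyenv global anaconda3-2019.10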

Don’t use the system Python, brew-installed Python, or the official Python
installer, if you can avoid it (you’ll want your Python to be up to date and
self-contained in your home folder as much as possible).

------
erichurkman
I use direnv for most of my development that doesn't need full blown
containers. Once you hook it up, it automatically enables/disables virtual
environments based on directory.

[https://github.com/direnv/direnv/wiki/Python#pyenv](https://github.com/direnv/direnv/wiki/Python#pyenv)
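
A minimal `.envrc` for the Python case, using direnv's built-in helper (the pyenv-aware variants on that wiki page work similarly):

    # .envrc - creates and activates a virtualenv under .direnv/ when you cd in
    layout python3

Then run `direnv allow` once in the directory to approve it.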

~~~
zaro
yep, this is the most convenient and seamless way to use python virtual
environments. I also use it to specify node or java version per directory.

------
eropple
These days I use `asdf` for everything--think of it as a pyenv/rbenv/nvm for
Python, Ruby, Java, NodeJS, and a ton of other stuff. Through it I create
virtualenvs, each targeting a particular version of Python.

[https://github.com/asdf-vm/asdf](https://github.com/asdf-vm/asdf)
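
Roughly (the version is an example; older asdf releases spell the first command `asdf plugin-add`):

    asdf plugin add python
    asdf install python 3.7.4
    asdf local python 3.7.4      # writes .tool-versions for this project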

~~~
Rotareti
I second this. asdf is awesome! It works for all major languages.

In case of Python, I use asdf to install the needed interpreter version and I
use pipenv to create/handle the virtual environments. It's a perfect combo.

------
w8rbt
This is my biggest issue with Python. I absolutely love the language, but
Python 3 ought to be Python 3 no matter where/how you got it. If you are using
3.5 and I'm using 3.7 we ought to be able to share code and things just work.
Unfortunately, that is not the case.

~~~
jnwatson
Why would that not work?

~~~
torte
There are some features added in 3.7 which were not in the language in 3.5,
meaning code written for the newer Python 3 version might not work with the
older Python 3 version. Most of those additions are relatively easy to fix
though.

Code from an older Python 3 version should always work in the newer Python 3
version though (at least for 3.5 -> 3.7).

~~~
mdtusz
You'd think, but it's not always the case. I've recently encountered an issue
with eventlet that is only present in 3.7, for example.

~~~
snorkasaurus
I agree. Try pyenv. It lets you install a particular version for use in a
shell account, then `python -m venv venv`. I like pip-tools for requirements
management, but some reasonable people disagree.
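
The pip-tools part of that, roughly (requirements.in is the conventional input file name):

    python -m venv venv && source venv/bin/activate
    pip install pip-tools
    pip-compile requirements.in      # pins everything into requirements.txt
    pip-sync requirements.txt        # makes the venv match the pinned set exactly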

------
prepend
I use Conda for all environment management.

I recently had to reinstall a windows dev environment. I didn’t even install
standalone python, just conda.

For each new project I create a new conda env, activate it, and then install
stuff with pip. I use pipreqs to create requirements files for the environment
before checking in to git, to help others who may want to check out and install.

I just prefer the cli and syntax of conda over the other environment managers.
I use pip to install packages over conda as it has more packages and I don’t
want to remember the channel names that may have the package I want.
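
A sketch of that per-project flow (names are examples, not from the parent):

    conda create -n myproject python=3.8
    conda activate myproject
    pip install requests pandas
    pipreqs . --force        # writes requirements.txt from the actual imports, before committing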

------
thiht
As someone who writes the occasional Python script or service I was expecting
to find a definitive answer in this thread, but God it's such a mess... Venv,
virtualenvwrapper, pew, pyenv, poetry, pipenv, anaconda, conda... How are you
expected to choose?

~~~
m4r35n357
Easy - virtualenvwrapper!

All the talk about anaconda etc is for mad data scientists, not us normal folk
;)

------
colatkinson
Depends on the environment a great deal.

Linux: Typically using the package manager, or the AUR if I'm not using the
latest version. At the moment, I'm using 3.6 as my "universal" version since
that's what ships with current Ubuntu.

Windows: Good ol' fashioned executable installers, though there's really no
reason I don't use chocolatey other than sheer force of habit.

macOS: I don't do enough work on macOS to have a strong opinion, though
generally I just use the python.org installer. I don't think there's that much
of a difference from the Homebrew version, but I could be wrong on that front.
As an aside: IIRC, the "default" python shipped with macOS is still 2.X, and
they're going to remove it outright in 10.15. So I wouldn't rely on that too
heavily.

As for other tooling, IME pipenv and poetry make dealing with multiple
versions of Python installed side-by-side much easier. I have a slight
preference for poetry for a variety of reasons, but both projects are worth
checking out.

Finally, at the end of the day, the suggestions in here to "just use Docker"
aren't unreasonable. Performance differences between numpy on e.g. 3.6 vs 3.7, or
clang vs GCC, likely aren't that significant, but if you create a Docker environment
that you can also use for deployment you can be sure you're using packages
compiled/linked against the same things.
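
For example, developing against the same image you would deploy with (image tag and file names are just examples):

    docker run --rm -it -v "$PWD":/app -w /app python:3.7-slim bash
    # inside the container:
    pip install -r requirements.txt && python main.py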

If all of this sounds like an unreasonable amount of effort for your
purposes... probably just use Anaconda. It's got a good reputation for data
science applications for a good reason, namely removing much of this insanity
from the list of things you have to worry about.

------
abathur
* I haven't looked into it myself, so I can't really address your specific question about using fast/optimized libraries such as BLAS.

This doesn't (seem to) distinguish between your system python (i.e., the ones that'll
run in a default execution environment if you run "python" or "python3") and
project-specific pythons.

For the latter, I guess it's _common_ to use something like virtualenv or a
container/VM to isolate this python instance from your system Python.

Personally, I use Nix (the package manager and corresponding packaging
language) for declaring both my python system and project environments.

Nix still takes some work to ease into, but I think your question suggests
enough understanding/need/drive to push through this. Costrouc
([https://discourse.nixos.org/u/costrouc](https://discourse.nixos.org/u/costrouc))
may be a good person to ask; I can recall him making a number of posts about
data science on Nix. I also noticed this one, which seems related to what
you're looking into: [https://discourse.nixos.org/t/benchmarking-lapack-blas-libraries-and-compilers-with-nix/1176/3](https://discourse.nixos.org/t/benchmarking-lapack-blas-libraries-and-compilers-with-nix/1176/3)

------
jchw
I am using NixOS, and on NixOS you can use Nix expressions to create an
environment with whatever packages at whatever version you need. It isn’t
perfect, but with some effort and tooling it is a reliable, flexible way of
managing an environment, and is especially nice in that it manages the exact
versions of C libraries as well. Nix can be used on any Linux or macOS system.

(FWIW: there are tools to create Nix expressions from requirements files, so
it isn’t necessarily all manual work.)
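
An ad-hoc example of that (package names from nixpkgs; the pinned-expression approach described above is the more reproducible version of this):

    nix-shell -p "python3.withPackages (ps: with ps; [ numpy pandas ])"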

~~~
ben509
Have you seen any way to use Nix to test with tox?

------
currymj
I use conda because it’s the default way to install PyTorch. That’s pretty
much it. Pretty sure any NumPy install should use reasonable BLAS nowadays but
I believe conda includes Intel MKL which maybe makes a difference.

Every couple months I nuke all my environments and my install and start fresh
out of general frustration. It’s still the least bad for a scientific workflow
IMO (having different environments but all globally accessible rather than
stuck in specific directories is nice).

------
ltbarcly3
If you `pip install numpy` on most environments it will download an optimized
wheel. Anaconda also has an optimized numpy automatically.

I wouldn't use the python shipped with osx because it tends to be out of date
rather quickly, and it doesn't ship with libreadline.

Usually the python shipped with Ubuntu is good if it has the version you want.

If you compile it yourself from source you are in for a bad time unless you
know how to install all the dependencies. Make sure you turn on optimizations
when running ./configure or it will be significantly slower.

Once you have python installed, install pip. Try the following until one works
(unless you use anaconda, then you won't use pip most of the time, and I think
you can just `conda install pip` if it's not there by default):

$ sudo pip install -U pip (for python3: sudo pip3 install -U pip)

$ sudo easy_install pip (easy_install-3 or something like that for python3)

(If those don't work, you need to get pip from somewhere else, most likely by
installing python-pip via apt, or whatever the equivalent package is in brew or
whatever package manager you are using.)

When you want to install something globally, do NOT do `sudo pip install ...`,
this can break things in a way that is hard to repair, do `pip install --user
...`, and make sure the path that --user installs are installed in is added to
your $PATH. I think it's always ~/.local/bin/

If you are working on a project, always use a virtualenv, google for more
details on that.
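
In short (package name is arbitrary):

    # instead of: sudo pip install requests
    pip install --user requests
    # make sure user-level scripts are on PATH (the usual location, though it can vary)
    export PATH="$HOME/.local/bin:$PATH"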

~~~
xapata
Pip doesn't always give the most optimal build. But if you care about that,
you wouldn't be asking this question :-)

~~~
ltbarcly3
Pip gives you whatever the default build is, but for numpy they build
optimized builds into wheels for almost every platform.
[https://pypi.org/project/numpy/#files](https://pypi.org/project/numpy/#files)
For anything else, you have to get into the details of how it is built and
what the options are which is going to depend on the specific package.
Building numpy yourself is not beginner friendly, you have to know a ton of
details about linear algebra packages.

~~~
xapata
Conda is helpful for the odd cases where one C dependency conflicts with
another and you need to be careful specifying which build you want.

------
rstuart4133
For me it depends on what I'm doing.

\- If building a one-off system it really doesn't matter, so it's whatever
happens to be easiest on the system I'm deploying it on.

\- If I'm adding software to a production system, especially one that will be
exposed to the outside world, then I use the distro libraries - no ifs or
buts. The reason is straightforward: I _must_ get automatic security updates
to all 3rd party software I install. I do not have the time to go through the
tens of thousands of CVEs released each year (that's 100 a day), then figure
out what library each applies to, then track when they release an update.

\- The most constrained is when I am developing software I expect others I
have never met will be using. E.g. stuff I release as open source. Then I
target a range of libraries and interpreters, from those found on Debian
stable (or even old-stable occasionally) to stuff installed by PIP. The reason
is I expect people who are deploying to production systems (like me) will have
little control over their environment: they must use whatever came with their
distro.

------
boston_sre
pyenv is a tool to manage python environments. It lets you install and set up
whatever python version you want with a simple command. You can then switch
the active python version globally or just in the current shell with a simple
command. It also has some virtualenv plugins.

[https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)
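
For example:

    pyenv install 3.7.4          # build and install a version
    pyenv global 3.7.4           # default for your user
    pyenv shell 3.6.9            # just for the current shell
    pyenv local 3.7.4            # just for the current project (writes .python-version)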

------
n8henrie
No windows, but on Linux and MacOS, I use the package manager or brew to
install a system Python3, from which I derive venvs for basically all
installations of new tools. I use one of the install flags to include the
Python version in use in the prompt.

In addition, I use pyenv for development. I always install the newest
Python version with e.g. `CONFIGURE_OPTS="--enable-optimizations" pyenv install
3.7.4`, which IME consistently produces a ~10% faster Python compared to the
various package managers. This makes it really simple to include in each
project a `.python-version` file that tox will happily use to facilitate
testing on multiple versions. As above, each project gets its own venv from
the pyenv-based Python (I usually do the development using the most recent
stable, but if you wanted to use a different version for the venv and
development and just test against the most recent stable with tox that should
also work).
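
The multi-version part looks roughly like this (versions are examples; tox then finds each interpreter through the pyenv shims):

    pyenv install 3.8.1 && pyenv install 3.7.6
    pyenv local 3.8.1 3.7.6      # writes both into .python-version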

------
lervag
I've used pew [0] a couple of times, which feels like it works quite well. It
is a nice alternative to virtualenvwrapper.

However, my personal struggle with Python projects is when I work on projects
that I also use myself on a daily basis. I have a tool that I wrote for
automating some administrative stuff at work, and every so often I make some
minor fixes and updates. So, I both use it and develop it on the same machine.
I've never figured out how to properly do this with a virtualenv. I.e., I want
the tool, say "mypytool", to be available and run the most recent version in
any shell. At the same time, I would prefer not to install the requirements
globally (or for that matter, at the user level). I would love to hear some
suggestions on how to solve this use case.

[0]: [https://github.com/berdario/pew](https://github.com/berdario/pew)

------
photonios
This works for me. But I am crazy.

\- Install Python through Brew.

\- Upgrade pip using pip. Install all further libraries using pip. Never
install Python packages through brew.

\- Use a plain virtualenv per project/repo.

A long time ago I wrote some bash that I keep in my `.bashrc` that overrides
the `cd` command. When entering a directory, my little bash script iterates
upwards through the parent directories and activates any virtualenv it finds. I also
override the `virtualenv` command to automatically activate the env after
creating it.
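
A minimal sketch of that idea (not the actual script; it assumes the env directory is literally named `venv`):

    cd() {
        builtin cd "$@" || return
        local dir="$PWD"
        # walk up from the new directory looking for a virtualenv to activate
        while [ "$dir" != "/" ]; do
            if [ -f "$dir/venv/bin/activate" ]; then
                source "$dir/venv/bin/activate"
                return
            fi
            dir="$(dirname "$dir")"
        done
    }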

I am aware there are existing tools out there that do this. Pyenv does this
right? But I never got used to them and this keeps things simple enough for
me. I cannot forget to enter a virtualenv.

As I said, I am probably nuts. I also don't use any form of IDE. Just plain
VIM.

~~~
skiman10
Doing things differently doesn't make you nuts.

Do you mind sharing your dotfiles? This is really interesting to me.

~~~
photonios
Sure! I have a Gist here with this little snippet:

[https://gist.github.com/Photonios/4a17d6c952d85cc23234b4394b...](https://gist.github.com/Photonios/4a17d6c952d85cc23234b4394b9becfc)

Hope it proves useful :)

------
Seirdy
To add to all these comments:

For the specific use-case of installing executable Python scripts, pipx [0] is
the way to go. It creates a virtual environment for each package, and lets you
drop a symlink to the executable in your `$PATH`. I've installed 41 packages
[1] this way.

[0]:
[https://pipxproject.github.io/pipx/](https://pipxproject.github.io/pipx/)

[1]:
[https://gitlab.com/Seirdy/dotfiles/blob/master/Executables/shell-scripts/bootstrap/python.sh](https://gitlab.com/Seirdy/dotfiles/blob/master/Executables/shell-scripts/bootstrap/python.sh)
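
Typical usage (the package name is just an example):

    pip install --user pipx
    pipx install httpie        # each tool gets its own venv, with the executable exposed on $PATH
    pipx list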

------
sifer
Note that my comment is in reference to Python3, since Python2 will not be
maintained past 2020.

If you’re on a Mac, just use the brew installation (brew install). If you’re
on some type of prod/containerized setup, use apt’s python (apt-get install).

I would not recommend building Python from source unless you _really_ know
what you’re doing on that level, as you can unintentionally shoot yourself in
the foot quite a bit. From there just using a virtualenv should be pretty
straightforward.

In this way, you’re letting the package managers (written by much smarter
people than you and I) do the heavy lifting.

------
DeadBabyOrgasm
I work on OSX and Ubuntu (different workstations, same setup scripts) and
python setup has been frustrating for me. If brew updates the version of
python, suddenly all of my virtualenvs seem to break.

What I've settled on is to use a python version manager (pyenv is the least
intrusive balanced with most usable) and using direnv to create
project-specific python environments. Adding `use python 3.7.3` to an `.envrc` in a
directory will make it so cd-ing into it will create a virtualenv if it
doesn't yet exist and use the pyenv-installed python at 3.7.3.

------
richk449
Can someone explain the benefit of using a separate env for each project? Why
not just use the default env for all your projects? Does it have something to
do with the ops part of devops?

~~~
amanzi
Nothing to do with DevOps, more to do with keeping your different projects
separate with their own dependencies. That way, if you need to move the
package to another machine, you can just do a `pip freeze` to document which
packages you need, then `pip install` them on the new machine.
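
Concretely (run inside the project's activated venv):

    pip freeze > requirements.txt        # on the old machine
    pip install -r requirements.txt      # on the new machine, in a fresh venv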

If you just want a sandbox area where you'll be developing lots of little
scripts and bits and pieces, there's nothing stopping you from having a single
environment that you keep adding to as you go. You just miss out on the
portability aspects mentioned above and potentially risk getting conflicts
over time as the number of packages you install grows and grows.

------
tcbasche
\- I use the brew python. Mostly for web dev, other times for pandas/numpy

\- pip + requirements.txt I find is more than acceptable

\- wheels > source when it comes to installing / distributing packages

------
Nextgrid
I install the latest Python 3, PyEnv & PyEnv-Virtualenv through Homebrew (the
latter 2 require a Python to run, so the Homebrew-installed Python 3 is used
only for those), then use PyEnv to manage additional Python versions &
virtualenvs.

I was a huge fan of pipenv in the beginning but it seems to have stagnated as
2 years later it's still slow as hell. I now deal with pip & pip-compile to
pin versions.

------
rcarmo
I just use pyenv everywhere (Mac, Linux, Windows/WSL) and install Anaconda
through that. Simplest and most uniform setup possible.

------
varusr
I have created the following function in ~/.bash_functions.

I use it as `pvenv project_dir_name`:

    function pvenv() {
        mkdir "$1"
        cd "$1" || return
        python -m venv .
        source ./bin/activate
        pip install --upgrade pip setuptools
    }

------
jonnycomputer
I work on Mac and on Linux. I used to use macports for python (sorry, I'm
opinionated, I won't touch brew), along with either venv or virtualenv. But
I've found that working within the Anaconda system for python has been the
best experience overall. I've found conda environments to be a pretty clean
solution.

------
Saphyel
Docker, your dockerfile has all you need and very specific version... why
would you waste time on other stuff?

------
liyanage
I install the macOS distribution from python.org/downloads/. I use that for
general Python 3 projects and it's been working great.

Separately I also use anaconda if I need all the scientific packages + Jupyter
etc.

------
TYPE_FASTER
    brew install python
    pip install pyenv
    pyenv install <version>
    pip install virtualenv-wrapper
    mkvirtualenv --python=/path/to/python/bin foo

Repeat the last step for each project.

------
bitcurious
I previously used conda, now switching away from it as I noticed pretty bad
dependency decay. Super frustrating when it can’t rebuild an environment from
just two years ago.

------
georgehdd
1\. pyenv - for managing multiple python versions

2\. pipenv (with PIPENV_VENV_IN_PROJECT env var set) for managing project
dependencies / venv

That's basically about it.

------
mekane8
Not an exact answer to your question, but another data point: I use PyCharm.
It handles everything pretty seamlessly. I really like it.

------
vs4vijay
Have you tried Poetry?
[https://poetry.eustace.io/](https://poetry.eustace.io/)

------
vkaku
Here's my simple Python list:

1\. Pick Python binary / Official Docker Image

2\. Use virtualenv for Developer Boxes

3\. Well defined Pipfile / requirements.txt

4\. Avoid most binary package building. Prefer those with readymade wheels.
But if it has to be built, I prefer to make static libraries, use them to make
static ELFs / modules, as opposed to /path/to/some.so, and bundle them / ship
them as wheels for both development / production.

Again, you can set up simple Dockerfiles / scripts to do this for you.

~~~
PascLeRasc
Thanks, so what you're saying is that you can cook up the "proper" environment
from any binary source, right? Do you have or know of an example for doing
this? Do you do it uniquely for every project?

~~~
vkaku
I do this. I do it uniquely for every project.

These reasons:

\- I can't use one version of a binary distro for production, because fixes
keep coming.

\- Every company / person I work with uses a different *nix distro, so the
initial setup can look different, especially when using bundled pythons.

\- Irrespective of that, the final outcome should be normalized and sane. This
is where virtualenv, tox - these kinds of things help.

\- Source standardization is quite easy with Pipfile, so I pretty much rely on
that mechanism.

\- I have enough experience building libraries (shared and dynamic), and am
aware enough of ELF and PE issues, to know that sometimes custom solutions are
needed. I avoid them as much as possible, but sometimes they are needed. Why?

Maybe you don't have an Intel CPU (so CPUID-based checks break), or maybe you
want to run TF under ROCm and not CUDA, or you're trying to compile a new
shared library for the RPi because nobody builds armv7l wheels unless they use a
Pi, or you need to link against OpenSSL/libxml for that wsgi-based project
and keep runtime/image sizes small in a container.

In these specific cases, more often than not, stay away from anyone who
promises you that distro X would / has always worked for them.

------
lpellis
pipenv works well enough for me, it creates a lock file so you get
deterministic builds. Plus it's supported by Heroku if you ever decide to host
with them. [https://github.com/pypa/pipenv](https://github.com/pypa/pipenv)

------
whitebeatle
pyenv for python versions, and then venvs for each project to isolate its
packages from other projects. But to this day I struggle with polluted
python/python3 and pip/pip3 in my shell thanks to the native mac install, brew,
and pyenv versions.

------
vincentmarle
Obligatory xkcd: [https://xkcd.com/1987/](https://xkcd.com/1987/)

------
smitty1e
At that rate, why not just install the toolchain and compile from source?

------
timkofu
I install Python via PyEnv; then for every project, I use PipEnv.

------
jnwatson
Just use pyenv. Works everywhere.

------
enz
pyenv/virtual environments

------
xapata
Miniconda.

~~~
gojomo
This, to explicitly choose which packages to install per environment.

And, beyond numpy/scipy (where the conda versions might be better optimized),
I often pip-install packages from PyPI into those conda environments.

