
Goodbye Python Virtual Environments - rgacote
https://medium.com/@grassfedcode/goodbye-virtual-environments-b9f8115bc2b6
======
BerislavLopac
I'm curious how this will work with nested packages, i.e. a package inside a
__pypackages__ dir can easily have its own __pypackages__ and so on.

Could that be a solution for avoiding version clashes when specifying
dependencies? Right now, installing all the requirements on a legacy system I
have inherited gives me the following notification:

    package-xyz has requirement six==1.10.0, but you'll have six 1.12.0 which is incompatible.

With __pypackages__ it might be possible to install six 1.10.0 into
package_xyz/__pypackages__, and six 1.12.0 globally...
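
Roughly the layout I'm imagining (assuming the per-version lib directory from
PEP 582; the paths and versions here are hypothetical):

    package_xyz/
        __pypackages__/
            3.7/
                lib/
                    six/        # six 1.10.0, visible only to package_xyz
    /usr/lib/python3.7/site-packages/
        six/                    # six 1.12.0, visible to everything else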

------
davvid
It's unfortunate that the solution for an inadequate toolchain is a brand new
workflow instead of improving the existing tools/workflows.

Python packaging is still a mess in 2019, and Python 2.7 is still alive and
dominant. PyPI download statistics show that Python 2 is still the top cobra
with 60% of the download share. Thus these new workflows will unfortunately
splinter the Python community even further unless they can target the entire
community.

Does this PEP improve support for doing things like creating rpm or deb
packages from virtualenvs? Virtualenvs aren't relocatable, which is one of the
sticking points that make them hard to work with from a packager's perspective, and
it doesn't seem like this proposal improves that situation at all.

I don't see this as "Goodbye Virtualenvs" but rather, "here's a new workflow
to add onto the heap pile," where the end result is even more complicated
because now we have yet another workflow to know about.

If `__pypackages__` goes in the same directory as the script, and those
typically live in bin/, what happens when you have multiple tools in bin/ that
have different needs?

What does a packager want? A dead simple way to create a suite of native OS
packages from a requirements.txt-like specification that install into a custom
filesystem prefix along with a custom package naming prefix. I can then take
this package and install it on any machine and don't need to have internet
access in order to install my dependencies.

I don't want to run virtualenv as an end-user consumer of the software -- I
want to install from a package repository, just like everything else. I don't
want things littered all over random directories without any way to
inspect what happened. Things like "pip install --user" are terrible from my
perspective.

Apt solved this issue decades ago. Reinventing that stuff is a recipe for
half-baked solutions.

What I don't want is a brand new system and new mechanisms that do not address
the fundamental challenges. What are those challenges? C-extensions that need
custom compilation options and all that stuff. The fact that setup.py's
C-extension mechanism is basically a build system suggests that maybe Python
should get out of that game and instead double down on an existing solution
(e.g. cmake) and worry about creating building blocks that are easy to package
using native deployment tools (debian packages, rpms, etc) rather than trying
to invent a new system that solves the low-hanging fruit and ignores the
fundamental issues that got us here in the first place.
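
To be concrete about the setup.py point: even a trivial extension pulls
compiler configuration into the packaging layer. A minimal sketch (the module
names, paths, and flags are made up):

    # setup.py -- the packaging tool doubling as a C build system
    from setuptools import setup, Extension

    setup(
        name="mypkg",
        ext_modules=[
            Extension(
                "mypkg._native",              # import name of the compiled module
                sources=["src/native.c"],     # C sources compiled by setuptools
                include_dirs=["/opt/vendor/include"],               # custom header path
                extra_compile_args=["-O3", "-fvisibility=hidden"],  # custom cc flags
                libraries=["vendorstuff"],    # external library to link against
            )
        ],
    )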

~~~
joshuamorton
Just FYI, using PyPI download statistics is something that no one reputable
does. They're dominated by mirrors and automated tools, not by actual
downloads. [http://py3readiness.org/](http://py3readiness.org/) is what you
should look at; it shows that 359 of the 360 most popular packages support
Python 3, or have a drop-in replacement that does. The one remaining, apache-
beam, is actively working on finalizing Python 3 support (and in practice
already does support Python 3).

Most people in Python-land don't want to package their stuff as rpm or deb
packages because Linux isn't the only environment they run on. When .debs can
be installed on OSX and Windows, that might make sense, but until then, cross-
platform distribution is probably going to be seen as the superior way for
most devs.

>goes in the same directory as the script, and those typically live in bin/

For most python scripts, the thing in `bin/` is a stub that invokes a library
in site-packages or sometimes in a specific venv, depending on the install
method. So I'd expect that no, the script would keep doing what it does, and
__pypackages__ would exist inside of the python package eventually invoked.
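
For example, a pip-generated console_scripts stub looks roughly like this (the
package and function names are made up):

    #!/usr/bin/python3
    # -*- coding: utf-8 -*-
    # bin/mytool -- thin wrapper; the real code lives in site-packages or a venv
    import re
    import sys

    from mytool.cli import main  # hypothetical package providing the entry point

    if __name__ == '__main__':
        # normalize argv[0] so it matches the command name
        sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
        sys.exit(main())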

>What are those challenges? C-extensions that need custom compilation options
and all that stuff.

These are fundamental challenges for only a very small subset of people (even
fewer now that CFFI is a thing).
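
For instance, calling into an existing shared library no longer requires
writing an extension module at all; a minimal CFFI sketch in ABI mode (not a
recommendation, just the flavor of it):

    from cffi import FFI

    ffi = FFI()
    ffi.cdef("int printf(const char *format, ...);")  # declare the C signature
    C = ffi.dlopen(None)                # load the C namespace (POSIX; not Windows)
    arg = ffi.new("char[]", b"world")   # equivalent to: char arg[] = "world";
    C.printf(b"hi there, %s.\n", arg)   # call straight into libc, no compile step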

You're asking Python to be nicer for your specific use case, which appears to
be C++ extension development on Linux, at the expense of everyone who isn't on
Linux and doesn't write extension modules. (And even within that area, I'd
argue that Cython or CLIF atop Bazel or Pants is still superior to handwritten
extension modules atop cmake, so you're asking Python to double down on a
not-the-best solution.)

You're correct that they shouldn't splinter the community, but all of your
suggestions would do just that.

------
raviojha
"Learning curve", "Terminal isolation" and "Cognitive overhead"; well, none of
these are strong reasons, to be honest.

~~~
fourier_mode
Also, the learning curve is not steep at all. Just 4 commands (including
activate and deactivate), roughly as below.
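
Assuming a POSIX shell and the stdlib venv module, the whole round trip is:

    python3 -m venv env                 # 1. create the environment
    source env/bin/activate             # 2. activate it
    pip install -r requirements.txt     # 3. install dependencies into it
    deactivate                          # 4. leave the environment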

