
Python setuptools v50 breaks pip installation on Debian/Ubuntu - mrlatinos
https://github.com/pypa/setuptools/issues/2350
======
echelon
If there is ever a Python 4, the number one goal of the project should be to
make Python completely hermetic, repeatable, and redistributable. And there
should only be one way to do it.

pip, requirements.txt, and venv are madness. Look at how much suffering they
cause! I was dealing with (different) pip issues today myself, and it's not an
infrequent occurrence.

Rust's Cargo is a good reference point.

~~~
guhcampos
In the replies we have many many solutions to this problem:

- Conda
- Pipenv
- Pip itself
- Poetry

Which is quite symptomatic.

~~~
ur-whale
...and none of which work reliably.

~~~
mint2
In fairness, many of these are open source, and at least in the case of pip,
for many years there were literally only two maintainers, although now there
are a couple more. This is what I learned at one of the Python conferences.

It’s odd how open source packages that big companies use can be reliant on so
few people.

Edit: although checking the pip GitHub, there are more people committing, so
maybe I got something wrong, but there is far more activity starting around
2019.

------
uranusjr
For context, the root cause of this is actually in Debian, not setuptools (or
pip). Debian vendor-patches the stdlib distutils, which setuptools used to
depend on; v50 no longer depends on the stdlib distutils, so those patches
stopped taking effect.

This issue was known for quite a while, and IIRC more recent Debian (and thus
Ubuntu) versions contain a fix. But the change was not backported, for other
reasons I am not personally familiar with.

~~~
Iv
While we are at it, can we finally get rid of Python 2? And find some sane way
to not need python3.6, python3.7, and python3.8 lying around, each installing
its own versions of libraries?

~~~
simcop2387
I have done this on my laptop running Debian's next stable, so it is happening
and will be the default soon.

~~~
Iv
Finally, some good news!

------
bregma
Pip is already seriously broken, it's just that the breakage has hit one of
the main targets for the project.

Part of my day job is maintaining Python for an embedded OS. For self-evident
reasons, software for the embedded OS needs to be cross compiled, and Python
makes it impossible to cross-compile pip. Explicitly. Upstream's reaction is
"you can't cross compile pip, you're on your own." To make matters worse, pip
doesn't support non-native Python once it is built.

Of course, this is just the tip of the Python portability nightmare. Python is
great and completely portable as long as your target is Linux running on
x86_64 or aarch64. It has second-class support for macOS and Windows, and
there are private port projects for a few other systems, but they generally
require heavy patching. For example, the Python code violates POSIX standards
in favour of assuming that everything is Linux (hey guys, time_t is not a
signed integer that can be used for arbitrary arithmetic!). Executing a whole
lot of system code between fork() and exec() is always broken, although it
appears to mostly work on Linux as long as you disable certain kernel
features, and guess which code base relies on that?

Yeah. Pip is broken and now it's showing up on Debian too. Maybe it'll just
get better by itself?

~~~
sfoley
Pip is a pure python library, you don’t compile it.

------
Waterluvian
It broke docker-compose today which messed up some robot deployment stuff for
an hour.

Lesson learned by that team: find a way to hard freeze the entire dependency
tree or don’t use a tool.
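
One way to sketch that hard freeze without reaching for pip itself: emit an
exact pin for every installed distribution, similar in spirit to `pip
freeze`, using only the standard library (assumes Python 3.8+ for
importlib.metadata):

```python
# Record the exact version of every installed distribution as name==version
# pins, which can later be fed back to an installer to reproduce the tree.
from importlib import metadata

pins = sorted(
    f"{dist.metadata['Name']}=={dist.version}"
    for dist in metadata.distributions()
)
print("\n".join(pins))
```

In practice teams tend to commit the resulting lockfile and install from it
with resolution disabled, so a new upstream release can never sneak in.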

~~~
peterwwillis
Immutable Infrastructure as Code. Docker containers are the modern gold
standard. You can get fairly far on a stock OS with a combination of vendored
binaries/libraries and virtualenv/pip, but containers are much simpler.

~~~
Waterluvian
I dunno if your plan is containers all the way down but for us it was
literally docker-compose that was broken.

More important than immutable containers is repeatable builds. I need to be
able to go back to a version in 12 months and build the exact same output with
a few changes. Containers don’t help with that.

~~~
zelphirkalt
I am quite fed up with people thinking: "Oh, I'll use docker compose, it'll be
simpler.", adding another layer of abstraction on top of something that does
not need another layer, and then having issues, but not documenting their
precise docker run commands, so that I need to reverse engineer them from the
docker compose YAML files.

------
Joker_vD
I remember there was a time when pip3 used to break apt... or was it the other
way around? Or maybe it was pipenv? Whatever it was, it was pretty baffling
to see Python's stack trace of an ImportError when you tried to run "apt
whatever".

~~~
dheera
And then there was pip2, and now "pip" on my system actually runs pip3. It's a
mess.

~~~
josteink
While I'll be first in line to criticize the overall Python packaging-story,
you also have to consider how Python itself is being used to implement the
underlying platform which the OS's package-manager depends on.

If you allow pip to make changes to things already present on your system,
it may break expectations other parts of your OS have.

And especially given Python's messy package-management story, combined with
the less-than-ideal Python2 to 3 compatibility-story, transitioning a full OS
and platform from Python 2 to Python 3 without breaking stuff is no small
task.

I'm amazed it even works, and that's before you allow users to mess around
with packages using pip.

------
fractalb
I think the maintainers thought this was only a Debian/Ubuntu problem, but in
reality it affects a lot of other systems. Fedora and Conda distributions seem
to have been affected too.

------
laingc
The first thing I do (or rather, ansible does) on any new PC or server is
install Miniconda to a location inside my home folder, and set it as the first
item in my PATH.

I’ve been down most of the python installation and packaging rabbit holes, and
this practice has served me well.

~~~
ur-whale
The fact that anaconda actually has to hack your path variables and your
bashrc doesn't strike you as a recipe for disaster?

~~~
Redoubts
Yeah it always did squick me out that it needed to do that, while also having
a special binary manage all the commands. Explicitly, why is 'conda' in my
path, but then I need to run 'conda init $shell' to make it really work. It
should be handling that internally all by itself, without any help...

~~~
laingc
I don’t use this method. I set the path myself, and it works without anything
else.

~~~
Redoubts
Yeah, you can do that, but it rubs me the wrong way that that’s not
recommended and the tool will complain more often than not (even if it works)

------
rjeli
IMO, either use system Python and system Python packages (`apt install
python-opencv`), or if you need pip packages that aren’t Debian-packaged, use
a pyenv Python, which takes care of all the pip prefixes. Mixing the two has
never gone well for me.

~~~
ohazi
That's not always possible. If you install a package from the system
repository that depends on a bunch of python packages, the dependency manager
will only be satisfied if they're installed via the repository, since it can't
see your pip packages.

So _you_ might need the latest python-opencv from pypi for work, and some
desktop camera app needs the one from apt to not have broken dependencies.

~~~
viraptor
You're not going to run into this issue though. Apt-installed python packages
will see each other and pip-installed venv packages will live in a separate
area. They don't conflict.
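
A quick sketch of why they don't conflict, assuming a standard venv
(sys.base_prefix has been around since Python 3.3):

```python
import sys
import sysconfig

# Inside a venv, sys.prefix points at the venv itself, while sys.base_prefix
# still points at the interpreter the venv was created from. pip installs
# under the venv's own site-packages, so it never touches the apt-managed
# tree.
in_venv = sys.prefix != sys.base_prefix
print("inside a virtualenv:", in_venv)
print("packages install to:", sysconfig.get_path("purelib"))
```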

------
jnxx
From the github issue:

> In short, packaged Debian/ubuntu Python installs things into dist-packages.
> site-packages is left alone in-case you like to build your own upstream
> python and install it. Debuntu overrides the distutils it ships to redirect
> installs to dist-packages. It gets blurry when you install pip from
> upstream. But it would call setuptools, which would use the patched
> distutils so things worked.

> Now setuptools vendors distutils and always installs things into site-
> packages, which isn't on the packaged python interpreter path, so it can't
> find modules.

Now, who thinks that vendoring distutils is a good idea? And installing into
the base system?
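
The split described in the quote is visible straight from the interpreter;
a quick way to see which scheme your Python uses (dist-packages is the
Debian/Ubuntu convention, an unpatched upstream build only has
site-packages):

```python
import sys

# Debian/Ubuntu patch their interpreter so apt-managed modules live under
# .../dist-packages; stock CPython only searches .../site-packages. Listing
# sys.path shows which scheme this interpreter follows.
debian_style = [p for p in sys.path if "dist-packages" in p]
upstream_style = [p for p in sys.path if "site-packages" in p]
print("dist-packages entries:", debian_style)
print("site-packages entries:", upstream_style)
```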

------
johnchristopher
There are things I can't uninstall or install because some need python2 and
others python3, and uninstalling python2 or python3 removes one or the other
(notably calibre, and I just read the author won't move to py3 and that he can
maintain py2 anyway).

For instance I can't install mycli without removing calibre.

Of course it's most likely possible to get around that with virtualenv or
whatever is used at the moment but that's a huge cognitive load just to run
some apps.

------
misnome
This update seems to have broken things all over. Our CI conda-based builds
all broke overnight as Conda-forge deployed, and then undeployed v50.

------
w_t_payne
And this is why I vendor every single dependency manually and build them all
from source.

------
XzAeRosho
How can Python dependency management be improved at this point? Should we hold
our breath for Python 4 to solve it?

~~~
corty
Just don't use pip, use apt. What isn't packaged isn't worth it anyway.
Non-distro package managers, a dozen per language, are a disease.

~~~
jnwatson
apt is completely inappropriate unless you're using 3 year old dependencies.
Ubuntu 20.04 (released in April) doesn't have Python 3.8, which was released
in October 2019.

~~~
pritambaral
This is patently false; the default python3 on Ubuntu 20.04[1] is 3.8.2. A
separate package python3.8 is also available in built-in repositories for all
supported versions of Ubuntu going back all the way to 18.04[2].

1:
[https://packages.ubuntu.com/focal/python3](https://packages.ubuntu.com/focal/python3)

2:
[https://packages.ubuntu.com/search?keywords=python3.8](https://packages.ubuntu.com/search?keywords=python3.8)

------
pyuser583
Well, shit.

