1. Python has a low barrier to entry and is a popular first language, so most users don't realise how bad it is.
2. Python was originally popular with old-school sysadmins, Debian types, and a lot of its package management is based around that philosophy of carefully hand-tended servers shared by multiple users.
The pile of debootstrap chroot stuff that you get told to use works pretty similarly to virtualenv. It's pretty clunky and bolted-on, but so is the Python version.
Care to elaborate on what is bad? How bad is it compared to other languages? (Assuming you always accept some tradeoffs when moving from one ecosystem to another.)
It's painfully archaic compared to languages that have all this sorted out.
Pip is a recipe for disaster, as indicated by the huge amount of churn in the Python packaging sphere. It's consistently the worst part of my day whenever I pick a Python project back up. Conflicting dependencies, non-deterministic installs, etc.
I used to cope with this crap fest until I tried Elixir and experienced the beauty of modern package management and build tools. One tool, Mix, that handles task running and dependency management with a proper resolver.
I honestly think even Node has a better package management and tooling story.
Also: virtual environments are a hack and a pain. Python is moving forward on this with the recent PEP for a local __pypackages__ folder (PEP 582), but we're still a long way from adoption.
All of this stuff is painful for beginners. Virtual environments might seem easy to us, but I've had ridiculous amounts of trouble explaining why they're necessary to beginner developers. Then you have to explain `poetry` or `pip freeze`.
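For what it's worth, the minimal workflow I end up walking beginners through looks roughly like this (the `.venv` directory name and the `requests` package are just placeholders):

    # create an isolated environment for this project only
    python3 -m venv .venv
    source .venv/bin/activate        # on Windows: .venv\Scripts\activate

    # installs now go into .venv, not the system Python
    pip install requests

    # record exactly what ended up installed, pinned versions and all
    pip freeze > requirements.txt

And even that glosses over why any of it is necessary, which is the part beginners actually struggle with.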
Even I have trouble coming back to older projects of mine and I'm an experienced Pythonista: I usually waste an hour or two trying to sort the pip dependencies out with whatever conflicts the resolver comes up with this time.
Python package management is not okay, we're all just used to coping with it. Other languages put it to shame.
Things continue to improve. For example, pip's resolver has been deterministic since pip 20.3 in November 2020. Bringing something minimal into Python itself that provides deterministic builds (e.g. pip-tools) would help packaging a lot.
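Something like pip-tools already gets you most of the way there today, if I understand it right: you keep only your direct dependencies in a requirements.in and let it generate the fully pinned file (the commands below are pip-tools'; the flask dependency is just an example):

    pip install pip-tools

    # requirements.in holds direct dependencies, loosely constrained
    echo "flask>=2.0" > requirements.in

    # resolve and write an exhaustively pinned requirements.txt
    pip-compile requirements.in

    # make the current venv match the pinned file exactly
    pip-sync requirements.txt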
- What you get when you import a library depends on state that's scattered all over the system: system-managed packages, pip-managed system-global packages, pip-managed per-user packages, which virtualenv is currently active, which directory you're currently in, which directory the program you're running is in, whatever it is that conda does... (see the sketch after this list).
- There's no concept of reproducible builds or dependency pinning. There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications. If you're developing a library, you'd better get used to having your transitive dependencies changed on you all the time. And since the whole ecosystem is built that way, even if you use some tool that lets you make stable releases of your library, that doesn't help you develop at all.
- Virtualenvs are stateful and attached to whatever terminal you were in at the time. This interacts hilariously with the previous point: if you accidentally run "cd myproject && pip install -r requirements.txt" in the wrong terminal, you permanently, irreversibly fuck up that virtualenv. All you can do is wipe it out and try to recreate it - but, per the previous point, it probably won't come out the same as before.
- You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...
- There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPI. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.
It's really a lot worse than other languages. If you build a real system (like, a couple of libraries and applications) in another language (not, like, C/C++ - but even Perl or TCL will prove the point) and then come back to Python, you'll find yourself hating it all the time.
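If you want to see the first point in that list for yourself, here's a quick check (requests is just a stand-in for any library you have installed in more than one of those places):

    # every location this interpreter will search, including user-site and any active venv
    python -m site

    # which copy of a given library actually wins
    python -c "import requests; print(requests.__file__)"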
> There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications.
What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.
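Concretely, the loop I mean is something like this (requests is just an example; the --hash entries have to be added to requirements.txt by hand or by a tool such as pip-compile --generate-hashes, since freeze won't write them):

    # upgrade inside the venv, then re-freeze to capture the new pins
    pip install --upgrade requests
    pip freeze > requirements.txt

    # with --hash entries present in the file, pip verifies every download
    pip install --require-hashes -r requirements.txt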
> per the previous point, it probably won't come out the same as before.
I don't see how this follows. If you have frozen dependencies, it will.
> You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...
No, you use venv for that. venv has been available since Python 3.3, in 2012, and ensurepip (which bootstraps pip into it) since 3.4, in 2014. If you're using a version of Python that's 10 years old, I don't really know what to say.
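To be concrete, each interpreter you have installed can stamp out its own venv, and the venv remembers which interpreter created it (this assumes python3.10 and python3.12 are both already on your PATH):

    python3.10 -m venv .venv-py310
    python3.12 -m venv .venv-py312

    .venv-py312/bin/python --version   # reports the 3.12 interpreter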
> What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.
You need to know, and maintain, both what you intended to depend on and what you physically ended up depending on. So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it. And so you can't really use it.
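For contrast, this is roughly the two-file layout I mean. Pip itself doesn't give it to you; the closest approximation I know of is a third-party tool like pip-tools (flask>=2.0 is just a stand-in dependency):

    # requirements.in plays the role of Gemfile: what you intend to depend on
    #     flask>=2.0
    # requirements.txt plays the role of Gemfile.lock: what you actually resolved to

    pip-compile requirements.in             # (pip-tools) regenerate the lock from the intent file
    pip-compile --upgrade requirements.in   # deliberately re-resolve to newer allowed versions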
> So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it.
There's plenty of legitimate criticism of Python in general and pip in particular, but much of yours seems to be criticism of things that are factually untrue.