> Python distribution and packaging is just fundamentally horribly broken
It's clearly not, because most people successfully use it fine.
Dissatisfaction with distribution and packaging is often a matter of user expectations vs. the actual problem.
The user expectation is that Python is a high-level language and will run the same across different machines regardless of OS and hardware.
The actual problem is Python is a glue language often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting. So people can end up moving just the glue of the project first and then expecting everything else to just work.
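To make the "sensitive to how they were compiled" point concrete, here is a minimal sketch using the stdlib FFI module ctypes (the C math library is just an illustrative target, nothing here is specific to any one package):

```python
import ctypes
import ctypes.util

# Even locating the C math library is platform-specific: this returns
# something like "libm.so.6" on Linux, a dylib path on macOS, and
# None on Windows, where the C runtime is organized differently.
path = ctypes.util.find_library("m")
if path is None:
    raise OSError("no standalone C math library on this platform")

libm = ctypes.CDLL(path)
# ABI details (argument and return types) must be declared by hand;
# get them wrong and you corrupt memory instead of getting an error.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # 1.0
```

Packages built on this kind of binding are exactly the ones that work on the author's machine and then fail when only the glue is moved somewhere else.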
Things are getting better (e.g. https://lukeplant.me.uk/blog/posts/python-packaging-must-be-...), a lot of people have put a lot of work into standards and improving the ecosystem, but it is a hard problem, and most other popular languages simply don't interface with native code nearly as much.
> It's clearly not, because most people successfully use it fine.
Well, no, because it's perfectly possible to successfully use a horribly broken system. I use Python "successfully", it just means I have spent probably literal weeks of my life fighting pip and virtualenv and relative imports, and finding I need flags like `--config-settings editable_mode=compat`.
> Dissatisfaction with distribution and packaging is often a matter of user expectations vs. the actual problem.
Ha yes, I expect it to work reliably and simply and it doesn't!
> The actual problem is Python is a glue language often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting.
That's completely irrelevant to the kind of problems I was talking about. I run into issues with Python failing to compile C libraries relatively rarely! Even compiling Python itself seems to work quite well (maybe not surprising since that's one of the only ways to get a new version on Linux).
It's all the packaging infrastructure that's a mess - pip, virtualenv, setuptools - and the module import system is a total disaster too. You don't see questions like this for Go:
> It's all the packaging infrastructure that's a mess - pip, virtualenv, setuptools - and the module import system is a total disaster too. You don't see questions like this for Go:
I think you'll find the further you delve into it, the less distinct the problems are. A lot of the issues with the module import system, pip, virtualenv, setuptools, etc. exist because they are designed to support a vast range of things, from being depended on to interact with system libraries, to downloading sdists and compiling arbitrary languages, etc.
Though the specific example you linked was largely solved with Python 3, there was a lot of confusion during the 2 to 3 transition because people had to support both behaviors, but most people don't have to think about Python 2 any more.
> Though the specific example you linked was largely solved with Python 3
I can assure you it absolutely was not.
> I think you'll find the further you delve into it, the less distinct the problems are. A lot of the issues with the module import system, pip, virtualenv, setuptools, etc. exist because they are designed to support a vast range of things, from being depended on to interact with system libraries, to downloading sdists and compiling arbitrary languages, etc.
Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.
In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.
It's similar to the situation with C/C++ - worse in some ways, better in others (at least there is a de facto package registry in Python). In some ways it's because both languages are very old and predate the idea that packaging should be easy and reliable. That's fine, but please don't pretend that it is easy and reliable now.
The question, as posted, was a confusion about how Python 2 relative importing worked, which was indeed bad. I don't know what you think you are pointing out; you haven't said, and the question *is* about Python 2.
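For anyone who didn't live through it, a minimal sketch of the change (the package and module names here are made up for illustration):

```python
# Hypothetical layout:
#   mypkg/
#     __init__.py
#     helper.py
#     main.py

# --- mypkg/main.py ---

# Python 2 allowed an implicit relative import of a sibling module:
#   import helper            # silently found mypkg/helper.py
# which could shadow a top-level "helper" package. Python 3 removed
# implicit relative imports, so you must say what you mean:
from . import helper         # explicit relative import
# or use the absolute form:
#   from mypkg import helper

# The remaining Python 3 gotcha: running "python mypkg/main.py" makes
# this file a top-level script with no parent package, so the relative
# import above raises ImportError. Run it as a module instead:
#   python -m mypkg.main
```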
> Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.
>
> In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.
I agree that a lot of the solutions were created when there was no clear understanding, or thought-out design, of what a good packaging solution would be.
But these ill-thought-out solutions arose exactly from trying to support this wide range of situations, from working on weird OSes to integrating with strange build systems.
However, what you seem to have missed is that there are now first-party standards on:
* How package installers (pip, poetry, PDM, etc.) should interact with package builders (setuptools, hatchling, etc.)
* How and where build configuration and project metadata should be stored (pyproject.toml)
Almost every popular Python package tool now supports these standards, meaning they all interact with each other pretty well.
Dropping all legacy configuration, and updating the standards to support edge cases is still a long road, but it is a road being travelled and things are getting better.
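To give a feel for that installer/builder handshake, here is a minimal sketch of what a PEP 517 frontend does (build isolation, installing `build-system.requires`, and the optional `module:object` backend form are all omitted; `hatchling.build` is just one example backend name):

```python
import importlib
import os
import tomllib  # stdlib since Python 3.11

# The frontend (pip, etc.) reads the [build-system] table that a
# modern project declares in pyproject.toml, e.g.:
#   [build-system]
#   requires = ["hatchling"]
#   build-backend = "hatchling.build"
with open("pyproject.toml", "rb") as f:
    build_system = tomllib.load(f)["build-system"]

# Import the named backend and call the standard hook from PEP 517.
# Every compliant backend, whatever its internals, exposes this.
backend = importlib.import_module(build_system["build-backend"])
os.makedirs("dist", exist_ok=True)
wheel_name = backend.build_wheel("dist")
print("built", wheel_name)
```

That single shared hook is why any of the installers above can build a project that uses any of the builders above without knowing anything about its internals.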
I don't disagree with what you're saying but that article seems odd. It's just a story of someone installing something once that didn't break immediately. Not only is it anecdotal, it doesn't even seem to confirm whether the pip story is getting better or if they just got lucky.
I agree, but there is no large scale study on this.
As someone who has managed Python distributions in a large company and who triages issues on the pip GitHub issue page, my anecdotal experience is that things are getting better.
The only hard statistic I can point to is that the number of top packages on PyPI offering wheels has gone up substantially, and is close to 100% among the top 500.
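That statistic isn't hard to spot-check yourself; here is a rough sketch against PyPI's public JSON API (it only inspects the latest release of each package):

```python
import json
import urllib.request

def has_wheel(package: str) -> bool:
    """Return True if the latest release on PyPI ships at least one wheel."""
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # "urls" lists the files of the latest release; wheels have
    # packagetype "bdist_wheel", source distributions "sdist".
    return any(f["packagetype"] == "bdist_wheel" for f in data["urls"])

for name in ["numpy", "requests", "pip"]:
    print(name, has_wheel(name))
```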
They use it fine by using Docker. So many major Python repos on GitHub come with a Dockerfile compared to, say, NodeJS. It's unfortunate, but having dealt with Python packages before, I don't blame them.
Lots of analysts, data scientists, traders, engineers, etc., use Python third party packages successfully, and have never touched, or maybe even heard of, Docker.
And yeah, in general there are significantly fewer NodeJS third party packages interfacing with code that directly depends on OSes and hardware. Python has many third party packages, older than NodeJS itself, that depend on foreign function interfaces, Win32 COM APIs, talking directly to graphics shaders, etc.
Python packages are painful even if nothing native is involved. There are NodeJS packages that rely on native code too; the difference is the packaging system is a lot simpler to use, though it's starting to get worse with the `import` vs `require` nonsense and TypeScript.