
Which things there sound like steps backwards?

IIUC there are three "problems" raised here:

* With PEP 517 you can only get the files as a wheel or a tarball (I assume the latter refers to the sdist). The author dislikes having the compression done just to decompress again immediately. Fair enough, but this sounds pretty minor to me. Would there really be much time spent in this compress/decompress? Especially relative to everything else a Gentoo update entails? Isn't this just a feature that can be added later, bringing a minor speed-up?

* PEP 517 doesn't support data_files. This is a surprising problem for a distribution maintainer to raise. I thought the whole objection to data_files was that it allows python package installations to stick files in arbitrary places, i.e. do stuff that should be the sole preserve of the system package manager. Why does the author want it to be allowed?

* distutils and "setup.py install" deprecation. I don't see the alternative here. Yes, it requires downstream changes, but proliferation of mechanisms without standardisation is the big problem with python packaging. Either it continues to be supported or it gets deprecated and removed. My vote would certainly be for the latter.
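On the first point, it's worth noting why a consumer only ever gets a wheel or an sdist: PEP 517's mandatory backend interface is exactly two hooks, each of which must produce an archive. A rough sketch (the hook names and signatures are from the PEP; the bodies are placeholder stand-ins, and the "example" project name is invented):

```python
# Minimal sketch of the two mandatory PEP 517 backend hooks.
# Hook names/signatures follow PEP 517; bodies are placeholders.
import os
import tarfile
import zipfile


def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    """Build a .whl (a zip archive) into wheel_directory; return its basename."""
    name = "example-1.0-py3-none-any.whl"
    with zipfile.ZipFile(os.path.join(wheel_directory, name), "w") as whl:
        whl.writestr("example/__init__.py", "")  # stand-in for real contents
    return name


def build_sdist(sdist_directory, config_settings=None):
    """Build a .tar.gz sdist into sdist_directory; return its basename."""
    name = "example-1.0.tar.gz"
    with tarfile.open(os.path.join(sdist_directory, name), "w:gz"):
        pass  # a real backend would add the source tree here
    return name
```

Since both hooks hand back a finished archive, a distro build that wants the plain file tree has no choice but to unpack it again.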

Nothing in the article even claims the changes aren't progress. The objections are about particular hassles that progress is generating for the author as a distro maintainer. And maybe they are valid but as far as I can see they are either quite minor or unavoidable if the state of the world is to improve.

I've been using python for ~20 years and it finally seems to be on a path towards some sort of sanity as far as packaging is concerned. Not there yet, by a long, long way, but heading in the right direction at least.

Edit: Oh, and one other problem: The lack of a standard library toml module when PEP 517 mandates the use of TOML. That is indeed mad. I don't understand how the PEP was approved without such a module being added.




> I thought the whole objection to data_files was that it allows python package installations to stick files in arbitrary places, i.e. do stuff that should be the sole preserve of the system package manager.

But those "arbitrary places" include things like the standard places where man pages and other documentation go, the standard places where shared data (and things like example programs) go, etc. Without data_files the Python installation tools give you no way to provide any of these things with your Python library.
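For reference, this is roughly what the (since-discouraged) data_files keyword looked like in a setup.py. The target directories on the left are interpreted relative to the installation prefix (or the virtualenv root); the project and file names below are invented for illustration:

```python
from setuptools import setup

setup(
    name="example",  # invented project name
    version="1.0",
    packages=["example"],
    # Each tuple maps an install directory (relative to the prefix)
    # to a list of source files to copy there at install time.
    data_files=[
        ("share/man/man1", ["docs/example.1"]),
        ("share/doc/example", ["README.rst"]),
    ],
)
```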


Man pages are a Linux thing, while python packages are cross-platform and can be installed on Windows.

It seems out of scope for a python packager to include such files. As a user, I would also not be very happy to see a pip install start dropping files all over my system. How would that behave in a virtualenv anyway? Sandboxing should be a key feature of a package manager.


> Man pages are a Linux thing, while python packages are cross-platform and can be installed on Windows.

Windows has help files, which are its version of man pages. So an installer that installed man pages on Linux would be expected to install the corresponding help files on Windows.

> dropping files all over my system

I said no such thing. I said there are certain designated places in a filesystem where certain common items like man pages (or Windows help files) live. Not to mention system-wide configuration files (which on Linux go in /etc), and I'm sure there are others I've missed. An installer that is not allowed to access those places does not seem to me to be a complete installer. Linux package manager installers certainly put files in such places. Windows installers do it too (although the specific items and places are different).

> How would that behave in a virtualenv anyway?

A complete virtualenv would have its own copies of the above types of files, in the appropriate places relative to the root of the virtualenv.
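That is in fact how the standard install scheme behaves: inside a virtualenv, sys.prefix points at the venv root, and sysconfig resolves the data and scripts directories underneath it. Exact layouts vary by platform (and by distro patches), so treat the printed paths as illustrative:

```python
import sys
import sysconfig

# The active install scheme maps logical locations to concrete
# directories; inside a virtualenv these resolve under the venv
# root, because sys.prefix is the venv root there.
paths = sysconfig.get_paths()
print("prefix: ", sys.prefix)
print("data:   ", paths["data"])     # typically equals sys.prefix
print("scripts:", paths["scripts"])  # e.g. <prefix>/bin on POSIX
```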



