
For packaging pains, do you mean the fact that there is no single agreed-upon packaging/provisioning method (easy_install, pip, conda, apt, etc.), or that these solutions are terrible?



- too many config files

- no clean way to freeze / update

- no good way to distribute a standalone executable

- binary packaging is still a mess

- big libs like GUI frameworks are still terrible to depend on

- compilation solutions like the awesome nuitka require a lot of manual work

- too many things to know

For a beginner it's hell. I haven't been a beginner in 10 years, but I train people in Python for a living, and I know it's a pain point.


May I add:

- pip and setuptools maintainers break everything quite often*

- basically no mobile support whatsoever for packaging

(* quite often you say, skeptically? Well, they completely screwed it up twice this year already. That's twice more than any other package manager I use)


I AM a Python beginner, and I was amazed recently to discover that there was no way for me to assemble an executable for an OS I'm not on that didn't involve buying a computer running said OS.

I'm on Win10, and pyinstaller made it easy to create an .exe for Windows, but I could find no way on Earth to assemble an executable for Mac.
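Roughly, all it took on the Windows side was something like this (the script name is just a placeholder):

```
pip install pyinstaller
pyinstaller --onefile my_script.py   # the bundled .exe lands in the dist/ folder
```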

As it happened I just asked a tech-savvy colleague on a Mac to use pyinstaller on their machine, and it worked, but still - I'm really impressed with Python generally, but this seems like a surprising and considerable oversight.


>I AM a Python beginner, and I was amazed recently to discover that there was no way for me to assemble an executable for an OS I'm not on that didn't involve buying a computer running said OS.

Are you a beginner in general, maybe? Because that's either arcane, damn difficult to set up, or nigh impossible in tons of other languages too -- even ones that actually do produce executables (which Python by default does not).

(And of course you can always just create the executable in a VM -- no need to buy a computer running the other OS.)


20+ years of working in the tech industry and adjacent, so not so much, no.

But my work has, until this year, rarely involved compiling software for other people to use. So that may be the noobness you're detecting.


Basically, the reason I asked is that you seemed to imply you expected cross compiling to be easy.

Whereas, from my experience, it's usually a pain in the ass to set up.


To be fair, Python programs aren't really designed to be compiled into a single executable. They are designed to be run through an interpreter. Therefore, multi-target compiling isn't really something the core developers focus on. The fact that you were unable to compile for multiple targets is more a failing of whatever third-party tool you were using.


Makes perfect sense, yes. And that's what I use Python for 99% of the time.

I just thought it was interesting quite how hard it turned out to be! As someone who's usually on the games/VR side of things, I may have taken some of the magic that Unity, for example, uses to compile to all sorts of platforms for granted.


If you're into building games and want a language that can easily cross-compile to multiple platforms, you may want to take a look at Nim: https://nim-lang.org. You'll get better performance than with Python as well. It's definitely a little less intuitive than Unity, though.


Not disagreeing with your main point, but it has historically been possible to run Mac OS in a VM. It might take some tinkering that is definitely not as straightforward and easy as with some other OSes, and you might be breaking Apple's Terms of Service, which require Mac OS to run only on Apple hardware, but it has usually been possible.

Other operating systems apart from Mac OS should be much easier to run in a VM, and where they're not, I'd blame that more on the creators of the OSes themselves than on Python.


Ah, interesting. I did think of using a VM but my understanding was that Mac OS VMs were an absolute pain in the ass to set up ;)


This is one of the reasons golang has a growing share of popularity right now: a lot of people want to hit the linux compile target without a VM.
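In Go it's basically just two environment variables (unix-style shell shown here, and it assumes your code avoids cgo):

```
# build a Linux binary from macOS or Windows, no VM needed
GOOS=linux GOARCH=amd64 go build -o myapp
```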

But hey, you can use VMs to hit a lot of these targets.


All but conda are pretty terrible. Conda seems to work, most of the time at least. The others seem to find interesting and novel ways to bork your system all the time. At one point a simple letsencrypt refresh ended up re-installing some critical python component, eventually resulting in a full re-install of a server. That probably could have been avoided, but the feeling of ease and convenience with which that simple operation resulted in a trashed server hasn't left me yet.


I know it doesn't address the root of the problem, but if you are running Python in production, you should always be using a virtualenv. A virtualenv is essentially a separate Python runtime with its own packages and interpreter. You can have as many virtualenvs as you like on the same machine and have them all configured completely separately.
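A minimal sketch of what that looks like in practice (the paths here are just examples):

```
python3 -m venv /opt/myapp/venv          # create an isolated environment
source /opt/myapp/venv/bin/activate      # this shell now uses the venv's interpreter
pip install -r requirements.txt          # packages install into the venv, not system-wide
```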

> At one point a simple letsencrypt refresh ended up re-installing some critical python component, eventually resulting in a full re-install of a server.

The letsencrypt developers explicitly recommend using a virtualenv to run the certbot script.

The best part of virtualenvs is that you can almost use them like little containers. For instance, let's say you have a script running a web interface and another script that performs data calculations, both on the same server.

Your web application can have its own virtualenv and be listening for proxied requests from an nginx instance. The data processing script can be running in its own virtualenv as well and listening on a local unix socket for incoming data to process. The data processing script is 100% independent of the web application script and vice-versa. They each have their own interpreter and dependencies. Hell, you could even run your data processing script in Python 2 and your web application in Python 3 if you needed to.
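A rough sketch of that setup (the script names, paths and packages are all made up):

```
# one virtualenv per service, each with its own interpreter and dependencies
virtualenv -p python3 /srv/webapp/venv
virtualenv -p python2 /srv/dataproc/venv

/srv/webapp/venv/bin/pip install flask gunicorn    # hypothetical web stack
/srv/dataproc/venv/bin/pip install numpy           # hypothetical processing deps

# run each script with its own interpreter: nginx proxies to the web app,
# while the data processor listens on a local unix socket
/srv/webapp/venv/bin/python webapp.py
/srv/dataproc/venv/bin/python dataproc.py
```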


> The letsencrypt developers explicitly recommend using a virtualenv to run the certbot script.

Yes, but virtualenv is not always available, and updating a certificate should not require a large amount of software to be installed on the sly on a machine; it should just upgrade the certificate and be done with it.


How is virtualenv not always available?

```
pip install virtualenv
virtualenv -p python3 venv
```

If you have a heavily locked-down server or something, talk to the administrator.

> updating a certificate should not require a large amount of software to be installed on the sly on a machine; it should just upgrade the certificate and be done with it.

I respectfully disagree. First, updating a certificate can be done by hand without the use of any software. The point of letsencrypt is to automate the process. Automation requires software. If letsencrypt were written in C you would still need to ensure that the executable was compiled for your architecture and that you have the correct header files available in the correct locations.

I'm also not sure what you mean by "on the sly" here either. If we assume that you mean that the letsencrypt package automatically creates a virtualenv, how is this any different from postgres installing libxml2 as a dependency for example?


> pip install virtualenv
> virtualenv -p python3 venv

Yes, if everything always worked as advertised, that is how you would do it. Unfortunately it doesn't.

> First, updating a certificate can be done by hand without the use of any software.

Yes, I'm aware of that.

> The point of letsencrypt is to automate the process. Automation requires software.

Exactly. So how difficult can it be to upgrade a certificate that was already there? Nothing on that machine needed 'upgrading' over and beyond the certificate, and certainly not in an irreversible way. All the software required to do the upgrade was in place, because it worked 90 days before then.


> Yes, if everything always worked as advertised, that is how you would do it. Unfortunately it doesn't.

You'll have to explain. Any problems that arise would be problems that would arise with installing any package from any packaging system. I fail to see your point. Errors and bugs are always possible in any situation. This isn't really an argument against virtualenvs, it's an argument against software in general.

> Exactly. So how difficult can it be to upgrade a certificate that was already there? Nothing on that machine needed 'upgrading' over and beyond the certificate, and certainly not in an irreversible way. All the software required to do the upgrade was in place, because it worked 90 days before then.

When dealing with security measures such as SSL, it's extremely important that all packages involved in the process are secure and up to date. Therefore, it makes sense to me that an SSL library would want to ensure that all of its dependencies have the latest bugfixes and security patches.


Not OP, but yes. All those things are what make Python packaging more of a hassle than it should be.

Virtualenv helped a lot when it came out, but it still doesn't save you from shared system libraries and dependencies.


I'd say a little bit of both.



