Hacker News

It's trivial to control your Python stack with tools like virtualenv (which goes back to at least 2007), and it has been for ages now (I don't really remember a time when it wasn't, and I've been using Python for 20+ years).

What in particular did you find "bad" about the Python tech stack?

(I've got my gripes with Python and the tooling, but it's not this — I've got bigger gripes with containers ;-))



> What in particular did you find "bad" with Python tech stack?

Stateful virtualenvs with no way to check whether they're clean (or to undo mistakes); no lock file for version resolution (much less deterministic resolution); only a one-way pip freeze that works only for leaf projects (and poorly even then); no consistency or standards for how project management works, or even basic things like directory layout; no structured unit tests; and no way to manage any of this, because all the Python tooling is written in Python and so needs a Python environment to run, which means that even if you try to isolate pieces you always have bootstrap problems... and most frustrating of all, a community that's OK with all this and tries to gaslight you that the problems aren't actually problems.


Sounds a lot like nitpicking, and I'll demonstrate why.

With a Docker container, you can shell into it, make a couple of changes, and "docker commit" it afterwards: similarly stateful, right? You resolve both by recreating them from scratch (and you could easily chmod -w the entire virtualenv directory if you don't want it to change accidentally).
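A minimal sketch of that chmod guard (assuming a POSIX shell with python3 on PATH; "env" is just a hypothetical directory name):

```shell
# Create a virtualenv, then make its package tree read-only so a stray
# `pip install` fails loudly instead of silently mutating the environment.
python3 -m venv env
chmod -R a-w env/lib
# To intentionally add or upgrade a dependency later, re-enable writes:
#   chmod -R u+w env/lib
```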

The pattern of using a requirements.txt.in plus a pip freeze-generated requirements.txt has been around for a looong time, so it sounds like a non-idiomatic use of pip if you're having problems with version locking or non-leaf projects.
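A sketch of that workflow (the package names are made-up examples; the actual install and freeze steps need network access, so they're shown as comments):

```shell
# requirements.txt.in lists only your direct, loosely-pinned deps:
printf 'requests\nflask>=3\n' > requirements.txt.in
# Then, inside the project's virtualenv, you would run:
#   pip install -r requirements.txt.in
#   pip freeze > requirements.txt
# requirements.txt becomes the lock file: every transitive dep, exact pins.
```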

As for directory layout, it's pretty clear it's guided by Python import rules: those are tricky, but once you figure them out, you know what you can and should do.

Can you clarify what you mean by "structured unit tests"? Python does not really limit how you organize them, so I am genuinely curious.

Sure, a bootstrapping problem does exist, but you rarely need an exact version of Python or of any dev tool just to get a virtualenv off the ground; after that, you can easily control all the deps inside it (again, a requirements-dev.txt.in + requirements-dev.txt pattern will help you).

And a bunch of new dev tools for Python, written in Rust, have sprung up recently, so even that points to a community that constantly works to improve the situation.

I am sorry that you see this as "gaslighting" instead of as an opportunity to learn why someone else did not have the same negative experience.


> With docker containers, you can shell into it, do a couple of changes and "docker commit" it afterwards: similarly stateful, right?

I guess theoretically you could, but I don't think that's part of anyone's normal workflow. Whereas it's extremely easy to run "pip install" from project A's directory with project B's virtualenv active (or vice versa). You might not even notice you've done it.
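That mistake is easy to reproduce; a sketch (venv-a and venv-b are hypothetical names):

```shell
python3 -m venv venv-a
python3 -m venv venv-b
# "Activating" an env is essentially just a PATH change:
PATH="$PWD/venv-a/bin:$PATH"
# From here on, `pip install` targets venv-a no matter which project
# directory you cd into -- nothing warns you that it's the wrong env.
command -v pip
```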

> You resolve both by recreating them from scratch

But with Docker you can wipe the container and start again from the image, which is fixed. You don't have to re-run the Dockerfile and potentially end up with different versions of everything, which is what you have to do with virtualenv (you run pip install and get something completely different from the virtualenv you deleted).

> you could easily chmod -w the entire virtualenv directory if you don't want it to change accidentally

But you have to undo it every time you want to add or update a dependency. In other ecosystems it's easy to keep your dependencies in line with what's in the equivalent of requirements.txt, but hard to install some random other unmanaged dependency. In the best ecosystems there's no need to "install" your dependencies at all: you just always have exactly the packages listed in the requirements.txt equivalent available at runtime when you run things.

> The pattern of using a requirements.txt.in plus a pip freeze-generated requirements.txt has been around for a looong time, so it sounds like a non-idiomatic use of pip if you're having problems with version locking or non-leaf projects.

I've literally never seen a project that does that. And even if you do that, it's still harder to work with because you can't upgrade one dependency without unlocking all of your dependencies, right?
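(For what it's worth, pip-tools' pip-compile does support re-resolving a single pin via --upgrade-package, if you adopt that workflow; with bare pip freeze the usual fallback is editing the lock file by hand. A sketch of the hand-editing route, with made-up file contents and version numbers:)

```shell
# A hypothetical frozen lock file:
printf 'certifi==2024.2.2\nrequests==2.31.0\n' > requirements.txt
# Bump exactly one pin, leaving every other line untouched:
sed -i 's/^requests==.*/requests==2.32.3/' requirements.txt
cat requirements.txt
```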

> As for directory layout, it's pretty clear it's guided by Python import rules

I don't mean within my actual code, I mean like: where does source code go, where does test code go, where do non-code assets go.

> Can you clarify what you mean by "structured unit tests"?

I mean, like, if I'm looking at a particular module in the source code, where do I go to find the tests for that module? Where's the test-support code as distinct from the specific tests?
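(For reference, the closest thing to a convention is the pytest-style layout, but nothing in Python enforces it; a sketch with hypothetical names:)

```
myproject/
    src/mypkg/parser.py
    tests/test_parser.py    # tests for parser.py, matched by name only
    tests/conftest.py       # shared fixtures / test-support code
    pyproject.toml
```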

> rarely do you need exactly a particular version of Python and any of the dev tools to be able to get a virtualenv off the ground

Virtualenv being available at all is a relatively recent development, so you already have a fractal problem. Having an uncontrolled way of installing your build environment is another of those things that's fine until it isn't.

> And there's a bunch of new dev tools springing up recently that are written in Rust for Python

Yeah, that's the one thing that gives me some hope that there might be light at the end of the tunnel, since I hear they mostly ignore all this idiocy (and, e.g., avoid having user-facing virtualenvs at all) and just do the right thing. Hopefully once they catch on we'll see Python start to be OK without containers too, and maybe the container hype will die down. But it's certainly not the case that everything has been fine since 2007; quite the opposite.



