
What do you do when you accidentally run pip install -r requirements.txt with the wrong .venv activated?

If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?

If your answer is "ignore it", what do you do when you try to run the project on a new system and find half the imports are missing?

None of these problems are insurmountable of course. But they're niggling irritations. And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.




>What do you do when you accidentally run pip install -r requirements.txt with the wrong .venv activated?

As someone with a similar approach (not using requirements.txt, but using all the basic tools and not using any kind of workflow tool or sophisticated package manager), I don't understand the question. I just have a workflow where this isn't feasible.

Why would the wrong venv be activated?

I activate a venv according to the project I'm currently working on. If the venv for my current code isn't active, it's because nothing is active. And I use my one global Pip through a wrapper, which (politely and tersely) bonks me if I don't have a virtual environment active. (Other users could rely on the distro bonking them, assuming Python>=3.11. But my global Pip is actually the Pipx-vendored one, so I protect myself from installing into its environment.)
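Roughly this shape, as a sketch (illustrative only - the real wrapper does a bit more, and the wording is different):

    # shell function that shadows pip and refuses to run without an active venv
    pip() {
        if [ -z "$VIRTUAL_ENV" ]; then
            echo "no virtual environment active; refusing to touch anything" >&2
            return 1
        fi
        command pip "$@"
    }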

You might as well be asking Poetry or uv users: "what do you do when you 'accidentally' manually copy another project's pyproject.toml over the current one and then try to update?" I'm pretty sure they won't be able to protect you from that.

>If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?

If it did somehow happen, that would be the approach - but the code simply wouldn't have those errors. Because that venv has its own up-to-date listing of requirements; so when I recreated the venv, it would naturally just contain what it needs to. If the listing were somehow out of date, I would have to fix that anyway, and this would be a prompt to do so. Do tools like Poetry and uv scan my source code and somehow figure out what dependencies (and versions) I need? If not, I'm not any further behind here.
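(Concretely, recreating is just the usual three lines, assuming a requirements.txt sitting in the project root:

    python -m venv .venv
    . .venv/bin/activate
    pip install -r requirements.txt

and the result contains exactly what the listing says, nothing else.)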

>And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.

I spent this morning exploring ways to install Pip 0.2 in a Python 2.7 virtual environment, "cleanly" (i.e. without directly editing/moving/copying stuff) starting from scratch with system Python 3.12. (It can't be done directly, for a variety of reasons; the simplest approach is to let a specific version of `virtualenv` make the environment with an "up-to-date" 20.3.4 Pip bootstrap, and then have that Pip downgrade itself.)

I can deal with someone else's (or past me's) requirements.txt being a little wonky.


> Why would the wrong venv be activated?

Because when you activate a venv in a given terminal window it stays active until you deliberately deactivate it, and one terminal window (and one venv) looks much like another.

> I activate a venv according to the project I'm currently working on.

So just manual discipline? It works (most of the time), but in my experience there's a "discipline budget"; every little niggle you have to worry about manually saps your ability to think about the actual business problem.

> "what do you do when you 'accidentally' manually copy another project's pyproject.toml over the current one and then try to update?" I'm pretty sure they won't be able to protect you from that.

Copying pyproject.toml is a lot less routine than changing directories in a terminal window. But if I did that I'd just git checkout/revert to the original version.

> the code simply wouldn't have those errors. Because that venv has its own up-to-date listing of requirements; so when I recreated the venv, it would naturally just contain what it needs to.

So how do you ensure that? pip dependency resolution is nondeterministic, dependency versions aren't locked by default and even if you lock the versions of your immediate dependencies, the versions of your transitive dependencies are still unlocked.
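For example (the pin is illustrative, but the dependency chain is real):

    # requirements.txt - only the direct dependency is pinned
    requests==2.31.0
    # the packages requests drags in (urllib3, certifi, idna, charset-normalizer)
    # aren't pinned anywhere, so two installs months apart can resolve them differently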

> If the listing were somehow out of date, I would have to fix that anyway, and this would be a prompt to do so.

Flagging up outdated dependencies can be helpful, but getting forced to update while you're in the middle of working on a feature (or maybe even working on a different project) is rather less so. Especially since you don't know what you're updating - the old versions were in the venv you just clobbered and then deleted, so you don't know which dependency is causing the error and you've got no way to bisect versions to find out when a change happened.

> Do tools like Poetry and uv scan my source code and somehow figure out what dependencies (and versions) I need? If not, I'm not any further behind here.

uv has deterministic dependency resolution with a lock file that, crucially, it uses by default without you needing to do anything. So if you wiped out your cache or something (or even switched to a new computer) you get the same dependency versions you had before. There's no venv to clobber in the first place because you're not activating environments and installing dependencies - when you "uv run myproject", uv installs the dependencies you listed in pyproject.toml on the fly, so there's no intermediate non-version-controlled thing to get out of sync and cause confusion. (I mean, maybe there is a virtualenv somewhere, but if so it's transparent to me as a user.)
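Concretely, the whole day-to-day surface is something like (project layout illustrative):

    uv add requests         # records the dependency in pyproject.toml and updates uv.lock
    uv run python main.py   # syncs the environment from uv.lock, then runs

and the lock file is just another file you commit.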

> I spent this morning exploring ways to install Pip 0.2 in a Python 2.7 virtual environment, "cleanly" (i.e. without directly editing/moving/copying stuff) starting from scratch with system Python 3.12. (It can't be done directly, for a variety of reasons; the simplest approach is to let a specific version of `virtualenv` make the environment with an "up-to-date" 20.3.4 Pip bootstrap, and then have that Pip downgrade itself.)

Putting pip inside Python was dumb and is another pitfall uv avoids/fixes.


>So just manual discipline? It works (most of the time), but in my experience there's a "discipline budget"; every little niggle you have to worry about manually saps your ability to think about the actual business problem.

>...but getting forced to update while you're in the middle of working on a feature...

I feel like trying to work on more than one project in the same session would require more such discipline.

>So how do you ensure that? pip dependency resolution is nondeterministic, dependency versions aren't locked by default and even if you lock the versions of your immediate dependencies, the versions of your transitive dependencies are still unlocked.

Ah, so this is really about lock files. I primarily develop libraries; if something breaks this way, I want to find out about it as soon as possible, so that I can advertise correct dependency ranges to my downstream.

The requirements.txt approach does, of course, allow you to list transitive dependencies explicitly, and pin everything. It's not a proper lock file (in the sense that it says nothing about supply chains, hashes etc.) but it does mean you get predictable versions of everything from PyPI (assuming your platform doesn't somehow change).
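i.e. the classic freeze round-trip:

    pip freeze > requirements.txt     # pins everything installed, transitive deps included
    pip install -r requirements.txt   # later, elsewhere: the same versions come back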

If I needed proper lock files, then I would take an approach that involves them, yes. Fortunately, it looks like I'd be able to take advantage of the PEP 751 standard if and when I need that.

>Putting pip inside Python was dumb and is another pitfall uv avoids/fixes.

Agreed completely! (Of course I was only using a venv so that I could have a separate, parallel version of Pip for testing.) Rather, the Pip bootstrapping system (which you can completely skip now, thanks to the `--python` hack) is dumb, along with all the other nonsense it's enabled (such as other programs trying to use Pip programmatically without a proper API, and without declaring it as a dependency; and such as empowering the Pip team to go so long without even a solution as functional as `--python`; and such as making lots of people think that Python venv creation has to be much slower than it actually needs to be).
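(For anyone who hasn't run into it, the hack is roughly:

    pip --python /path/to/venv/bin/python install somepackage

i.e. one copy of Pip managing an environment that needn't contain Pip at all.)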

I'll be fixing this with Paper, too, of course.


> I feel like trying to work on more than one project in the same session would require more such discipline.

We all know that multitasking reduces productivity. But business often demands it (hopefully while being conscious of what it's costing).

You also don't have to be working in the "same session" to trip yourself up this way - "this terminal tab still has the venv from what I was working on yesterday/last week" is a way I've had it happen.

> I primarily develop libraries; if something breaks this way, I want to find out about it as soon as possible, so that I can advertise correct dependency ranges to my downstream.

If you want to find out as soon as possible, better to have a systematic way of finding out (e.g. a daily "edge build") than pick up new dependencies essentially at random.

> The requirements.txt approach does, of course, allow you to list transitive dependencies explicitly, and pin everything.

It allows you to, but it doesn't make it easy or natural. Especially if you're making a library, you probably don't want to list all your transitive dependencies or pin exact versions in your requirements.txt (at least not the one you're publishing). So you end up with something like two different requirements.txt files: a frozen one you use for development, an unfrozen one you switch to for release or when you need to add or change dependencies, and you regenerate the frozen one every so often (sketched below). None of which is impossible, but it's all tedious and error-prone and there's no real standardisation (so e.g. even if you come up with a good workflow for your project, will your IDE understand it?).
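The two-file dance ends up looking something like this (file names illustrative):

    pip install -r requirements.in      # requirements.in: the loose, hand-edited ranges
    pip freeze > requirements-dev.txt   # pinned snapshot used for day-to-day development
    # ...then regenerate requirements-dev.txt whenever requirements.in changes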

> Fortunately, it looks like I'd be able to take advantage of the PEP 751 standard if and when I need that.

That's a standard written in response to the rise of uv, and it still hasn't been agreed to, much less implemented, much less turned on by default (and unfortunately, by the time you realise you need a lock file, the one you need is usually the one your tool would have generated back when you first ran it, not the one it would generate now - so an optional lock file is of limited effectiveness). I don't think it justifies a "python packaging has never been a problem" stance - quite the opposite, it's an acknowledgement that pre-uv python packaging really was as broken as many of us were saying.


>None of which is impossible, but it's all tedious and error-prone and there's no real standardisation (so e.g. even if you come up with a good workflow for your project, will your IDE understand it?).

I mean, my "IDE" is Vim, and I'm not even a Vim power-user or anything.

People gravitate towards tools according to their needs and preferences. My own needs are simple, and my aesthetic sense is such that I strongly prefer to use many small tools instead of an opinionated, over-arching workflow tool. Getting into the details probably isn't productive any further from here.

>That's a standard written in response to the rise of uv

I know it looks this way given the timing, but I really don't think that's accurate. Python packaging discussion moves slowly and people have been talking about lock files for a long time. PEP 751 has seen multiple iterations, and it's not the first attempt, either. When uv first appeared, a lot of important people were taken completely by surprise; they hadn't heard of the project at all. My impression is that the Astral team liked it just fine that way, too. But it's not as if someone like Brett Cannon had an epiphany from seeing uv's approach. Poetry has been doing its own lock files for years.

>so an optional lock file is of limited effectiveness

The problem is that you aren't going to just get everyone to do everything "professionally". Python is where it is because of the low barrier to entry. A quite large fraction of Python programmers likely still don't even know what pyproject.toml is.

>I don't think it justifies a "python packaging has never been a problem" stance

That's certainly not my stance and I don't think it's the other guy's stance. I just shy away from heavyweight solutions on principle. Simple is better than complex, and all that. And I end up noticing problems that others don't, this way.


> People gravitate towards tools according to their needs and preferences.

Up to a point, but people are also nudged, not always consciously, by the reality of what tools exist in their ecosystem. The fact that Python makes "heavy" tools difficult to write and use is a significant factor in what many Python developers think is just a personal preference, IME. (I'd also argue that if you want to use lots of small tools you actually have more need for a standard format for your dependencies and your lockfile, since all the tools need to understand it).

> The problem is that you aren't going to just get everyone to do everything "professionally". Python is where it is because of the low barrier to entry. A quite large fraction of Python programmers likely still don't even know what pyproject.toml is.

Yes and no. I agree that many Python programmers aren't going to change the defaults and may not even know where their tool config file is. Any approach that requires extra effort from the user is not going to succeed. That's exactly why I think lockfiles need to be on by default, which is not something that has to make things harder for users (e.g. npm is a similarly beginner-first ecosystem but they have lockfiles and I've never seen it cited as something that makes it harder to get started or anything like that).



