I highly, highly recommend uv. It resolves & installs dependencies incredibly fast, and the CLI is very intuitive once you've memorized a couple commands. It handles monorepos well with the "workspaces" concept, it can replace pipx with "uv tool install", it handles building & publishing, and the Docker image is great: you just add a line copying the uv binary from the official image.
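A minimal sketch of that Dockerfile addition, per uv's Docker guide (tags are illustrative; pin versions in real use):

```dockerfile
FROM python:3.12-slim

# Copy the uv (and uvx) binaries from the official uv image
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
```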
I've used 'em all: pip + virtualenv, conda (and all its variants), Poetry, PDM (my personal favorite before switching to uv). uv handles everything I need in a way that makes it so I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.
I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment to the top of the script and instructs uv which packages to install when you run it. The first time you run "uv run <script_name.py>," uv installs everything you need and executes the script. Subsequent executions use the cached dependencies so it starts immediately.
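For reference, the comment block it generates looks roughly like this (PEP 723 inline metadata; package names are placeholders):

```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "package1",
#     "package2",
#     "package3",
# ]
# ///
```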
If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install & test, I really recommend giving it a try on your next script or pet project!
The script thing is great. By the way, those 'oddly formatted' comments at the top are not a uv thing; they're an official Python metadata format (PEP 723, inline script metadata), specifically designed so that 3rd-party tools like uv can figure out and install the relevant packages.
And in case it wasn't clear to readers of your comment, uv run script.py creates an ephemeral venv and runs your script in that, so you don't pollute your system env or whatever env you happen to be in.
I generally agree, but one thing I find very frustrating (i.e. have not figured out yet) is how to deal with extras well, particularly with pytorch. Some of my machines have a GPU, some don't, and things like "uv add" end up uninstalling everything and installing the opposite variant, forcing a resync with the appropriate --extra flag. The examples in the docs do things like CPU on Windows and GPU on Linux, but all my boxes are Linux. There has to be a way to tell it "hey, I always want --extra gpu on this box," but I haven't figured it out yet.
Getting the right version of PyTorch installed to have the correct kind of acceleration on each different platform you support has been a long-standing headache across many Python dependency management tools, not just uv. For example, here's the bug in poetry regarding this issue: https://github.com/python-poetry/poetry/issues/6409
As I understand it, recent versions of PyTorch have made this process somewhat easier, so maybe it's worth another try.
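For reference, uv's PyTorch guide handles the CPU/GPU split by marking the extras as conflicting and pinning each to its own index. A sketch (extra names, torch version, and CUDA index are illustrative):

```toml
[project.optional-dependencies]
cpu = ["torch>=2.5"]
gpu = ["torch>=2.5"]

[tool.uv]
# Never resolve both variants into the same environment
conflicts = [[{ extra = "cpu" }, { extra = "gpu" }]]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```

That still means `uv sync --extra gpu` on the GPU boxes rather than a per-machine default, as far as I know.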
This happened to me too, that is why I stopped using it for ML related projects and stuck to good old venv. For other Python projects I can see it being very useful however.
As a person who doesn't often work on Python code but occasionally needs to run a server or tool, I find uv a blessing.
Before, I would beg people for help just to avoid figuring out what combination of obscure Python tools I needed. Now "uv run server.py" usually just works.
Can confirm this is all true. I used to be the "why should I switch" guy. The productivity improvement from not context switching while pip installs a requirements file is completely worth it.
That scripting trick is awesome! One of the really nice things about Elixir and its dependency manager is that you can just write Mix.install(…) in your script and it’ll fetch those dependencies for you, with the same caching you mentioned too.
Does uv work with Jupyter notebooks too? When I used it a while ago, dependencies were really annoying compared to Livebook with its Mix.install support.
uv offers another useful feature for inline dependencies, which is the exclude-newer field[1]. It improves reproducibility by excluding packages released after a specified date during dependency resolution.
I once investigated whether this feature could be integrated into Mix as well, but it wasn't possible since hex.pm doesn't provide release timestamps for packages.
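For the curious, in an inline script block it looks something like this (package and date are examples):

```python
# /// script
# dependencies = ["requests"]
#
# [tool.uv]
# exclude-newer = "2024-10-01T00:00:00Z"
# ///
```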
Uv really fixes Python. It takes it from "oh god I have to fight Python again" to "wow it was actually fast and easy".
I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)
The only very minor issue I've had is once or twice the package cache invalidation hasn't worked correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal though considering it solves so many Python clusterfucks.
Agree. I mostly do front end in my day job, and despite JavaScript being a bit of a mess lang, dealing with npm is way better than juggling anaconda, miniforge, Poetry, pip, venv, etc depending on the project.
UV is such a smooth UX that it makes you wonder how something like it wasn’t part of Python from the start.
…but we did have to wait for cargo, npm (I include yarn and pnpm here) and maybe golang to blaze the ‘this is how it’s done’ trail. Obvious in hindsight.
I had to give up on mypy and move to pyright because mypy uses pip to install missing type stubs and they refuse to support uv. In the CI pipeline where I use uv, I don't have pip installed, so mypy complains about it being missing.
Of course I can do it myself by adding the typing packages to a requirements.txt file, but then what's the point of the dev tooling? And I don't want a requirements.txt when I've already got pyproject.toml.
Once you get used to cargo from rust, you just can't tolerate shitty tooling anymore. I used to think pip was great (compared to C++ tooling).
I think their only big gap is the inability to alias general, non-Python project scripts in uv. This forces you to use something like a justfile, and it would be much more ergonomic to keep it all in uv.
The risk is obviously uv losing funding. I kinda hope the PSF has thought about this and has a contingency plan for uv winning and dying/becoming enshittified soon after.
Every time people debate the merits of languages, I put developer environment at the top of my list. Build tools, IDE, readable stack traces: those things boost productivity far more than concise list comprehensions or any gimmicky syntax thing. It's why Python always felt stone-age to me despite having such lovely semantics.
UV is such a big improvement that it moves Python from my "would use again if I had to, but would really not look forward to it" pile to my "happy to use this as needed" pile. Without disparaging the hard work by many that came before, UV shows just how much previous tools left unsolved.
I am pretty happy with Poetry for the near future. I prefer using Python interpreters installed by the Linux package manager; in the cloud I use the Python Docker images. Poetry recently added the option to install Python too, if I change my mind.
I have already set up CI/CD pipelines for programs and Python libraries. Using uv would probably save some time on dependency updates, but it would require changing my workflow and CI/CD. I do not think it is worth the time right now.
But if you are on an older setup without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch anytime it looks more appealing.
Another thing to consider in the long term is how Astral's tooling will change when they need to make money.
How does this interact with your code editor or IDE? When you edit the file, where does the editor look for information about the imported third-party libraries?
For my use cases, uv is so frictionless it has effectively made Python tolerable for me. I primarily discovered it via Simon Willison's (@simonw) blog posts[1]. I recommend his blog highly.
I want to switch from pyenv to uv, but one use case I haven't managed to figure out is whether I can replicate my pyenv setup: install a few Python versions and set one as the global default (configured in zsh). I know that for bigger projects the proper way is a virtual environment per project, but I write many mini (throwaway) Python scripts, experiments, and test repos, and setting up an environment for each would be really annoying. So far pyenv has worked well for me there, with pretty much no dependency conflicts.
Yes, but then I still have to declare all the dependencies for every tiny throwaway script. Right now I have a global Python in pyenv with tons of packages installed, and I haven't had many conflict issues, so it's been good enough for me.
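Not a global environment exactly, but for throwaway scripts `uv run --with` covers much of that use case without any setup. A sketch (package names are examples):

```sh
# Ad-hoc dependencies for a one-off script, no venv to manage by hand:
uv run --with requests --with rich script.py

# Or pin a Python version for a scratch directory (writes .python-version):
uv python pin 3.12
```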
ideally mise could be replaced entirely by uv, or at least be a thin wrapper around uv (in some ways that's already the case), but given that this article requires the custom uv-python-symlink utility, it seems uv isn't quite there yet
the reality, as I'm sure you've heard me say many times, is that I'm just not a python dev, and astral is likely always going to build a better solution around python than I ever could. they've just focused a lot more on the package manager side of things than the runtime/venv management side so far, but I suspect that will change, and given astral's velocity I doubt we'll be waiting long
and btw, mise's venv support isn't going anywhere, probably ever, but I do hope that at some point we can either let uv do the heavy lifting internally or point users to uv as a better solution
Forgot about that! Yes, that's another significant reason we use mise.
In particular, we use flask-vite and it's so nice to be able to have the right version of Node specified in the same management system as we specify the Python version. This solved a not insignificant amount of angst around FE development for me personally since I spend most of my time in the BE.
It's not like it was insurmountable before. But now, with mise, it's in that "just works" category for me.
100% agreed, it takes a task that was a 10-15min setup (depending on your environment and personal knowledge) down to a 2min thing. It just makes life easier and lowers the bar for getting started, a win in my book =)
I converted along with most of the people in this thread.
IMO no really hard problem is ever truly solved, but as can be seen in the other comments, this group of people really crushed the pain for me and *many* others, so bravo on that alone - you have truly done humanity a service.
> Maybe I installed some other things for some reason lost in the sands of time.
FWIW, I was able to confirm that the listed primary dependencies account for everything in the `pip freeze` list. (Initially, `userpath` and `pyrsistent` were missing, but they appeared after pinning back the versions of other dependencies. The only project for which I couldn't get a wheel was `python-hglib`, which turned out to be pure Python with a relatively straightforward `setup.py`.)
I decided to give uv a shot on my new machine instead of pyenv, and I've been enjoying it. Just last week I had to generate 90 slides from some data, last minute. I quickly created a project, added my dependencies (pandas, matplotlib, python-pptx), then crunched out some code. Absolutely zero friction, with a much easier-to-use set of commands in my opinion.
Sure, I've basically replaced pyenv, pyenv-virtualenv, and poetry with uv.
I can't think of any cons personally, though you might need to dig into the docs at times.
I don't know how complex your project is, but I moved my previous work from pyenv to rye (uv and rye have since merged; most work is happening on uv, so today I'd probably use uv).
And I'm currently trying to move my current work to uv. The problem is the possibility of unknown breakage for unknown users of the old project, not any known technical issue.
I'd highly recommend uv. It's just easier/more flexible. And it downloads third-party pre-compiled Python builds instead of taking the extra time and complexity to compile locally. It's much nicer, especially when maintaining an environment for a team: it just works without them having to know about it.
One downside of uv is that, unlike pyenv and rye, it doesn't shim python. The pyenv shim did give me some trouble, though rye's simpler shim didn't. The workaround is to run stuff with `uv run x.py` instead of `python x.py`.
I worked in a large-ish org where 20+ python projects, their CI/CD pipelines and their docker images were migrated from `pyenv` + `.python-version` + `requirements.txt` to `uv` in basically a single day.
If you are comfortable with `pyenv`, the switch to `uv` is basically a walk in the park. The benefit is the speed + the predictable dependencies resolution.
I’m enjoying UV a lot as well. If anyone from the Astral team sees this, I’d love to request more functionality or examples around packaging native libraries.
At this point, just thinking about updating cibuildwheel images triggers PTSD: the GitHub CI pipelines become unbearably slow, even for raw CPython bindings that don't require libc or pybind11. It's especially frustrating because Python is arguably the ultimate glue language for native libraries. If Astral's tooling could streamline this part of the workflow, I think we'd see a significant boost in the pace of both development & adoption for native and hardware-accelerated tools.
Are people seeing it work well in GPU/pydata land and creating multiplatform docker images?
In the data science world, conda/mamba was needed because of exactly this kind of thing, but there was a lot of room for improvement. We basically want lockfiles, incremental + fast builds, and multi-arch support for these tricky deps.
It works transparently. The lock file is cross-platform by default. When using pytorch, it automatically installs with MPS support on macOS and CUDA on Linux; everything just works. I can't speak for Windows, though.
I think the comparison for data work is with conda, not poetry. afaict poetry targets the "easier" case of pure Python, not native concerns like prebuilt platform-dependent binaries. Maybe poetry has gotten better, but I typically see it as a nice-to-have for local dev and rounding out the build, not the recommended install flow for natively-aligned builds.
So I'm still curious about folks navigating the 'harder' typical case of the pydata world; getting an improved option here is exciting!
If uv figures out a way to capture the scientific community by adding support for conda-forge that'll be the killshot for other similar projects, imo. Pixi is too half-baked currently and suffers from some questionable design decisions.
The key thing about conda-forge is that it's language-agnostic (rust/go/c++/ruby/java/...) and platform-agnostic (linux/macos/win/ppc64le/aarch64/...) rather than Python-only.
If you want, you can depend on a C++ or Fortran compiler at runtime and (fairly) reliably expect it to work.
That's one of the bonuses I was thinking about. It's nice if you have a subset of deps you want to share, or if one dep is actually part of the monorepo, but it does require more know-how.
Thanks. Why are `run` and `tool` separate notions? Coming from JS, we have the package.json#scripts field and everything executes via a `pnpm run <script name>` command.
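Roughly: `uv run` executes a command inside the current project's environment (the closest analog to `pnpm run`), while `uv tool run`, aliased as `uvx`, is the pipx-style side: it runs a tool in its own isolated, cached environment, independent of any project. For example:

```sh
uv run pytest     # runs inside this project's environment
uvx ruff check .  # runs ruff in its own isolated environment
```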
This is great news. I had hacked together some bash and fish scripts to mostly do this, but they still had some rough edges. I had missed that uv now has this available in preview.
The functionality of three tooling projects, namely uv, ruff (linter), and pyright (type checker), needs to merge and become mandatory for new Python projects. Together they would bring some limited sanity to Python.
In an ideal world they shouldn't have to, but in the real world it makes it easier for enterprises to adopt without friction. Adopting three tools is a threefold bigger challenge in an enterprise, but presenting them as a single tool makes adoption more palatable where it's needed most. The merging I suggest is only logical, more like a bundling.
Note that despite the title, the author is not switching from pyenv to uv, but from pip, pyenv, pipx, pip-tools, and pipdeptree to uv, because uv does much more than pyenv alone.
It replaces a whole stack, and does each feature better, faster, with fewer modes of failure.
What does uv offer over bog-standard setuptools, pip, pip-tools, and build?
Right now, the only thing I really want is dependency pinning in wheels but not in pyproject.toml, so I can pip install from source and get the latest and greatest, or pip install a wheel and get the frozen dependencies I used to build it. Right now, to get the second case, I have to publish the requirements.txt file and add the wheel to it, which works but is kind of awkward.
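Not the wheel-embedded pinning described above, but for the freezing half uv does bundle a pip-tools-style workflow. A sketch:

```sh
# Freeze the pyproject.toml dependencies into a pinned requirements file:
uv pip compile pyproject.toml -o requirements.txt

# Reproduce that exact set in another environment:
uv pip sync requirements.txt
```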
So... I am switching a project from pip to uv. I am hoping for things to be "better", but so far it's been a bit of a familiar "why does it not work as described?" journey.
Hi! I work on the Python distributions uv uses. Performance is really important to us and we're on the bleeding edge of performant Python builds. Our distributions use profile guided optimization (PGO) as well as post-link optimizations (BOLT). From my understanding, these are not enabled by pyenv by default because they significantly increase build times. It's possible there are some platform-specific build benefits, but I'd be surprised if it was significant.
I can set up some benchmarks comparing to pyenv on a couple common platforms – lately I've just been focused on benchmarking changes to CPython itself.
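For anyone wanting to compare locally, pyenv can opt in to the same optimizations via configure flags; a sketch (build time goes up substantially):

```sh
# PGO (--enable-optimizations) + LTO; BOLT is separate (--enable-bolt, CPython 3.12+)
PYTHON_CONFIGURE_OPTS="--enable-optimizations --with-lto" pyenv install 3.12
```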
For what it's worth, I didn't notice a difference between my distro-provided Python 3.12 and the one I built from source - and enabling profile-guided optimization made only a slight difference. I haven't tested with the precompiled versions uv uses, so they could be slower for some reason, but I doubt it. On the other hand, my hardware is rather old, so maybe newer machines allow for significant optimizations that the system version wouldn't have. But I still kinda doubt it.
If performance is important to you, the ancient advice to profile bottlenecks and implement the important parts in C still applies. Or you can try another implementation like PyPy.
You're being downvoted (for snark presumably) but you have a point.
During my tenures as a Python developer I've had to deal with pip, pipx, venv, pipenv, setuptools, conda, and poetry. I'd not heard of pyenv or uv until this thread (or maybe I've touched pyenv and got it confused with one of the 7 other tools I mentioned) and I'm sure there are other dependency/environment management tools floating around that I missed.
Now that I'm back to Go it's `go get` with some mise tasks. It's a serious breath of fresh air. The Python ecosystem probably won't ever catch up to npm when it comes to cranking out shiny new things but it's definitely more bleeding edge than a lot of other languages.
> I'd not heard of pyenv or uv until this thread (or maybe I've touched pyenv and got it confused with one of the 7 other tools I mentioned)
I must have seen at least a dozen threads here about uv since joining half a year ago. But maybe that's because I actively look for Python discussion (and I'm probably just generally more active here).
I wish I'd paid more attention a few years ago and thought about the packaging problem more intensely - in an alternate universe I might have released Paper before uv came out, but I didn't even really start considering these issues until mid-2023 (and of course I've had plenty of other things to distract me since then).
For what it's worth, my general thesis is that most of the problems really boil down to Pip being what it is, and a lot of the rest boil down to Setuptools being what it is.
In the past 10 years, virtualenv and pip have been perfectly fine for me. They still are. I ignored any new tooling.
uv is great so far, though I did run into a hiccup: moving from pip with a requirements.txt file to uv slowed a CI pipeline down so much that I had to revert.
The reason there have been so many is that the standard included tools (pip, venv) are not great. And the others could still use improvements.
venv and setuptools aren't really package managers.
pipx is only meant for installing dev tools per user (in isolated venvs).
pyenv does something a bit different from those tools you listed (maybe it's part of conda? I haven't tried it). It's not a dependency manager; it's a Python version manager, like nvm (node version manager).
It helps you manage downloading and compiling Python from source, lets you specify a Python version in a .python-version file, and provides a shim to find the right Python for a project (compiling it if it's not already available).
I tried pipenv, and despite being hyped for it, it had a lot of issues. Then I tried poetry, which seemed much better but was still sort of slow and sometimes got stuck updating lock files.
I haven't even tried pdm. Or the various conda package managers, since they're mainly used by scientists with lots of binary dependency needs.
Then ~~uv~~ rye came along and seemed to fix things. It replaced pip + pip-tools/pipenv/poetry. It also replaced pipx (installing Python tools in isolated venvs, then adding them to the user's ~/.local/bin). It also replaced pyenv, but instead of compiling Python, which takes a while and can be troublesome, it downloads portable builds https://astral.sh/blog/python-build-standalone (which do have some downsides/compatibility issues but are usually better than compiling Python).
It was also written in rust, so it avoided the circular venv issues that sometimes come with installing Python packages, since it's a self-contained binary (plus some shims).
Then uv came along, the projects merged, and most development is now happening in uv. Despite the rye -> uv switch, most things are pretty similar, and I feel a lot of excitement towards it. The one big difference is that there are no shims to automatically call the right Python for a project from uv; instead you run `uv run script.py`.
Astral, the folks behind uv, took over the independent Python builds and have also built the most popular Python formatter/linter these days, ruff (also written in rust, also fast). They're also looking into adding a better type checker for Python type hints.
I'd recommend trying it for your next project. I think it could become the de facto packaging/version tool for Python.
`venv` is fine. The work of just creating the virtual environment is hardly anything, and `venv` really can't screw it up. If you create environments `--without-pip`, it's actually faster than `virtualenv` and `uv venv` in my testing (because those are fundamentally doing the same thing with a little extra window dressing). What slows it down is bootstrapping Pip into the new environment, via the standard library `ensurepip`, which requires running zipped un-bytecode-compiled code from a bundled wheel.
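To make the comparison concrete (a sketch; timings vary by machine):

```sh
python -m venv .venv                # slow part: ensurepip bootstrapping pip
python -m venv --without-pip .venv  # bare environment, much faster
uv venv                             # also creates .venv without pip by default
```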
(As it happens, this is the exact topic of the blog post I'm currently in the middle of writing.)
Pip is indeed not great (understatement - there are many other things about it that I have picked on or will pick on in this series).
> venv and setuptools aren't really package managers.
Setuptools nowadays is a build backend. Long ago (when expectations were much lower), Pip had Setuptools as a dependency, and the combination was about as close to a "package manager" as anyone really cared for. But Pip today really can't be called anything like a "package manager" either - it only installs the packages, and resolves dependencies. It records some basic information about the dependency graph of the things it installed - in the sense that it preserves such information from the wheels it installs - but it doesn't really do any processing of that information. If you ask it to remove packages it's (last I checked) not very smart about that. It doesn't help you figure out what packages you need, and it doesn't help you maintain your `pyproject.toml` file.
And, of course, it doesn't create or keep track of virtual environments for you. (Pipx does the latter, wrapping Pip and `venv`, but it only installs "application" packages that define an entry point.)
Poetry and PDM are the only things listed that really belong to the same category as uv. They're not only package managers, but complete workflow tools. (Conda is a package manager, for languages other than Python as well, but it's not meant to take over your entire Python workflow in the same way.) They even wrap themselves around the process of uploading to PyPI (which seems really excessive to me; seriously, `twine` is fine too.)