Uv, a fast Python package and project manager (astral.sh)
185 points by Loic 3 days ago | 145 comments





Uv has been awesome so far. It brought the release time for Home Assistant down from an hour and a half to about 15 minutes.

Also installing all the dependencies from scratch used to give you enough time to grab lunch and drink coffee, and now we can only grab the coffee in that time!


Unfortunately the implementation in HA also broke a lot of addons from HACS for many people running HA in a container.

Classic software problem. "This thing is so great, it's never been faster. Now if only it didn't break everything for my users"

An alternative framing. "This software could be so fast, but it's bogged down by having to support every single workflow it's ever once even accidentally supported."

Off the top of my head:

* Microsoft Windows

* Microsoft Excel

* Python/Pandas

* Web browsers

They all do great things and deserve all the praise for maintaining backwards compatibility; havoc would have ensued otherwise.

That doesn't mean they shouldn't be replaced, though.


Someone needs to collect the data on UV's rise to popularity and do a sociological study, because from where I sit[0], UV just suddenly came out of nowhere, and despite being confusingly named[1] and backed by a for-profit company, has already captured the hearts and minds of Python developers. Between this and other tooling from that same company, it feels to me that, in the space of less than a year, Python has turned from a community language into a company-led one (like, e.g., Scala).

Not saying whether it's good or bad - just that it's damn curious how this played out. It feels like Python leadership got taken over by some company almost overnight; I'd love to understand how this happened.

--

[0] - Arguably an outsider to the Python community, but like everyone else, downstream of it, and always affected by it, ever since Python became the de-facto standard in Linux as the language between C and Bash.

[1] - Anyone remember libuv? Spearheading the whole async/await paradigm by powering it in Node.js, it's one of the best-known and most talked-about libraries ever.


It doesn't feel too surprising to me. The core developer tooling for Python was (/is) about on par with some of the other top languages (C, C++, Java), but lacking when compared to some of the others (Rust, Go, JS/TS). There have also been previous attempts (Pipenv/Poetry) that did prove demand but fell flat in some regards. So it feels like there wasn't any magic, trickery or "take over", but they just filled a void.

Control over uv and ruff (and now also python-build-standalone) doesn't really give Astral significant leverage over Python as a language. They are also quite involved in standardizing many parts of uv via PEPs, which I'm sure the PSF could shoot down if they were overly favoring uv's goals.

Compared to other languages, Python leadership has also always given me the impression that they'd rather focus on the language itself and keep a looser grip on the ecosystem surrounding it, even if that comes with a more fragmented ecosystem (which has already been the status quo for most of Python's existence). I think that makes it very hard to compare to, e.g., when Yarn entered the Node.js package manager scene.

It also doesn't feel too surprising that there is little resistance against commercial involvement. Commercially-led programming languages were quite the norm in the past and also seem to be having a comeback (or trying to). In some languages where your median developer has more years of experience and they likely grew up with languages which are more "grassroots" the resistance would be higher. And while I would assume that the people at the PSF also fit that profile, in a high-influx language such as Python, the majority of the Python developers do not.


>There have also been previous attempts (Pipenv/Poetry) that did prove demand but fell flat in some regards.

In my book, this happened partly because of disagreement over what tasks the tooling actually needs to do (and because users' needs differ wildly), and a bit because of slow discussions around standards and development of those standards - but mostly because of a poor foundation. It's not that Uv's installer is doing amazing things; it's more that Pip is just badly designed. The internals aren't well factored; it vendors a whole bunch of stuff; it can't offer a usable UI for its own download cache; it doesn't offer an API (and explicitly counsels people to run it in a subprocess); it's designed around the assumption that it will get copied into every virtual environment (even though you don't have to do that any more - except that people don't know this, and it could break if your programs are doing that subprocess trick, unless they declared Pip as a dependency explicitly because who does that?)... the list goes on and on.

> Compared to other languages, Python leadership has also always given me the impression that they'd rather focus on the language itself and keep a looser grip on the ecosystem surrounding it, even if that comes with a more fragmented ecosystem (which has already been the status quo for most of Python's existence)

Python has been around much longer than programmers have been talking about "software ecosystems" in the contemporary sense. Pip isn't in the standard library, so that it can be developed and versioned separately. Making the experience smooth (to the extent that it is) requires special-case support in the form of the `ensurepip` bootstrap and `venv` special-casing Pip (for a while, Setuptools got the same treatment).

There is an extremely strong insistence in that community on making sure not to break older workflows - as much as people perceive them to get broken all the time anyway. All manner of improvements are held back because we still need to support projects that use `setup.py` exclusively and do so in a variety of legacy ways; and because Python still has to work if you don't set up a "project" at all; and shebangs have to work which has implications for how `sys.path` is initialized, etc. The last time there was a concerted attempt to break everything in order to rebuild it properly, users ended up getting 11 years to adapt to 3.x (9, if you account for 3.0 and 3.1 having serious issues and going back on a decision about how to migrate bytes and unicode literals) and in some cases are still grumbling almost 5 more years past the nominal deadline.

>In some languages where your median developer has more years of experience and they likely grew up with languages which are more "grassroots" the resistance would be higher. And while I would assume that the people at the PSF also fit that profile, in a high-influx language such as Python, the majority of the Python developers do not.

The developers who have been around the longest seem to have more of that background. The best-known names among the relative newcomers all seem to be people working at large tech companies.


As a pretty minor note, it's not that we _needed_ a two letter name. We considered hundreds of names and it really came down to what was available on PyPI (because it was table stakes that it could be installed from there). We are aware of libuv — it's used in Python too. However, it's a totally different domain and the name conflict isn't really a problem for our users.

Frankly, I'm just happy we didn't land on yet another packaging acronym or snake species :)


>However, it's a totally different domain and the name conflict isn't really a problem for our users.

This is a poor rationalization for violating norms in open source software. No package manager needs to have a name conflict with a popular package or any package for that matter.


> Someone needs to collect the data on UV's rise to popularity and do a sociological study, because from where I sit[0], UV just suddenly came out of nowhere, and despite being confusingly named[1] and backed by a for-profit company, has already captured the hearts and minds of Python developers.

This happens in the Node/React community literally one or more times a year, haha.


Because uv is excellent in some ways and good in all ways. Pretty much all other package managers suck in some way.

> confusingly named

The name needs to be ultra-short, there are only so many 2 and 3 letter words which are not already taken on linux/windows.

> Python has turned from a community language into a company-led one

That's quite a radical take, reducing a language to its package manager and linter.

> backed by a for-profit company

so is VS Code


> The name needs to be ultra-short, there are only so many 2 and 3 letter words which are not already taken on linux/windows.

So? 3 letters isn't some magic cutoff; cutting your command down to 3 letters or less doesn't make it suddenly 10x more effective for its users. 4 or 5 letters would be fine, too.

> That's quite a radical take, reducing a language to its package manager and linter.

Package manager and linter are the most fundamental tools to a programming language in a broader sense, second only to the reference compiler/implementation. The trend in the last couple of decades is, in fact, to make all three part of the core, under the purview of the same person or group that designs the language itself and its standard library.

I.e. if you want to "own" the future of a language, the second-best option after owning the most popular runtime is to own the most popular package manager. And I can see this happening already in discussion threads about Python I've been reading over the past few months. It's not just that people are excited about UV itself, they're treating Astral as a "thought leader" now.

Again, I'm not saying it's bad, or that Astral would be a bad steward. I'm just surprised to see a previously diverse/anarchic language community suddenly acting like it'd welcome such stewardship.

> so is VS Code

VS Code is its own thing; sure, it sucked oxygen out of the IDE space on the UI front, but it also competes against JetBrains in the same space; plus, thanks to LSP, it revitalized Vim and Emacs as viable competitors and enabled the creation of even more editors.

In contrast, UV just put a single company in the core position to influence the direction Python as a language takes, which is a new situation for Python - one of the last few popular languages that didn't have a corporate benefactor to start and push it into popularity.


> UV just put a single company in the core position to influence the direction Python as a language takes

I would argue VS Code, whose most popular extension is for the Python language, has more influence than Astral.


>Package manager and linter are the most fundamental tools to a programming language in a broader sense, second only to the reference compiler/implementation.... I'm just surprised to see a previously diverse/anarchic language community suddenly acting like it'd welcome such stewardship.

This seems to represent a schism in the community, although not a constantly explicit one that's pointed at as the cause of conflict (like with, say, the debate over the role of Rust in the Linux kernel).

The original community is so "diverse/anarchic" - and so old - that it causes friction with that first assertion. As far as I know, the idea of a "package manager" being particular to a programming language (as opposed to the operating system) is much newer than Python itself. I was able to use Python for many years before anyone ever suggested a linter to me, either. (Granted, the original lint program for C is from the 70s, but it wasn't seen as a necessity back then - not a compulsory part of a software engineering education, and certainly not treated as fundamental. Actually, I still don't use one.) And I'm still nowhere near "original", having really started using Python in the mid 2000s.

In that frame, it's hard to get people to agree on a project manager, because they don't even agree on the concept of a "project". People write code that they never plan to share with others. People write code that they only expect to share with real-life associates, and don't have the tools to understand why it "only works on my machine" - if you don't have the mental concept of dependencies, you aren't going to be able to specify them, nor properly use a tool to manage them. People just write a few `.py` files in a directory somewhere and expect them to play nice with each other and don't understand why they're shadowing the standard library.

On the other end of the spectrum, people need to interface Python with code in C, FORTRAN and all sorts of other dinosaurs, and they still don't yet have a way to specify their dependencies properly. (Before 2018 or so, there wasn't even a clearly delineated concept of build-time vs runtime dependencies, and people would get into weird chicken-and-egg situations because of needing to run `setup.py` in order to find out what packages are required to run `setup.py`. And now they still run into those problems, because they resist the new changes because of the "useless churn" of `pyproject.toml`.) Sometimes you can hack around that because wheel contents can be anything (including compiled C libraries), but Pip still isn't going to install a custom C compiler into `/usr/bin` (or, heaven forbid, set one up on Windows). And then you have people with giant monorepos trying to figure out where to put `pyproject.toml` files to describe separate "projects" within the same superstructure and build all the separate wheels they want.
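For reference, the eventual fix for that chicken-and-egg problem was the `[build-system]` table in `pyproject.toml`, which declares build-time dependencies up front, before any project code has to run. A minimal sketch (setuptools picked arbitrarily as the backend):

    $ cat > pyproject.toml <<'EOF'
    [build-system]
    requires = ["setuptools>=61"]            # build-time deps, known without running setup.py
    build-backend = "setuptools.build_meta"
    EOF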


Things I love about uv/astral:

- Putting .venv automatically into the project directory

- Installing dependencies with pyproject.toml

- Installing any Python version

- Fast pip installs

- uvx (like npx)

- Ruff formatter and linter, made by the same people

Python used to give me a headache; now I tend to reach for it more often, exclusively thanks to uv


I only don't like the .venv in the project directory. It tends to get copied into git or docker. And it makes backups of a project a lot bigger and slower, because of the many files.

.gitignore and .dockerignore exist.

Uv doesn't require .venv in the project directory, it's just the default. If you have a good reason why it doesn't work for your setup, you can place the venv dir anywhere and still use uv.
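For example (a minimal sketch; UV_PROJECT_ENVIRONMENT is the override knob for the project workflow, if I'm remembering the name right):

    $ uv venv /somewhere/else/my-env                         # create the venv at a custom path
    $ UV_PROJECT_ENVIRONMENT=/somewhere/else/my-env uv sync  # point the project workflow at it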

I don't know what's gone wrong with your Git setup (and I don't use Docker), but dotfiles (thus, a `.venv` folder) should be getting gitignored by default. (If all else fails, you can just add a `.gitignore` with just `*` to the venv.)
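That last trick is a one-liner:

    $ echo '*' > .venv/.gitignore    # Git then ignores the venv, including this file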

There's this weird disconnect I've noticed whenever people talk about Python project management tools in general, and especially about UV.

When they talk about why they like the tool, they say things like "it's not itself written in Python" (sometimes they even seem to think Rust specifically is important), and "it's an all-in-one integrated tool that takes care of all my needs".

But when they talk about what they like about the tool, it's stuff that doesn't actually depend on the prior facts.

People have lots of different ideas about where venvs should go. Choosing an entire tool suite because it agrees with you on this point seems rather sketchy to me. (Obviously you specifically have other reasons, but still.) Managing venvs isn't hard. I use a couple of shell functions in my .bash_aliases to do it the way I want:

    # Activate the local venv, which is specially named/located.
    alias activate-local="source .local/.venv/bin/activate"

    # Run an acceptance test in the local private folder.
    alias try-it="(cd .local/ && source run-acceptance-test)"

    # Pip (the pipx-installed copy) targeting the currently active environment.
    pipe() {
        if [ -z "${VIRTUAL_ENV+x}" ]
        then
            echo "No venv active; use pip instead"
        else
            ~/.local/bin/pip --python "$(command -v python)" "$@"
        fi
    }
And then I use a simple Python wrapper to create the venv and do an editable install into it. ("Installing dependencies with pyproject.toml" is already supported in Pip, assuming you mean the project's runtime dependencies: `pip install -e .`. Support for other dependency groups defined by new PEPs will hopefully come soon.)

There's no reason Ruff couldn't be a separate tool, mixed and matched with the tools used for other parts of the development process. That's already how people used Black.

Similarly for most of the other jobs. In my book, prefixing `uv tool` to a command line is noise; I don't see how it's better than having separate tools. What you really get from a tool suite is an opinion about which to use.

The issues with Pip don't result from it being written in Python. They result from internal design flaws - partly just generic technical debt, but largely the long-standing assumption that you'll just copy Pip to each environment. (There's code out there that tries to use `subprocess` to run Pip even after the wheel is nominally installed; this isn't assured to work! There's also code that tries to use Pip's API, even though that doesn't actually exist and will break without warning or documentation.) Current versions of Pip have much better support for installing into a different environment from where it's located (which is part of what makes Pipx possible), but awareness is low, and `--without-pip` is not the default for `venv` (that would be really disruptive).

Even slow package resolution is, as far as I can tell, mostly not because of Pip's Python code running slowly. It has more to do with suboptimal caching behaviours and the fact that the metadata standards are so poorly done, and perhaps just some algorithmic issues. But a ton of that work is IO-bound, especially if you have to use sdists for anything (getting accurate metadata tends to involve downloading the entire sdist and building the project, even if it turns out not to be the version that should be installed).


I concur.

Python was essentially dead for me outside of simple scripts, because of constant friction: pip, pyenv, poetry, pyproject.toml, rye - wtf? Now it's python3 -m pip install --user uv; uv init; uv add <package> and I'm good to go. Or the amazing uvx <package>. Finally someone understands how it is supposed to be done.
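Spelled out (project and package names just examples):

    $ python3 -m pip install --user uv
    $ uv init myproject && cd myproject
    $ uv add requests                    # recorded in pyproject.toml and the lock file
    $ uv run python -c 'import requests; print(requests.__version__)'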


Even better, use mise to install uv and all your other tooling

Hey a fellow mise user in the wild! I use it everywhere now for both personal and work projects. Cheers!

As opposed to:

`python3 -m pip install <package>` to install it?

And `python3 -m <package>` to run it?


> `python3 -m pip install <package>` to install it?

Won't update your pyproject.toml and lock file

> `python3 -m <package>` to run it?

Won't install the dependencies you need or set up an isolated environment for <package>. `uvx <package>` will create a venv, make sure you have all the dependencies that <package> needs, and run <package>.
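For example (ruff chosen arbitrarily; and if I recall correctly there's a --from flag for when the command name differs from the package name):

    $ uvx ruff check .                    # resolves ruff into a cached, isolated env and runs it
    $ uvx --from httpie http example.com  # command 'http' comes from package 'httpie'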


You forgot to initialize the venv and activate it. And now also do that on other devs' machines when the project is shared via git.

They are one-offs, much like your init command. You do not have to activate it: you can explicitly invoke the binary in the venv if you do not want this.
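That is, simply:

    $ .venv/bin/python script.py    # uses the venv, no activation needed
    $ .venv/bin/pip list            # ditto for any installed entry point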

I spawn dozens of one-offs, cost adds up. Some of those evolve to not be one-offs, and then synchronization with other devs becomes a problem.

You still use it only once per project in normal workflows. And if you need to go beyond that, you'd have the same problem with Rust, at a minimum; it can't read your mind and install into the correct venv. It only seems to be doing that because it made one and assumes it will use that one.

If you like the per-project venv workflow, and like having it in a specific place (whether that's relative to the project root, or a parallel named venv in `~/.local` somewhere, or anything else), you can set that up as part of your project initialization, too.

Going back to your first comment:

> because of constant friction: pip, pyenv, poetry, pyproject.toml, rye - wtf?

This complaint makes little sense because it mixes and matches things from different categories. `pyproject.toml` isn't even a tool, but a config file that several other tools will rely on - including uv. Poetry and Rye are specifically trying to do the same thing uv is - in fact, Rye is uv's predecessor. Pyenv is for making separate installations of Python - it works alongside venvs, not as an alternative. Pip is just an installer; it's not meant to be compared to full project workflow tools or even to just "package managers".

Expecting one tool to work for everyone is a mistake because of the sheer variety of use cases out there. At the very least, people who only want to install Python applications and libraries are very different from those who seek to distribute their own work. (People in the first group might not even write Python code!)


As wise Red from the Shawshank Redemption said:

https://tenor.com/4bSA.gif

> This complaint makes little sense because it mixes and matches things from different categories.

It makes all the sense, because I don’t want or have to give a shit about all of this crap.

> `pyproject.toml` isn't even a tool, but a config file

And how many of the current tools understand it? Can I make pip save dependencies in it out of the box? No? Then it stays on the bullshit list. Arguably it became much better these days, but I remember constant friction when pyproject.toml was first introduced.

> Poetry and Rye are specifically trying to do the same thing uv is - in fact, Rye is uv's predecessor. Pyenv is for making separate installations of Python - it works alongside venvs, not as an alternative. Pip is just an installer; it's not meant to be compared to full project workflow tools or even to just "package managers". Expecting one tool to work for everyone is a mistake because of the sheer variety of use cases out there. At the very least, people who only want to install Python applications and libraries are very different from those who seek to distribute their own work. (People in the first group might not even write Python code!)

The mistake is having thousands of disjoint tools that are stitched together with sticks and shit. I'm glad most modern programming communities got fed up with this crap and lean towards a new generation of tooling that can do everything.

I'm absolutely fed up with, and loathe, stitching together billions of half-working shite to get a semi-decent working environment.

Environment management, testing, formatting, libraries and building in one tool should be a minimum requirement.


> Environment management, testing, formatting, libraries and building in one tool should be a minimum requirement.

Poetry kind of tried that, and in doing so failed to allow for a lot of uncommon but vital corner cases. In fact the main reason I gave up on Poetry and switched to uv is that uv is a lot less opinionated: it happily handles my 'weird' setup and lets me plug in the tools that best serve me, rather than requiring me to use it for everything. I have no doubt that uv (or at least the uv 'family') will grow to handle everything sooner or later, but tackling one piece at a time and keeping the tools more or less independent also gives flexibility.


Completely agree. What sold uv to me is ease of use. I never got that with Poetry.

Never have I used "python -m" to run pip. Never in 20 years of using Python. I think you're imagining problems to gripe more about pip. Not that pip is perfect but we truly don't need anything else. It could turn out nice but I think having to learn Rust will turn off most people in the Python community when it comes to contributing.

Pip is a nightmare to use, compared to npm, and the only thing that caught your eye is “python3 -m pip” instead of pip3? Seriously?

I never had much trouble using pip. I only pointed out that gripe as evidence that the dude saying it probably isn't using pip or else he is extremely eccentric in a way that would cause issues eventually.

If I was the Python BDFL I would just merge this into pip and solve Python packaging forever. Only decisive action will fix package tooling fragmentation.

Disagree. That seems like it would just slow down progress.

Astral are winning mindshare because the tooling is so much better than the previous generations. That’s the action that’s required here - to be the compelling choice, rather than another choice that’s pretty much like the others and differs only a little.


It's worth remembering here the incident that led to Conda existing in the first place. There has never been a culture that would allow for this. And keep in mind that, aside from being a completely incompatible project, Pip is not part of the standard library.

People don't agree that "package tooling fragmentation" is an actual problem, anyway. I'm one of them. The real problem is that a few of the basic tools (mainly, Pip and Setuptools) don't work properly. They have fundamental design flaws and are riddled with technical debt. The underlying implemented standards have also lagged behind, and Uv can't actually fix that.

But once a solid base is established, I don't actually want to have a single tool making all the package management decisions for me. I want Unix-philosophy tools that handle specific individual development tasks, built around a proper, integrated user tool for installing applications and libraries (Pipx would be very close to such a tool, if it exposed its underlying Pip copy more elegantly and if Pip didn't suck). I don't want someone else's "project manager" to decide for me what build backend to use, or even to just bundle one. And I'm perfectly happy using `build` as a build frontend, `twine` as a PyPI uploader etc. Writing `uv upload` or whatever it actually uses, is not an improvement to me.


People also said the same about Poetry and Pipenv (and no doubt some other tools I'm forgetting).

Neither of which became the canonical way of installing packages mostly because the Python Software Foundation does not want to endorse anything in particular.

Have they figured out yet how this VC-backed company will make money? It's quite important imo; I don't want a watered-down experience 5 years from now.

Charlie is on record stating that their goal is to sell value added pieces to their tooling and keep the core tools free and open.

Pydantic had a nice model where the open source tool is fantastic, but they now sell a cloud based logging system around that. There will be some enterprise tooling around UV that they could sell while keeping the tool itself free

It sounds like they're not yet at the stage where they need to worry about it, though I've heard Charlie mention making an easy to host package registry as one offering.

I would pay if they could make our GitHub CI run 20-30% faster.

For me it is not great because of the new requirement for "rust" for basic Python. Does not make any sense.

Especially if the main selling point is "speed". I never encountered cases where the pip install equivalent was particularly unbearably long, except in very badly designed and broken projects - the kind that imports every possible dependency (npm style) while pinning things to a broken and incompatible set of dependencies...

Also, I very much like the concept of one tool for one usage. Having venv and pip is great; having one tool cluster mixing everything is not. Sure, in most simple use cases it will be great, but you have more chances of clusterfucks and complicated unresolvable issues. Even more so if developers are opinionated on a single way to do something.


> For me it is not great because of the new requirement for "rust" for basic Python. Does not make any sense.

The language of choice for such core functionality in the past was C. I'm not sure how much spelunking you did in the core interpreter, but the Astral rust code is much easier to understand and modify.


But C did not disappear; now you need C and Rust.

If Python were based on Rust, I guess I would not mind, but it is not. Also, C is supported everywhere! And personally I don't see why Rust would be easier to understand than C. C is really straightforward compared to Rust.

All of that being said, for something as basic and important as a package manager, I would expect it to be implemented in pure Python so it might easily be hacked, modified, or fixed without requiring any compilation... like most of pip and setuptools were.

Just wait for the moment that you can't update some libraries in your code because UV will not work without being updated, but an updated binary is not available or you need to fix/backport something, so you need to build it. But unfortunately Rust devs will not care about supporting older systems/OSes, so you are fucked. And you need to upgrade your whole system just to be able to update a simple application library.


Using `uv` for Python is the first time I have used Python and enjoyed the experience full stop.

`ruff` is also just straight up amazing.

Everything else should do us all a favor and deprecate itself tomorrow.


I’ve been missing a python tool like this that isn’t written in python. Conda was good in this regard - having a distribution without first installing python is a massive improvement to the «getting started» problems in python.

I've really never understood this objection. Users that need multiple actual versions of Python (as opposed to separate venvs) are usually more technically sophisticated; Python for Windows comes as an installer and Linux is generally dependent on it already (and a usable dev environment is only `python -m venv` away). But more to the point, writing an application in another language - or at all - isn't relevant to solving that problem. There's no reason you couldn't have a downloadable freestanding Pip package that comes with its own Python - just have the user unpack a zip archive or whatever, and have all the needed files in the right place. That just doesn't happen because it seems backwards.

There's been an initiative to standardize portable Python binaries that would facilitate that sort of approach (and make them installable from PyPI), called PyBI (https://peps.python.org/pep-0711/). Yes, you still have to do that installation. But at least in principle, you don't need an installer for it; you could download from the PyPI website and unzip the wheel.


You’re mostly correct. However in practice this has failed to materialize. The issue which I think confuses people is the conflation between the python environment of the packaging tool and the environment you should use for your code. Even anaconda python used to make this mistake, if I recall correctly.

>The issue which I think confuses people is the conflation between the python environment of the packaging tool and the environment you should use for your code.

This is primarily Pip's fault, because it spent many years championing a model where it would be copied into each new "environment you should use for your code".

Most of the benefits people attribute to Python tools because they're "written in a language other than Python", are really just benefits owing to the tool being isolated from the dev environment. And for about the last two years, this has been reasonable with Pip (possible but unreasonably complicated before; now it uses a weird hack, but the experience is pretty smooth). People just don't know about it, in part because they aren't forcefully confronted with it (because copying Pip is still the default for the `venv` standard library module).

It's noteworthy that the standard library does contain `venv` - but not more complex things; technically not even Pip. This rhymes strongly with the philosophy that designed the rest of the standard library. It's the same reason we don't have Requests in there, or, say, `python-dateutil`. (Actually, boto3 gets more than twice as many downloads as Requests, but it's hard to understand why. Working with AWS is surely far more niche than making HTTP connections.)


True. Although if a tool is written in Python, it can easily be installed with pip install. I've seen so many people get confused by doing pip install poetry in way too many places, resulting in a very confusing work environment with different Poetry versions and setups depending on which directory you're in / which poetry shell you've loaded. If you cannot pip install the tool, this particular madness goes away.

Requests: there's urllib in the stdlib. It's Requests, just worse.

Working with aws means you’re automating things, ci/cd etc. Many people use python they’ve installed locally for months or years without upgrading.


>I’ve seen so many people get confused by doing pip install poetry in way too many places resulting in a very confusing work environment by having different poetry version and setup depending on which directory you’re in / poetry shell you’ve loaded.

This is not actually an error that would have occurred to me to make - since one finds out about Poetry from its webpage, which recommends all sorts of custom installation procedures instead.

That's actually a big part of why I gave up on Poetry: the officially blessed install procedure changed a few times, and it became difficult to uninstall because things installed the old way would expect to be uninstalled the old way as well, but that way was already deprecated. Well, I don't remember the exact details, but it was generally that sort of experience.

(Actually, you gave me some insight for something I should do in the installer I'm designing....)

>Working with aws means you’re automating things, ci/cd etc. Many people use python they’ve installed locally for months or years without upgrading.

Yeah. The download count for boto3 is really absurd, though - something like twice a year per person on Earth (https://pypistats.org/top). I guess that has a fair bit to do with them apparently putting out a patch release every business day.


Welcome to the bootstrapping problem. The original take is "how do I build a C compiler without a C compiler", and some amazingly gifted people from the GNU/Guix team managed to take it to its logical conclusion - and wrote a 357 byte piece of annotated machine code, from which an entire distro can be bootstrapped.

On the bright side, uv is written in Rust, which makes distributing prebuilt releases practical and helps end-users get started. OTOH Linux+GNU+GCC pale in comparison to Rust's own bootstrapping problem - each Rust compiler is written in some previous release/dialect of Rust, all the way back to the pre-1.0 days, when it was written in an obscure dialect of OCaml. There are efforts underway to make Rust bootstrappable, but as of right now the Python ecosystem might be painting itself into a corner by slowly making Rust a hard dependency.

https://bootstrappable.org


>On the bright side, uv is written in Rust, which makes distributing prebuilt releases practical and helps end-users get started. OTOH Linux+GNU+GCC pale in comparison to Rust's own bootstrapping problem - each Rust compiler is written in some previous release/dialect of Rust, all the way back to the pre-1.0 days, when it was written in an obscure dialect of OCaml. There are efforts underway to make Rust bootstrappable, but as of right now the Python ecosystem might be painting itself into a corner by slowly making Rust a hard dependency.

Thankfully the Python ecosystem probably isn't going to adopt this crazy Rust dependency. The Rust nuts are unbelievably and laughably persistent but this is an overcomplicated solution that nobody asked for. You could probably say the same about Rust in the Linux kernel so who knows...


Rust-the-language is fine; Rust-the-blessed-toolchain is the problem. In practical terms it delivers enormous value; it's about the problems that lie down the road.

Once Rust can be easily bootstrapped? Like you can build the official Go toolchain from 1.4?

I'd argue Rust in Linux is less of a problem than Linux itself growing out of bounds. Compare seccomp with OpenBSD's pledge/unveil. But there's a reason why Linux is winning by a margin of 100,000x: it's simply more practical, even if less simple.


I don't think Rust delivers this tremendous value. Bootstrapping it doesn't do much unless everything gets migrated forward to the bootstrappable version. It's got all the worst aspects of a systems language combined with all the pain points of npm. I don't want to be required to be online or to download hundreds of unvetted bullshit packages to make a simple program, and that's the way it's made.

Not quite sure what kind of projects people are talking about here, where Poetry takes too long. Must be pretty massive sets of dependencies. Maybe huuuge monoliths? What I usually do is build a venv in a docker container of a service and then that gets deployed. I don't see a problem if this took a minute or two, but so far it never did take that long.

I'd say there are two things that make me prefer rye/uv.

The speed is one — sometimes I'm in the middle of a project and realise I need to pull in another dependency. Keeping the disruption to flow at a minimum is helpful in those cases. So it's not about deployed performance, it's about mental friction.

The other is I've found Poetry liable to breaking in obscure ways. I don't fully control everything on my work machine, which I suspect is the issue. So far, rye and uv have been smooth, but that might be down to me not having used them long enough.


If you do anything ML or neural networks, suddenly you have hundreds of packages totaling over 1 GB.

Yes, huge monoliths

The answer to the question about VC funding is weirdly obvious at any sort of distance away from the emotive things.

Communities should probably cautiously welcome VC funding for well-integrated community members solving hard, core problems with correctly-licensed contributions back to the community, as long as that effort doesn't divert community engineering focus away from long-term community goals and towards the startup's goals in a non-beneficial way.

They should be more cynical about things that don't happen this way, but then they can keep doing their own thing anyway.

Ondsel found the money for two major core technical problems to be addressed in FreeCAD, and seems to have had a pretty positive impact on release-oriented thinking in general. They then folded. It is a loss, but essentially none of the work was lost.

Frankly I am not sure if the "written in Rust" element here is culturally beneficial in the long term or not; I don't know enough about Python or really anything about Rust.


Please give me commands to create a virtual environment built with libraries listed in the requirements.txt of a GitHub repo.

Ubuntu here.


For comparison, with bare-bones tools (assuming `pip` is a globally-visible Pip version 22.3[0] or better):

    $ python -m venv --without-pip .venv
    $ pip --python .venv/bin/python install -r https://raw.githubusercontent.com/astral-sh/uv/refs/heads/main/docs/requirements.txt
This venv creation is instantaneous, because it doesn't install Pip in the new venv. On my 10-year-old machine, with cached packages (it still has to unzip everything and move it around), the installation takes about 8.5 seconds.

I have some wrapper scripts defined locally to smooth out this process; I can do `make-project-venv` (which puts the venv in `.local/.venv` and also attempts to install the current project with `pip --python ... install -e .`) and then `activate-local` and then `pipe install ...`.

[0]: https://pip.pypa.io/en/stable/news/#v22-3


e.g.

  $ uv venv
  Using CPython 3.12.6
  Creating virtual environment at: .venv
  Activate with: source .venv/bin/activate
  $ uv pip install -r https://raw.githubusercontent.com/astral-sh/uv/refs/heads/main/docs/requirements.txt
  Resolved 43 packages in 283ms
  Prepared 4 packages in 64ms
  Installed 43 packages in 445ms
  ...

Happy user so far in my early days ... appreciate the speed for sure

So, why should I switch to this and what is going to stop some other tool becoming the "python package/env management darling" in 2025?

The way I feel is that if another tool comes along that I like better, I’ll switch again. Or if I prefer the stability for some reason, I won’t.

Either way, I don’t see any need to stop people from building better things.


I switched to uv recently from poetry because uv manages the python version too.

Have you tried ruff for linting? If you enjoy using it, you should try uv. It's the same feeling.

I have noticed a spike in attention to UV since Anthropic announced the Model Context Protocol (MCP). After using it for MCP development, I am moving from pyenv to uv! https://www.anthropic.com/news/model-context-protocol

Super fast installs - when it works. I had problems on Ubuntu under WSL (Windows Subsystem for Linux) with a clang dependency, which I could never solve. But on Linux boxes, it's so much better than pip/conda/etc.

I also like that i can use it with conda environments. Just create conda env, then install with 'uv pip install ....'
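Roughly (package names just for illustration; uv installs into the active conda env):

    $ conda create -n myenv python=3.12
    $ conda activate myenv
    $ uv pip install pandas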


> I also like that i can use it with conda environments. Just create conda env, then install with 'uv pip install ....'

Huh cool, I'll have to try that. Conda is great for getting the dependencies for non python code (geos/gdal/cuda etc) but uv is super fast and straightforward (makes building packages much, much easier).


same company has a different tool as well called pixi which aims for much nicer integration with the conda ecosystem. Also uses uv under the hood so speed is comparable.

https://github.com/prefix-dev/pixi


Have you tried pixi? I haven't, and I'm hoping I can continue to avoid conda, but it's a uv-using conda replacement: https://pixi.sh/latest/

So now you're using not one (conda); not two (conda and uv), but three (conda, uv and pip) different package managers to manage your Python environment?

"uv pip" is a ground up implementation of python pip and doesn't use the python pip command in any way. So still only two package managers.

And the point is that you always have to use two package managers when developing python. One that installs python and various non-python libraries and dev tools, and one that installs python libraries. Conda in this scenario is replacing apt as much as anything else.


I have only ever really used venv. Poetry was fine but didn't give me any additional benefits that I could see. What does this offer? And more broadly, why do people consider pip to be a problem? I have literally never had any issues with it in any of my projects.

Personally, I use it for everything right now. It's faster to do `uv init`, then add your dependencies with `uv add`, and then just `uv run <whatever>`. You can argue that poetry does the same, but `uv` also has a pipx alternative, which I find myself using more than the package manager that my distro offers, since I never had compatibility issues with packages.

The main advantage these tools (poetry, pipenv, uv, ...) offer is that they let you create lock files, which make your venv reproducible. Without that, you are kind of living in the wild west, where tomorrow your project can break. These tools help with projects that are supposed to do more than "runs on my machine".

Maybe I'm not fully grasping lock files then, but why isn't it sufficient to just pin the versions in the requirements.txt? Obviously it doesn't handle the python version itself but I just use pyenv for that. So my stance is just pyenv + venv seems to solve those problems. But then I see people singing the praises of these newer tools and I wonder what I am not getting

I was in the same position and had to learn the answers myself. It still doesn't really matter very much for me - I do more library than application development, and my applications would probably generally be fine with a wide range of dependency versions - if they even have dependencies. In short, different people have very different use cases, and you might simply not have the use cases that drive so many others to the tools in question. But it's still useful to understand the theory.

>why isn't it sufficient to just pin the versions in the requirements.txt?

Because your dependencies have dependencies.

You can, in fact, pin those as well explicitly, and as long as what you pin is a valid solution, Pip will (to my understanding; I haven't done a thorough, explicit test) happily grab exactly what you asked for. And as long as the version number is enough information, that will work for your application.

But some people also want to ensure they use specific exact builds of a package, and verify their hashes. Some of them might be using private indexes (or even mixing and matching with PyPI) and need to worry about supply chain attacks. And at any rate they don't want to account for all the transitive dependencies manually.
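That combination is already expressible today, e.g. with pip-tools (file names arbitrary):

    $ pip-compile --generate-hashes -o requirements.txt requirements.in
    $ pip install --require-hashes -r requirements.txt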

In principle, a lock file is a record of such a solution, including all the transitive dependencies, plus file hashes, etc. There isn't perfect agreement about what needs to be in them, which is why discussion of a standard lock file format has been tried a few times and is still ongoing (the current effort is https://peps.python.org/pep-0751/ ; see the linked threads for just some of the related discussion of the concept - there is much more, in the abstract).


Look into concrete vs abstract dependencies:

https://martin-thoma.com/python-requirements/


Just to give a concrete example, it helped me with my projects involving langchain and several langchain extensions. I have a clear and consistent source of version record for each dependency.

> create lock files

That's where pip-tools comes in.


And you can use uv as a full replacement for that, with fewer bugs, faster performance and generally a more pleasant experience: pip-compile -> uv pip compile and pip-sync -> uv pip sync.
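Concretely:

    $ uv pip compile requirements.in -o requirements.txt   # drop-in for pip-compile
    $ uv pip sync requirements.txt                          # drop-in for pip-sync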

(Though I think the high level interface is the better thing to use)


No disagreement here, just pointing out that it's also possible to do with existing tools.

"existing tools" -- What do you mean by that? pip-tools [1] seem to be just another PyPI package. What makes them more available than any of the other tools? The other tools "exist" as well.

[1]: https://pypi.org/project/pip-tools/


My point was that there is a way to have lock files and keep using pip (using pip-tools), without fully switching to poetry or uv or anything else.

And uv's first release was just a drop-in replacement for pip-tools, just faster and less buggy.

I know. Maybe you should read the comment I replied to so you understand the context:

https://news.ycombinator.com/item?id=42415924

You will notice that it is about the ability to create lock files and nothing else.

And btw I've been using pip-tools for years and it's worked great for me.


Cohesion and simplicity of use.

uv init; uv add

No more pip freeze, source .venv/bin/activate, deactivate bullcrap.


You don't have to 'activate' anything if you don't want to. The bin/ directory inside your venv contains the binaries and scripted entrypoints for the packages installed in your virtualenv.

Yeah, and I don't "have" to use Make and build systems either, but it makes my life easier and bullshit free.

Why is it better if every command you use starts with `uv`? Why is it "bullcrap" the other way?

It's not the "uv" part that is important. It's the focus on simplifying things you do thousands of times a week.

How does putting "uv" in front of the commands that you use simplify them?

uv init; uv add requests, and you automatically get an environment that can be easily shared between team members with predictable locking, with no "source .venv/bin/activate" bullshit.

Activating the venv only changes some environment variables. You can, as explained before, use the environment's Python executable directly instead. Activation does allow other tools to use the venv without having to know anything about it. The point is that you aren't then putting `uv run` in front of every command, because you don't need to have an integrator program that's aware of the venv being there.
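Concretely, activating is just:

    $ source .venv/bin/activate    # prepends .venv/bin to PATH, sets $VIRTUAL_ENV
    $ which python                 # now resolves to .venv/bin/python
    $ deactivate                   # puts everything back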

If you had a bad experience with a different locking tool, sorry to hear it, but that has absolutely nothing to do with the venv itself nor its activation script.


> What does this offer?

Referring to the entire category in general: mainly, it offers to keep track of what you've installed, and help you set up reproducible scripts to install the same set of dependencies. Depending on the specific tool, additional functionality can vary widely. (Which is part of why there's no standard: there's no agreement on what the additional functionality should be.)

> why do people consider pip to be a problem?

Many problems with Pip are really problems with the underlying packaging standards. But Pip introduces a lot of its own problems as well:

* For years, everyone was expected to use a workflow whereby Pip is copied into each new venv (this is surprisingly slow - over 3 seconds on my machine); users have accidentally invented a million different ways (mainly platform-specific) for `pip` to refer to a different environment than `python` does, causing confusion. (For a while, Setuptools was also copied in by default and now it isn't any more, causing more confusion.) You don't need to do this - since 22.3, Pip has improved support for installing cross-environment, and IMX it Just Works - but people don't seem to know about it.

* Pip builds projects from sdists (thereby potentially running arbitrary code, before the user has had a chance to inspect anything - which is why this is much worse than the fact that the library itself is arbitrary code that you'll import and use later) with very little provocation. It even does this when you explicitly ask it just to download a package without installing it; and it does so in order to verify metadata (i.e., to check that building the sdist would result in an installable wheel with the right name and version). There's wide consensus that it doesn't really need to do this, but the internals aren't designed to make it an easy fix. I have an entire blog post in my planned pipeline about just this issue.

* Pip's algorithm for resolving package dependencies is thorough at the cost of speed. Depending on the kind of packages you use, it will often download multiple versions of a package as sdists, build them, check the resulting metadata, discover that this version isn't usable (either it doesn't satisfy something else's requirement, or its own requirements are incompatible) and try again. (This is partly due to how Python's import system, itself, works; you can't properly support multiple versions of the same library in the same environment, because the `import` syntax doesn't give you a clean way to specify which one you want.)

* Because of how the metadata works, you can't retroactively patch up your metadata for old published versions because e.g. you found out that you've been using something that's deprecated in the new Python release. This has especially bad interactions with the previous point in some cases and explaining it is beyond the scope of a post here; see for example https://iscinumpy.dev/post/bound-version-constraints/ for a proper explanation.

But the main reason why you'd use a package manager rather than directly working with Pip, is that Pip is only installing the packages, not, well, managing them. Pip has some record of which packages depend on which others, but it won't "garbage-collect" for you - if something was installed indirectly as a dependency, and then everything that depends on it is removed, the dependency is still there. Further, trying to upgrade stuff could, to my understanding, cause breakages if the dependency situation is complex enough. And above all of that, you're on your own for remembering why you installed any given thing into the current environment, or figuring out whether it's still needed. Which is important if you want to distribute your code, without expecting your users to recreate your entire environment (which might contain irrelevant things).
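(The closest built-in introspection is something like:

    $ pip list --not-required    # packages nothing else installed depends on

which at least shows you the "roots" - but it still won't clean anything up for you.)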


Is there any way to make wheels with pinned dependencies with this?

That's the main thing keeping me with Poetry.

Other than that I'm super excited.


Not yet! We're thinking about it but it's hard to do in a spec-compliant way.

What actually would be the hiccup there? Can't you just edit pyproject.toml and add the pinned transitive dependencies to [project.dependencies], and let a build backend take it from there? That's the extent of the pinning that the wheel format supports, anyway, as far as I understand.

Kind of hacktastic, but I guess you could make an external tool that backs up pyproject.toml, sets it to have whatever is installed in the .venv as pinned dependencies, builds the wheel, then restores the old file.
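As a rough sketch (the splice step is the hand-waved part):

    $ cp pyproject.toml pyproject.toml.bak    # back up the real metadata
    $ uv pip freeze                           # exact pins to splice into [project] dependencies
    $ python -m build --wheel                 # build with the pinned metadata in place
    $ mv pyproject.toml.bak pyproject.toml    # restore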

Seems like the kind of thing that might have some kind of edge case somewhere I'm not thinking of at the moment though.


>Seems like the kind of thing that might have some kind of edge case somewhere I'm not thinking of at the moment though.

It wouldn't support hashes, or anything else that you can't do with ordinary dependency specifiers in the project.dependencies list (spec: https://peps.python.org/pep-0508/).


Does this still work if other devs on the project still use pyenv or other tools?

If they are using standards-compliant tools like pyenv, venv and pip, then uv will play nicely with that. In fact I'm working on a project like that right now. The 'official' guidelines say to use pip and venv, but I'm using uv and no one else notices.

If on the other hand they're using a more opinionated tool like poetry, then it won't work as well, but that has more to do with poetry doing its own thing and not playing nice with other tools.


Yep. You can gradually start to use uv's feature set. Besides, it uses the standards of Python project management, so you don't have vendor lock-in here.

Honestly? It's excellent, but it's Python-only. I dream of a day when NixOS/Guix systems will be so common that anybody will use their approach to develop in any language (and, as side dreams, that the future Nix is something more digestible and that Guix System cares more about the desktop)...

> poetry: 0.99 seconds

Good enough for me and sans VC risk.


uv is cool, no doubt—super fast, love the idea of managing Python versions + deps all in one place. But VC funding... Rust... unofficial Python builds... lots of red flags here if you think about the long-term impact on the ecosystem.

VC-backed tools always make me a little nervous. Sure, Astral says the tools will stay free, and I get that they're targeting enterprise for revenue (private package registries, etc.). But how many times have we heard this before? “Don’t be evil,” right? We’ve seen companies pivot to paywalls or watered-down open-source tools after funding dries up. What guarantees do we have here that uv won't eventually fall into that same trap?

Yeah, Rust is amazing (ruff is a beast), but it’s still a small subset of devs compared to Python. If Astral folds or just loses interest, how easy is this really going to be for the Python community to maintain? Forkable? Sure. But forkable ≠ maintainable when the dev pool is tiny.

Also, what's with using unofficial Python builds by default? Even on platforms where official builds exist (macOS/Windows)? I get that these standalone builds solve certain problems (like bootstrapping), but it feels like a risky shortcut. If those unofficial builds go unsupported or change directions, where does that leave "uv"? Why not at least give users the option to rely on official binaries?

And fragmentation... Python tooling is already a mess. Pip, poetry, conda, pyenv, rye, flit, etc.—do we really need another tool in the mix? Feels like every new tool just promises to "fix packaging forever" but ends up adding another layer of complexity. Why not contribute these improvements to existing tools like pip? Sure, innovation is great, but at what point does it become too much choice and not enough cohesion?

uv looks great, and I love the speed + features. But the ecosystem-level risks here... hard to ignore. Would love to see some stronger guarantees around open-source sustainability, community governance, and alignment with Python standards before jumping in fully. Otherwise, it’s just one more shiny tool that could end up abandoned or locked behind a paywall in 5 years.


> Also, what's with using unofficial Python builds by default? Even on platforms where official builds exist (macOS/Windows)? I get that these standalone builds solve certain problems (like bootstrapping), but it feels like a risky shortcut. If those unofficial builds go unsupported or change directions, where does that leave "uv"? Why not at least give users the option to rely on official binaries?

The official macOS binaries wouldn't work for uv because they're not portable. They're also taking ownership of the indygreg project, which is a great thing for everyone, since portable Pythons are useful in a lot of contexts. I'm sure they'll start sending some of the portability changes upstream too once they get there.

I don't see how this is criticism.


> unofficial Python builds

You don't need to use those, works perfectly fine with whatever Python is on the machine.
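For instance (a sketch; `uv venv --python` accepts either a version or a path to an existing interpreter):

    # use the interpreter already on the machine instead of a managed download
    uv venv --python /usr/bin/python3

    # or forbid downloads entirely and only ever use system interpreters
    uv venv --python 3.12 --python-preference only-system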


I could see Astral's unofficial builds becoming official at some point after they've matured a bit more.

Since the topic of VC funding keeps coming up, I will quote myself from the blog post I did a few months ago [1]:

> there is an elephant in the room which is that Astral is a VC funded company. What does that mean for the future of these tools? Here is my take on this: for the community having someone pour money into it can create some challenges. For the PSF and the core Python project this is something that should be considered. However having seen the code and what uv is doing, even in the worst possible future this is a very forkable and maintainable thing. I believe that even in case Astral shuts down or were to do something incredibly dodgy licensing wise, the community would be better off than before uv existed.

[1]: https://lucumr.pocoo.org/2024/8/21/harvest-season/


I think there might be a reporting bias here? You only hear the negative news, i.e. when a VC-backed company does something bad. If they do something good, that's not news.

But uv is written in Rust, which means forking it would take considerably more effort than forking Python tools.

A lot of modern tooling in Python is Rust-based, such as pydantic; the skills to maintain this are already available.

ruff is also from Astral.

Yeah my bad, good point, didn't realise! Two great tools from one company though :)

It depends much more on the quality of the code and how it’s structured. I feel pretty confident that you would find uv easier to maintain than either pip or poetry, despite the latter two being Python code.

How so?

I think they must mean that the number of Rust developers would be small compared to the number of Python developers. Perhaps? Not sure myself.

I think people need to appreciate that the number of developers interested in actually helping with free software maintenance is a subset of the number of developers. And when it comes to Python in particular, that subset is proportionally very small. That's just my anecdotal experience of similar projects in both ecosystems.

Numbers aren't everything.

Python has been around for a long time and there were some attempts at creating a modern package manager for it. If it were feasible to create uv in Python, it would have probably happened by now.


There's a barrier of entry for the army of people who use Python but are not (and often have no interest in being) engineers/developers/programmers.

This is the modern equivalent of building key components in Visual C++ for use by Visual Basic people; it kept the unwashed masses away from things they could break.

I think that this is probably the only way you're ever going to fix the horrific experience of using Python in anger. Which is a good thing, Python's a great little language which is let down by being built on sand.


Because it’s a tool for the Python ecosystem.

UV creator's response on the concerns regarding VC money:

> I don't want to charge people money to use our tools, and I don't want to create an incentive structure whereby our open source offerings are competing with any commercial offerings (which is what you see with a lot of hosted-open-source-SaaS business models).

> What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today.

> An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry. A lot of big companies use uv. We spend time talking to them. They all spend money on private package registries, and have issues with them. We could build a private registry that integrates well with uv, and sell it to those companies. [...]

> But the core of what I want to do is this: build great tools, hopefully people like them, hopefully they grow, hopefully companies adopt them; then sell software to those companies that represents the natural next thing they need when building with Python. Hopefully we can build something better than the alternatives by playing well with our OSS, and hopefully we are the natural choice if they're already using our OSS.

https://hachyderm.io/@charliermarsh/113103564055291456


Prior art.

> Facebook's mission is to give people the power to build community and bring the world closer together.

> Our informal corporate motto is "Don't be evil." We Googlers generally relate those words to the way we serve our users – as well we should. But being "a different kind of company" means more than the products we make and the business we're building; it means making sure that our core values inform our conduct in all aspects of our lives as Google employees.

> OpenAI.


'I don't want to' is very different from 'I will never'.

A VC-funded package and project manager for Python projects, written in Rust, and without involvement of the Python Packaging Authority (PyPA). If Python is too slow or otherwise not good enough to write a good Python package manager in, why use Python altogether?

Using Rust enables uv to install Python interpreters. That feature uses unofficial third-party binary builds. There are no official binary builds for Linux, but there are such builds for Windows and macOS, yet they use the unofficial third-party builds on those platforms as well. Still, I would consider package and project management to be a separate feature from Python management, and I’d leave it to a different tool (which may be the system package manager on some platforms).
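For reference, the interpreter-management feature in question looks like this (a sketch of the documented commands; the downloads are the unofficial standalone builds mentioned above):

    uv python install 3.12   # fetches a standalone build
    uv python list           # shows managed and system interpreters side by side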


> without involvement of the Python Packaging Authority (PyPA)

The PyPA does not really exist. The closest thing to an actual "authority" is the packaging forum on discuss.python.org, and the Astral folks are active there; see for instance the recent threads on the lockfile standard or the dynamic metadata issue.

> If Python is too slow or otherwise not good enough to write a good Python package manager in, why use Python altogether?

People use Python, and Astral exists to solve the problems of Python people. I'm not sure why the language they use to write the tool is relevant. numpy, scipy, tensorflow and many other cornerstones of Python are written in C, C++ and other non-Python languages.

> but there are such builds for Windows and macOS

As the author of Rye, where I tried very much the same thing, I can attest that the macOS builds are useless for the uv use case. It's great that Astral is now maintaining the standalone builds, which have become a cornerstone for many Python users over the last three years. We have based our entire development environments on those, even though we have not adopted either rye or uv at our company.


I would argue Astral's products are in a substantively different category (developer tools) than numpy or scipy (numerical libraries), and the constraints on them are significantly different (e.g. like crypto code, you shouldn't roll your own numerical code, and instead use trustworthy libraries, which is what numpy/scipy wrap). Everything from their choice of language to which libraries to depend on (and which ones to make optional) to their installation system has been about being conservative and ensuring that the bootstrap process is as painless as possible for Python-only devs.

Astral to me seem to be targeting a small subset of the community, and their ignorance of the needs outside of it makes me very wary of adopting their tools.


I primarily use Rust at my job (and love it) but I still write most of my scripts, benchmarking code etc in Python.

Python is a great scripting language and a great language to wrap other stuff. I have other projects in Python that wrap many packages, some in C++/Rust and working on Python is a lot more fun than most of these languages.

When I'm writing this sort of code, I don't have to `.into()`, derive Send/Sync/Debug/whatever, I don't have to `.clone()` my strings when I pass them to multiple places that accept `String`, and I don't have to do oh so many things that make my production/critical code better but that I care about very little when I'm writing scripts.

I'm much more effective in my 80:20 in Python than in Rust, though admittedly I'm still on Poetry, since uv is releasing very quickly and my things work "well enough".

(In package-management land, I still think most ecosystems haven't caught up to JavaScript and Rust. Rust's and Python's module systems have some weirdness (mod.rs/__init__.py is unnecessary), but Cargo _is_ overall really nice.)


I couldn't care less in which language it is written, as long as it solves my problems with Python. And uv absolutely does.

> Using Rust enables uv to install Python interpreters.

A Python workflow tool written in Python could equally well install every Python interpreter except for the one it itself is using. Those tools generally work fine with any recent installation, and getting one of those is normally not difficult. On Windows the official builds have standalone installers, and Linux normally comes with Python - an isolated development environment (that won't ever mess with the OS) is at most `apt install python3-pip` plus a `venv` invocation away.
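Concretely, the route described above (a sketch; on Debian/Ubuntu the python3-venv package is also needed for `venv` to work, and ~/.venvs/myproject is just a placeholder path):

    sudo apt install python3-pip python3-venv
    python3 -m venv ~/.venvs/myproject            # isolated env, never touches the OS
    ~/.venvs/myproject/bin/pip install requests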

But, equally, I'm not going to reject a Python tool just because it isn't written in Python. (Of course, if it requires installing a massive runtime that I don't already have, or if the compiled binary is larger than an entire Python source tarball - as I found to be the case with uv - then yes, that does count against it.)

> Still, I would consider package and project management to be a separate feature from Python management, and I’d leave it to a different tool

I do as well. I don't want an integrated "project management" tool, either. (I quite like the Unix philosophy for dev tasks.) But I do want an integrated Python application and library installer with a smooth experience, and Pipx isn't quite there yet.
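For comparison, the two application installers in question (a sketch; both put each CLI tool into its own isolated environment):

    pipx install black       # pipx: one venv per application
    uv tool install black    # uv's equivalent
    uvx black --version      # or run a tool ad hoc without installing it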


> If Python is too slow or otherwise not good enough to write a good Python package manager in, why use Python altogether?

To write things that are not package managers? Are you suggesting not being the right tool for this job implies not being the right tool for any job?


> If Python is too slow or otherwise not good enough to write a good Python package manager in, why use Python altogether?

Not every language needs to be good at everything. Python's main advantage is inertia, not the things required for this kind of tooling.


The PyPA has the involvement of Microsoft. It’s not a big deal, but unlike this, it has the downside of GitHub being emphasized every time package publishing is brought up. I don’t see any negatives from having the choice of uv, and it’s a lot smaller of an entity than Microsoft.

> Using Rust enables uv to install Python interpreters.

It could be built with Python and ship with a standalone Python interpreter to bootstrap itself, though.

More likely, it’s easier to write fast and correct Rust code.


Pixi is just better.

Pixi uses the uv solver too, but is more focused on integrating with the conda packaging world.
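For context, a sketch of that combination (assuming pixi's documented commands; the --pypi flag routes the dependency through the uv resolver):

    pixi init myproject && cd myproject
    pixi add python=3.12 numpy    # conda-forge packages
    pixi add --pypi flask         # PyPI dependency, solved via uv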


