
I really hope Astral can monetize without a highly destructive rug pull, because they are building great tools and solving real problems.




"pyx" is their first commercial offering: https://astral.sh/pyx

I agree though. Hope this is successful and they keep building awesome open-source tools.


We're paying for pyx. Wouldn't have if we didn't enjoy uv and ruff.

It's definitely a narrow path for them to tread. Feels like the best case is something like Hashicorp: great until the founders don't want to do it anymore.


> Feels like the best case is something like Hashicorp

Wow, that's probably my go-to example of things going south, not a "best case scenario". They sold to IBM, a famous graveyard for software, and on the way there changed from FOSS licenses to their own proprietary ones for software the community had started to rely on.


Why does the “y” look so wrong in the special font?

Yeah their work thus far has been an incredible public service to the Python community.

Feels like they’re headed in the direction of bun.

In the zero-revenue or acquisition direction.

Thankfully all these LLM labs are heavily invested in Python, so that seems like the likely route IMO.

Just need to book a nice long walk with one of the CEOs.

My issue with them is that they claim their tools replace existing tools, but they don't actually replicate all of the functionality. So if you want the full functionality of the existing tools, you still have to fall back on them instead of using Astral's "replacements". It's one step forward and one step back. For me personally, the speed of the tooling matters less than what it can check, which is crucial for a language like Python that is so easy to get wrong.

If there are specific incompatibilities or rough edges you're running into, we're always interested in hearing about them. We try pretty hard to provide a pip compatibility layer[1], but Python packaging is non-trivial and has a lot of layers and caveats.

[1]: https://docs.astral.sh/uv/pip/
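
For anyone who hasn't tried it, that layer mirrors the familiar venv/pip command surface; an illustrative (not exhaustive) sketch:

    uv venv                                              # create a virtual environment (.venv by default)
    uv pip install -r requirements.txt                   # install into it
    uv pip compile requirements.in -o requirements.txt   # pip-tools-style locking
    uv pip list                                          # inspect what's installed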


Is there any plan for a non-“compatibility layer” way to do anything manual or nontrivial? uv sync and uv run are sort of fine for developing a distribution/package, but they’re not exactly replacements for anything else one might want to do with the pip and venv commands.

As a very basic example I ran into last week, Python tooling, even the nice Astral tooling, seems to be almost completely lacking any good detection of what source changes need to trigger what rebuild steps. Unless I’ve missed something, if I make a change to a source tree that uv sync doesn’t notice, I’m stuck with uv pip install -e ., which is a wee bit disappointing and feels a bit gross. I suppose I could try to put something correct into cache-keys, but this is fundamentally wrong. The list of files in my source tree that need to trigger a refresh is something that my build system determines when it builds. Maybe there should be a way to either plumb that into uv’s cache or to tell uv that at least “uv sync” should run the designated command to (incrementally) rebuild my source tree?

(Not that I can blame uv for failing to magically exfiltrate metadata from the black box that is hatchling plus its plugins.)
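
To make the cache-keys workaround concrete, it would look roughly like the sketch below; the globs are guesses at my layout, and they're exactly the information I think the build backend should be supplying rather than me:

    # sketch only: assumes uv's cache-keys setting in pyproject.toml,
    # with globs guessed for my layout:
    #
    #   [tool.uv]
    #   cache-keys = [{ file = "pyproject.toml" }, { file = "src/**/*.py" }]
    #
    uv sync    # should then re-sync whenever a matched file changes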


> Is there any plan for a non-“compatibility layer” way to do anything manual or nontrivial?

It's really helpful to have examples for this, like the one you provide below (which I'll respond to!). I've been a maintainer and contributor to the PyPA standard tooling for years, and once uv "clicked" for me I didn't find myself having to leave the imperative layer (of uv add/sync/etc) at all.

> As a very basic example I ran into last week, Python tooling, even the nice Astral tooling, seems to be almost completely lacking any good detection of what source changes need to trigger what rebuild steps.

Could you say more about your setup here? By "rebuild steps" I'm inferring you mean an editable install (versus a sdist/bdist build) -- in general `uv sync` should work in that scenario, including for non-trivial things where e.g. an extension build has to be re-run. In other words, if you do `uv sync` instead of `uv pip install -e .`, that should generally work.

However, to take a step back from that: IMO the nicer way to use uv is to not run `uv sync` that much. Instead, you can generally use `uv run ...` to auto-sync and run your development tooling within an environment that includes your editable installation.

By way of example, here's what I would traditionally do:

    python -m venv .env
    source .env/bin/activate
    python -m pip install -e .[dev] # editable install with the 'dev' extra
    pytest ...

    # re-run install if there are things a normal editable install can't transparently sync, like extension builds
Whereas with uv:

    uv run --dev pytest ... # uses pytest from the 'dev' dependency group
That single command does everything pip and venv would normally do to prep an editable environment and run pytest. It also works across re-runs, since it'll run `uv sync` as needed under the hood.

Their integration with existing tools seems to be generally pretty good.

For example, uv-build is rather short on features (and its documentation barely exists AFAICT, which is a bit disappointing), but uv itself works just fine with hatchling, using configuration mechanisms that predate uv.

(I spent some time last week porting a project from an old, entirely unsupportable build system to uv + hatchling, and I came out of it every bit as unimpressed by the general state of Python packaging as ever, but I had no real complaints about uv. It would be nice if there was a build system that could go even slightly off the beaten path without writing custom hooks and mostly inferring how they’re supposed to work, though. I’m pretty sure that even the major LLMs only know how to write a Python package configuration because they’ve trained on random blog posts and some GitHub packages that mostly work — they’re certainly not figuring anything out directly from the documentation, nor could they.)
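
For reference, the shape of what I ended up with is roughly the following (a sketch, with the project metadata made up and trimmed to the essentials):

    # pyproject.toml, reduced to the relevant part:
    #
    #   [build-system]
    #   requires = ["hatchling"]
    #   build-backend = "hatchling.build"
    #
    # uv then drives hatchling through the standard PEP 517 hooks:
    uv sync     # editable install of the project, built by hatchling
    uv build    # sdist + wheel, also built by hatchling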


Getting from 95% compatible to 100% compatible may not only take a lot of time but also worsen performance. Sometimes it's better to drop some of the less frequently used features in order to make the tool better (or to allow for making it better).

Got any examples in mind?

Damn it, this unicorn farting rainbows and crapping gold is not yet capable of towing another car. I don't know why they advertise it as a replacement for my current mode of transportation.


